Wednesday, December 22, 2010

DOCTOR WHO'S TRUSTY INVENTION IS ANYTHING BUT SCI-FI

Television's favourite Time Lord could not exist without his trusty sonic screwdriver, as it's proved priceless in defeating Daleks and keeping the Tardis in check. Now Doctor Who's famous cure-all gadget could become a reality for DIY-ers across the world, say engineers.

Ultrasonic engineers at Bristol University and The Big Bang: UK Young Scientists and Engineers Fair are investigating how a real-life version of the fictional screwdriver - which uses sonic technology to open locks and undo screws - could be created.

Bruce Drinkwater, Professor of Ultrasonics, who is working with The Big Bang to inspire young scientists of the future, says the answer lies in ultrasonic sound waves. Operated at frequencies far beyond the range of human hearing, the waves can be used to apply forces to objects.

The technology is already being trialled in modern manufacturing to fix parts together and ultrasonic force fields are being developed within the medical field to separate diseased cells from healthy cells. Professor Drinkwater and The Big Bang team are now exploring whether super powerful versions of these sound beams could bring Doctor Who's iconic device to life.

He says: "Doctor Who is renowned for bending the rules of science. But technology has radically moved on since the Doc first stepped out of his Tardis in the sixties. Whilst a fully functioning time machine may still be light years away, engineers are already experimenting with ultrasonic waves to move and manipulate small objects."

Engineers are looking into how ultrasonic waves can be spun at high speed to create a twisting force similar to that of a miniature tornado, which could undo screws remotely. They have also experimented with rotating ultrasonic force fields which would act like the head of a real screwdriver.
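
The forces at work are small but real. As a rough illustration (a textbook estimate, not a figure from the Bristol work), the radiation pressure a sound beam exerts on a perfectly reflecting surface is about twice the beam intensity divided by the speed of sound, giving a force proportional to the insonified area:

```latex
p \;\approx\; \frac{2I}{c}, \qquad F \;=\; p\,A
```

For a focused beam of 10 W/cm^2 in air (c ≈ 343 m/s), that works out to roughly 580 Pa – about 0.06 N on a 1 cm^2 screw head, comparable to the weight of a few grams. Hence the interest in spinning and shaping the beam rather than simply turning up the power.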

Doctor Who and DIY fans may still have to wait before they can add the sonic screwdriver to their Christmas wish lists. However, Professor Drinkwater hopes his work to make the impossible possible will inspire engineers, technologists and inventors of the future.

"Doctor Who's adventures have captured the imaginations of millions, young and old. And, however far fetched the Time Lord's encounters may seem, there are engineers and scientists out there who are using their skills to bring the magic to life.

"The sonic screwdriver may still be sometime in the making but ultrasonic technology is already making its mark in the medical and manufacturing arenas with some exciting results."

(Photo: Nick Riddle)

Bristol University

NEW BLOOD TEST COULD DETECT HEART DISEASE IN PEOPLE WITH NO SYMPTOMS

A more sensitive version of a blood test typically used to confirm that someone is having a heart attack could indicate whether a seemingly healthy, middle-aged person has unrecognized heart disease and an increased risk of dying, UT Southwestern Medical Center researchers have found.

In a study available online and in the Dec. 8 print issue of the Journal of the American Medical Association, researchers found that a new, highly sensitive test for a protein called cardiac troponin T (cTnT) could detect the protein in about 25 percent of blood samples supplied by more than 3,500 individuals. The study also found that people with detectable levels of troponin T were nearly seven times more likely to die within six years from heart disease.

“This test is among the most powerful predictors of death in the general population we’ve seen so far,” said Dr. James de Lemos, associate professor of internal medicine at UT Southwestern and lead author of the study. “It appears that the higher your troponin T, the more likely you are to have problems with your heart, and the worse you’re going to do, regardless of your other risk factors.”

Although previous work has shown an association between cTnT levels and heart disease, standard tests for the protein can detect cTnT in only a very small percentage of the population, limiting the test’s utility for assessing risk in people with no symptoms of heart disease.

The more sensitive test, however, can detect circulating cTnT levels in almost everyone with chronic heart failure and chronic coronary artery disease.

“Because this test seems to identify cardiovascular problems that were previously unrecognized, we hope in the future to be able to use it to prevent some death and disability from heart failure and other cardiac diseases,” Dr. de Lemos said.

Emergency room doctors commonly use the standard, less sensitive test for cTnT to determine whether a patient experiencing chest pains is having a heart attack. Dr. de Lemos said the ability to detect lower levels of the protein could make emergency room physicians rethink the interpretation of the cTnT level.

“With the new highly sensitive assays, it’s going to be much more difficult to determine if elevated levels of troponin T are due to a heart attack or rather another chronic form of disease,” he said.

The current work with cTnT built on previous findings by Dr. de Lemos from the Dallas Heart Study, a groundbreaking investigation of cardiovascular disease that first involved more than 6,100 Dallas County residents. As part of that study, researchers found that cTnT could be detected with the standard technology in 1 percent of the population.

To determine if newer, more sensitive technology could detect cTnT at lower levels, researchers used the same population of residents. Starting in the year 2000, more than 3,500 participants provided blood samples and underwent multiple body scans with magnetic resonance imaging and computed tomography to examine the heart and other organs. Researchers then tracked the cause and time of death of participants, ages 30 to 65, through 2007.

“This study was designed to be representative of urban communities throughout the United States where there is a high prevalence of obesity, untreated hypertension and diabetes – just as there is in Dallas,” Dr. de Lemos said.

Older adults, males, African-Americans and individuals with abnormal thickening or weakness of the heart muscles had the highest levels of cTnT.

The outcomes were validated in a companion paper published in the same print issue of JAMA. The second study, co-authored by Dr. de Lemos and led by Dr. Christopher deFilippi of the University of Maryland School of Medicine, also used the highly sensitive test, but only in participants older than 65. That study found that in addition to death, cTnT was associated with heart failure, and that the risk of both outcomes rose and fell with changes in cardiac troponin T levels over time.

(Photo: UTSMC)

UT Southwestern Medical Center

GREENLAND ICE SHEET FLOW DRIVEN BY SHORT-TERM WEATHER EXTREMES, NOT GRADUAL WARMING

Sudden changes in the volume of meltwater contribute more to the acceleration – and eventual loss – of the Greenland ice sheet than the gradual increase of temperature, according to a University of British Columbia study.

The ice sheet consists of layers of compressed snow and covers roughly 80 per cent of the surface of Greenland. Since the 1990s, it has been documented to be losing approximately 100 billion tonnes of ice per year – a process that most scientists agree is accelerating, but has been poorly understood. Some of the loss has been attributed to accelerated glacier flow towards ocean outlets.

Now a new study, to be published tomorrow in the journal Nature, shows that a steady meltwater supply from gradual warming may in fact slow down glacier flow, while sudden water input could cause glaciers to speed up and spread, resulting in increased melt.

“The conventional view has been that meltwater permeates the ice from the surface and pools under the base of the ice sheet,” says Christian Schoof, an assistant professor at UBC’s Department of Earth and Ocean Sciences and the study’s author. “This water then serves as a lubricant between the glacier and the earth underneath it, allowing the glacier to shift to lower, warmer altitudes where more melt would occur.”

Drawing on observations that, during heavy rainfall, higher water pressure is needed to force drainage along the base of the ice, Schoof created computer models that account for the complex fluid dynamics occurring at the interface of glacier and bedrock. He found that a steady supply of meltwater is well accommodated and drained through water channels that form under the glacier.

“Sudden water input caused by short term extremes – such as massive rain storms or the draining of a surface lake – however, cannot easily be accommodated by existing channels. This allows it to pool and lubricate the bottom of the glaciers and accelerate ice loss,” says Schoof, who holds a Canada Research Chair in Global Process Modeling.
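
The interplay Schoof models can be sketched with the standard subglacial channel balance from glaciology (shown here in schematic form as an illustration, not as the paper's exact equations): a drainage channel of cross-section S is melted open by the water flowing through it and squeezed shut by the slow creep of the overlying ice.

```latex
\frac{dS}{dt} \;=\; \underbrace{c_1\, Q\, \Psi}_{\text{melt opening}} \;-\; \underbrace{c_2\, S\, N^{n}}_{\text{creep closure}}, \qquad n \approx 3
```

Here Q is the water flux, \Psi the hydraulic gradient and N the effective pressure. A steady flux settles into channels just large enough to drain it at high N, which keeps the ice pressed onto its bed; a sudden spike in Q outpaces the melt term, water pressure rises, N drops, and the excess water spreads along the bed as a lubricating film.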

“This certainly doesn’t mitigate the issue of global warming, but it does mean that we need to expand our understanding of what’s behind the massive ice loss we’re worried about,” says Schoof.

A steady increase of temperature and short-term extreme weather conditions have both been attributed to global climate change. According to the European Environment Agency, ice loss from the Greenland ice sheet contributed to global sea-level rise at a rate of 0.14 to 0.28 millimetres per year between 1993 and 2003.

“This study provides an elegant solution to one of the two key ice sheet instability problems identified by the Intergovernmental Panel on Climate Change in their 2007 assessment report,” says Prof. Andrew Shepherd, an expert on using satellites to study physical processes of Earth’s climate, based at the University of Leeds in the U.K.

“It turns out that, contrary to popular belief, Greenland ice sheet flow might not be accelerated by increased melting after all,” says Shepherd, who was not involved in the research or peer review of the paper.

University of British Columbia

LIFE THRIVES IN POROUS ROCK DEEP BENEATH THE SEAFLOOR

Researchers have found compelling evidence for an extensive biological community living in porous rock deep beneath the seafloor. The microbes in this hidden world appear to be an important source of dissolved organic matter in deep ocean water, a finding that could dramatically change ideas about the ocean carbon cycle.

Matthew McCarthy, associate professor of ocean sciences at the University of California, Santa Cruz, led a team of researchers from several institutions who analyzed the dissolved organic matter in fluids from natural vents on the seafloor and from a borehole that penetrated the basement rock deep beneath the seafloor sediments. Their results, to be published in the January issue of Nature Geoscience and currently available online, indicate that the dissolved material in those fluids was synthesized by microbes living in the porous basalt rock of the upper oceanic crust. These microbes are "chemoautotrophic," meaning they derive energy from chemical reactions and are completely independent of the sunlight-driven life on the surface of our planet.

Chemoautotrophic microbes (bacteria and archaea) have been found in deep-ocean sediments and at hydrothermal vents, where hot water flows out through newly formed volcanic rock at mid-ocean ridges. The idea that a much larger biological community might exist in habitats within the cooler upper-crustal rock that lies under large areas of the seafloor has been an exciting, but controversial, hypothesis, McCarthy said.

"What is really important about this is the huge size and extent of such systems," he said. "This study provides the strongest evidence yet that a really large biosphere exists in the warm fluids in the porous upper-oceanic crust. It's large not just in area, but in productivity. In the same way that forests and grasslands fix carbon and produce organic matter on land, our data suggest these microbes produce enough organic matter to export carbon to other systems. That's a real expansion of our ideas about the oceanic carbon cycle."

The existence of an extensive "alternate biosphere" beneath the ocean floor may also influence the thinking of astrobiologists about where life might exist elsewhere in our solar system, McCarthy said. Jupiter's moon Europa, for example, is thought to have a liquid ocean beneath its icy crust, prompting speculation about the possibility of life evolving there.

McCarthy's team found evidence of the hidden microbial ecosystem beneath the seafloor by analyzing carbon isotopes in the organic molecules in their samples. Of the three naturally occurring isotopes of carbon, carbon-12 is the most abundant, and both carbon-12 and the slightly heavier carbon-13 are stable. Carbon-14 is an unstable isotope formed in the upper atmosphere through the action of cosmic rays, and its steady decay is the basis for carbon-dating of organic material.

The ratios of these different isotopes provide telltale clues to the origins of organic molecules and the carbon atoms in them. Carbon-13 analysis, for example, indicates what kind of organisms synthesized the molecules. "Carbon-13 is really useful for looking at the origins of organic matter, because there are distinctive signatures for different sources," McCarthy said. "Chemosynthetic bacteria have wildly different signatures than anything else, and our carbon-13 results match the classic chemosynthetic values."

The team's carbon-14 analysis showed where the carbon in the organic molecules came from. If it came from the carbon in crustal rocks, there would be no carbon-14 at all. Instead, the carbon-14 signature indicated that the carbon came from dissolved inorganic carbon in deep seawater. This inorganic carbon pool consists of carbonate ions formed when carbon dioxide from the atmosphere dissolves in ocean water.

Carbon-14 dating indicated that the carbon in the dissolved organic matter is 11,800 to 14,400 years old--in other words, that's how long ago the carbon now in those organic molecules was absorbed from the atmosphere into the ocean. That's about three times older than the carbon-14 age of the overall pool of dissolved organic matter in the deep ocean. This suggests that water circulates very slowly through the deep microbial habitat in the rocks of the upper crust.
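
Those ages follow from the standard radiocarbon decay law (a textbook relation, using the modern half-life of about 5,730 years; the study's calibrated figures may differ slightly):

```latex
\frac{N(t)}{N_0} \;=\; e^{-\lambda t}, \qquad \lambda \;=\; \frac{\ln 2}{t_{1/2}}, \qquad t_{1/2} \approx 5{,}730\ \text{yr}
```

An age of 11,800 to 14,400 years thus corresponds to only about 18 to 24 percent of the original carbon-14 surviving in the dissolved organic matter.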

"The observation that this deep biosphere is apparently pumping very old, carbon-14-depleted dissolved organic matter into the deep ocean may be very important to our understanding of biogeochemical cycles," McCarthy said. "The reservoir of dissolved organic matter in the deep ocean is one of the largest active pools of organic carbon in the global carbon cycle, about the same size as the pool of atmospheric carbon dioxide."

The age of the deep-ocean water is used to estimate how quickly it turns over and returns to the surface layers. "If this very old pool of carbon is being mixed in and biasing the measurements, the deep-ocean water may actually be turning over more quickly than we thought," McCarthy said.

To obtain their samples, the researchers used custom-built equipment and a remotely operated deep-sea submersible, the ROV Jason II, from Woods Hole Oceanographic Institution (WHOI). Stainless-steel probes driven into an exposed rock outcrop and a specialized set of deep-sea sampling platforms designed at the University of Washington (UW) enabled them to recover the unprecedented quantities of uncontaminated crustal fluids needed for the analyses. The samples were collected during two expeditions to the Juan de Fuca Ridge system off the coast of Washington and British Columbia.

(Photo: UCSC)

University of California, Santa Cruz

UCSF TEAM DEVELOPS LOGIC GATES TO PROGRAM BACTERIA AS COMPUTERS

A team of UCSF researchers has engineered E. coli with the key molecular circuitry that will enable genetic engineers to program cells to communicate and perform computations.

The work builds into cells the same logic gates found in electronic computers and establishes a method for building circuits by “rewiring” communications between cells. This system can be harnessed to turn cells into miniature computers, according to findings that will be reported in an upcoming issue of Nature and appear today in the advanced online edition at www.nature.com.

That, in turn, will enable cells to be programmed with more intricate functions for a variety of purposes, including agriculture and the production of pharmaceuticals, materials and industrial chemicals, according to Christopher A. Voigt, PhD, a synthetic biologist and associate professor in the UCSF School of Pharmacy’s Department of Pharmaceutical Chemistry who is senior author of the paper.

The most common electronic computers are digital, he explained; that is, they apply logic operations to streams of 1’s and 0’s to produce more complex functions, ultimately producing the software with which most people are familiar. These logic operations are the basis for cellular computation, as well.

“We think of electronic currents as doing computation, but any substrate can act like a computer, including gears, pipes of water, and cells,” Voigt said. “Here, we’ve taken a colony of bacteria that are receiving two chemical signals from their neighbors, and have created the same logic gates that form the basis of silicon computing.”

Applying this to biology will enable researchers to move beyond trying to understand how the myriad parts of cells work at the molecular level, to actually use those cells to perform targeted functions, according to Mary Anne Koda-Kimble, dean of the UCSF School of Pharmacy.

“This field will be transformative in how we harness biology for biomedical advances,” said Koda-Kimble, who championed Voigt’s recruitment to lead this field at UCSF in 2003. “It’s an amazing and exciting relationship to watch cellular systems and synthetic biology unfold before our eyes.”

The Nature paper describes how the Voigt team built simple logic gates out of genes and inserted them into separate E. coli strains. The gate controls the release and sensing of a chemical signal, which allows the gates to be connected among bacteria much the way electrical gates would be on a circuit board.
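
In spirit, the approach distributes a digital circuit across a colony: each strain computes one gate, and a diffusible chemical acts as the wire between gates. A minimal Python sketch of the idea (the gate layout, names and signals are invented for illustration; the paper's actual circuits are built from genes, not software):

```python
# Illustrative sketch: logic gates spread across bacterial "strains",
# wired together by chemical signals. Everything here is hypothetical;
# the real implementation uses genes and diffusible molecules.

def nor(a: bool, b: bool) -> bool:
    """One engineered strain: it secretes its output chemical only
    when neither of its two input chemicals is present."""
    return not (a or b)

def xnor_colony(a: bool, b: bool) -> bool:
    """Four NOR 'strains' passing chemical signals compute XNOR (true
    when the inputs match); a fifth, inverting strain would give XOR."""
    s1 = nor(a, b)       # strain 1 senses both external inputs
    s2 = nor(a, s1)      # strain 2 senses input A and strain 1's signal
    s3 = nor(b, s1)      # strain 3 senses input B and strain 1's signal
    return nor(s2, s3)   # strain 4 combines the intermediate signals

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            print(f"A={a!s:5} B={b!s:5} -> {xnor_colony(a, b)}")
```

Because NOR is functionally complete, any Boolean circuit can in principle be compiled into such a colony, which is what makes the gate-plus-chemical-wire abstraction powerful.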

“The purpose of programming cells is not to have them overtake electronic computers,” explained Voigt, whom The Scientist magazine named a “scientist to watch” in 2007 and whose work was included among the magazine’s Top 10 Innovations of 2009. “Rather, it is to be able to access all of the things that biology can do in a reliable, programmable way.”

The research already has formed the basis of an industry partnership with Life Technologies of Carlsbad, Calif., in which the genetic circuits and design algorithms developed at UCSF will be integrated into a professional software package as a tool for genetic engineers, much as computer-aided design is used in architecture and the development of advanced computer chips.

The automation of these complex operations and design choices will advance basic and applied research in synthetic biology. In the future, Voigt said the goal is to be able to program cells using a formal language that is similar to the programming languages currently used to write computer code.

(Photo: UCSF)

UCSF

KECK OBSERVATORY PICTURES SHOW FOURTH PLANET IN GIANT SOLAR SYSTEM

Astronomers have announced the discovery of a fourth giant planet joining three others in orbit around a nearby star – a finding that challenges our current understanding of planet formation. The dusty young star, named HR8799 and located 129 light years away, first drew attention in 2008 when the research team presented the first-ever images of a planetary system orbiting a star other than our sun.

Now, a research team from Lawrence Livermore National Laboratory (LLNL), the National Research Council of Canada (NRC), the University of California, Los Angeles (UCLA) and Lowell Observatory has discovered a fourth planet about 7 times the mass of Jupiter – similar to the other three. Using high-contrast, near-infrared adaptive optics on the Keck II telescope in Hawaii, the astronomers imaged the fourth planet (dubbed HR8799e) in 2009 and confirmed its existence and orbit in 2010. The research appears in the Dec. 8 edition of the journal Nature.

“The images of this new inner planet in the system are the culmination of 10 years’ worth of innovations, making steady progress to optimize every observation and analysis step to allow the detection of planets located ever closer to their stars,” said Christian Marois, a former LLNL postdoc now at NRC and first author of the new paper.

If this newly discovered planet were located in orbit around our sun, it would lie between Saturn and Uranus. This giant version of our solar system is also young – about 30 million years old, compared with our system’s roughly 4.6 billion years.

Though the system is in some ways very much like our own, in others it is far more extreme – the combined mass of the four giant planets may be 20 times higher, and the asteroid and comet belts are dense and turbulent. In fact, the massive planets pull on one another gravitationally, and the system may be on the verge of falling apart.

This team of scientists simulated millions of years of the system’s evolution and showed that, to have survived this long, the three inner planets may have to orbit like clockwork, with the new innermost planet circling the star exactly four times, and the next planet exactly twice, in the time it takes the third to complete one orbit. This behavior was first seen in the moons of Jupiter but has never before been seen on this scale.
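
Such a 4:2:1 chain is a mean-motion resonance, the same pattern that locks together Jupiter's moons Io, Europa and Ganymede. Once the period ratios are pinned to small integers, Kepler's third law fixes the relative spacing of the orbits (a generic relation, not a number from the paper; the subscripts label the planets e, d and c from the inside out, following the usual naming convention):

```latex
P_e : P_d : P_c \;=\; 1 : 2 : 4, \qquad
\frac{a_d}{a_e} \;=\; \left(\frac{P_d}{P_e}\right)^{2/3} \;=\; 2^{2/3} \;\approx\; 1.59
```

Each orbit in the chain is therefore about 1.6 times wider than the one inside it.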

Studying the planets’ orbits also will help estimate their masses. “Our simulations show that if the objects were not planets, but supermassive ‘brown dwarfs’, the system would have fallen apart already,” said Quinn Konopacky, a postdoctoral researcher at LLNL’s Institute of Geophysics and Planetary Physics and a key author of the paper. “The implication is that we have truly found a unique new system of planets.” (Brown dwarfs are “failed stars”, too low in mass to sustain stable hydrogen fusion but larger than planets.) “We don’t yet know if the system will last for billions of years, or fall apart in a few million more. As astronomers carefully follow the HR 8799 planets during the coming decades, the question of just how stable their orbits are could become much clearer.”

The origin of these four giant planets remains a puzzle. The system follows neither the “core accretion” model, in which planets form gradually close to stars where the dust and gas are thick, nor the “disk fragmentation” model, in which a turbulent planet-forming disk rapidly cools and collapses out at its edges. Bruce Macintosh, a senior scientist at LLNL and the principal investigator for the Keck Observatory program, said: “There’s no simple model that can make all four planets at their current location. It’s a challenge for our theoretical colleagues.”

Previous observations had shown evidence for a dusty asteroid belt orbiting closer to the star – the new planet’s gravity helps account for the location of those asteroids, confining their orbits just like Jupiter does in our solar system. “Besides having four giant planets, both systems also contain two so-called ‘debris belts’ composed of small rocky and/or icy objects along with lots of tiny dust particles, similar to the asteroid and Kuiper belts of our solar system,” noted co-author Ben Zuckerman, a professor of physics and astronomy at UCLA.

“Images like these bring the exoplanet field into the era of characterization. Astronomers can now directly examine the atmospheric properties of four giant planets orbiting another star that are all the same young age and that formed from the same building materials,” said Travis Barman, a Lowell Observatory exoplanet theorist and co-author of the current paper.

“I think there’s a very high probability that there are more planets in the system that we can’t detect yet,” Macintosh said. “One of the things that distinguishes this system from most of the extrasolar planets that are already known is that HR8799 has its giant planets in the outer parts - like our solar system does - and so has ‘room’ for smaller terrestrial planets – far beyond our current ability to see – in the inner parts.”

“It’s amazing how far we’ve come in a few years,” Macintosh said. “In 2007, when we first saw the system, we could barely see two planets out past the equivalent of Pluto’s orbit. Now we’re imaging a fourth planet almost where Saturn is in our solar system. It’s another step to the ultimate goal – still more than a decade away – of a picture showing another planet like Earth.”

(Photo: NRC-HIA, Christian Marois, and the W.M. Keck Observatory)

W.M. Keck Observatory

UCSB SCIENTISTS REPORT STUDY OF 'BRAIN MAPS' FOR HOW HUMANS REACH

A ballet dancer grasps her partner's hand to connect for a pas de deux. Later that night, in the dark, she reaches for her calf to massage a sore spot. Her brain is using different "maps" to plan for each of these movements, according to a new study at UC Santa Barbara.

In preparing for each of these reaching movements, the same part of the dancer's brain is activated, but it uses a different map to specify the action, according to the research. Planning to hold hands is based on her visual map of space. Her second plan, to reach for her calf, depends on the dancer's mental body map.

Two UCSB scientists studied the brains of 18 individuals who made 400 distinct arm reaches as they lay in an MRI scanner. The researchers found clear differences in brain planning activity with regard to the two types of reaching behavior. Their discovery is reported in the journal Neuron.

"Our results have two important applications," said Scott T. Grafton, professor of psychology. "One is robotics. The other is in the area of machine-brain interface; for example, in developing machines to help paraplegics. A critical issue is to understand how movement-related information is represented in the brain if we're to decode it." Grafton, a leading expert in brain imaging, directs the UCSB Brain Imaging Center where the university's MRI scanner is located.

"We're interested in movement planning and movement control," said Grafton. "We're looking at goal-directed behaviors, when we reach to grasp objects –– visually defined objects in our environment. This forms the basis of our interactions with the world."

The current scientific view is that all reaching movements –– those directed to visual targets or toward one's own body –– are planned using a visual map. "Our findings suggest otherwise," said Pierre-Michel Bernier, first author and postdoctoral fellow. "We found that when a target is visual, the posterior parietal cortex is activated, coding the movement using a visual map. However, if a movement is performed in darkness and the target is non-visual, the same brain region will use a fundamentally different map to plan the movement. It will use a body map."

The maps are located in a brain region called the precuneus, inside the parietal lobe. The researchers measured the "Blood Oxygen Level Dependent Signal," or BOLD signal, when looking at the MRI brain images. BOLD is an indirect way of looking at brain activity at a millimeter scale. They also used a methodology called "repetition suppression." This is what makes the study novel, according to the authors, as it is one of the first to identify where these maps are nested in the human brain. "We are a leader in the use of repetition suppression," said Grafton.

Repetition suppression relies on the fact that when a brain region is involved in two similar activities in a row, it is less active the second time around. The team was able to pinpoint the brain's use of body maps versus visual maps by isolating the location in the brain where the responses were less active with repeated, similar arm reaches.
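
In analysis terms, repetition suppression is a contrast: voxels that carry a given map should respond more weakly when a trial repeats the previous target type than when it switches. A toy Python sketch of that contrast (the data, numbers and threshold are invented purely to illustrate the logic, not taken from the study):

```python
import numpy as np

# Toy repetition-suppression contrast. Hypothetical per-voxel BOLD
# responses for reaches that REPEAT the previous target type versus
# trials that SWITCH type; all values are simulated for illustration.
rng = np.random.default_rng(0)
n_voxels = 1000
repeat = rng.normal(1.0, 0.2, (n_voxels, 40))  # suppressed on repeats
switch = rng.normal(1.2, 0.2, (n_voxels, 40))  # full response on switches

# Suppression index: mean response drop on repeated trials.
suppression = switch.mean(axis=1) - repeat.mean(axis=1)

# Voxels with a reliable drop are candidates for coding the repeated map.
candidates = np.flatnonzero(suppression > 0.15)
print(f"{candidates.size} of {n_voxels} voxels show repetition suppression")
```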

Grafton explained: "The brain is trying to make a map of the world. One map is what you see, which is provided by the visual system. The other map is where the body is in space. This map is based on proprioception –– the sense of limb position –– which is derived from receptors in the skin, muscles, and joints. These maps are very different. How do you connect them? Either the visual map or the body map may be fixed, or neither may be fixed."

The authors' findings argue for the latter, demonstrating that the brain is capable of flexibly switching between these maps depending on the context. No doubt this flexibility underlies our ability to interact with the world with ease despite the ever-changing conditions in which our actions take place.

(Photo: George Foulsham, Office of Public Affairs, UCSB)

University of California, Santa Barbara

BACTERIA SEEK TO TOPPLE THE EGG AS TOP FLU VACCINE TOOL

Only the fragile chicken egg stands between Americans and a flu pandemic that would claim tens of thousands more lives than are usually lost to the flu each year.

Vaccine production hinges on the availability of hundreds of millions of eggs – and even with the vaccine, flu still claims somewhere around 36,000 lives in the United States during a typical year. Now scientists have taken an important step toward ending the dominance of the oval. In a paper published in the Dec. 6 issue of the journal Vaccine, scientists showed that an experimental flu vaccine grown entirely in bacteria – a process that bypasses the egg completely – works well in people, triggering an immune response that would protect them against the flu.

The study of 128 healthy people ages 18 to 49 at the University of Rochester Medical Center was led by John Treanor, M.D., an expert on flu vaccines who has helped lead efforts to create and test new ways to make flu vaccine more quickly and less expensively. The vaccine – which is free of bacteria itself – is made by New Jersey-based VaxInnate Inc., which funded the study.

“There are a number of problems with using eggs to produce flu vaccine,” said Treanor. “It’s a very specialized product. It’s hard to make more eggs in a hurry – you only get them as fast as hens lay them. They’re not easy to manipulate, and it can be challenging to get the flu virus to grow within an egg. The flu vaccine system would be more flexible and reliable if we didn’t have to rely on them.”

Scientists have been exploring a number of alternatives to eggs – creating doses to cover just the U.S. population requires millions of eggs that, if laid end to end, would just about encircle the continental United States.

Bacteria have not been high on the list of options, even though they have the capability of producing vaccine more quickly and less expensively than many other methods. Most efforts to use bacteria have faltered due to basic differences in the way that bacteria process proteins compared to more complex eukaryotic cells, which have a nucleus. Proteins are a crucial component of flu vaccine, and keeping the key proteins folded correctly has been a challenge in bacteria, which lack cellular machinery critical to the process.

“It was long accepted as dogma that you could not make a flu vaccine in bacteria that could stimulate a protective immune response in humans,” said Treanor. “But in this vaccine, the surface flu protein hemagglutinin was made by E.coli in such a way that it folded correctly, stimulating an authentic immune response. It’s almost surprising that this is possible.”

VaxInnate addressed the problem by focusing on just one small key protein of hemagglutinin that can be correctly refolded after synthesis in bacteria. The small protein is enough to spur the immune system because it was attached to an adjuvant – a compound designed to strengthen the vaccine by stimulating a more robust immune response. Adjuvants currently are not part of U.S. flu vaccines, though they are used in other countries and as parts of other vaccines. Usually, adjuvants are simply mixed into a vaccine, but the latest work offers a new method. A bacterial protein called flagellin was actually fused to a molecule that mimics the flu’s hemagglutinin protein – a combination designed both to draw the attention of the immune system and immediately amplify it in one step.

The amount of material in the experimental flu shot under study is just a fraction of the amount used in a licensed flu shot. The most successful tests were done with one or two micrograms of vaccine, far below today's licensed 15-microgram dose. About half of participants got a strong immune response at 1 microgram, and about 80 percent did at 2 micrograms.

(Photo: U. Rochester)

University of Rochester

SCIENTISTS DISCOVER BRAIN’S INHERENT ABILITY TO FOCUS LEARNING

Medical researchers have found a missing link that explains the interaction between brain state and the neural triggers responsible for learning, potentially opening up new ways of boosting cognitive function in the face of diseases such as Alzheimer’s as well as enhancing memory in healthy people.

Much is known about the neural processes that occur during learning, but until now it has not been clear why learning occurs during certain brain states and not others. Now researchers from the University of Bristol have been able to study, in isolation, the specific neurotransmitter that enhances learning and memory.

Acetylcholine is released in the brain during learning and is critical for the acquisition of new memories. Its role is to facilitate the activity of NMDA receptors, proteins that control the strength of connections between nerve cells in the brain.

Currently, the only effective treatment for the symptoms of cognitive impairment seen in diseases such as Alzheimer’s is through the use of drugs that boost the amount of acetylcholine release and thereby enhance cognitive function.

Describing their findings in the journal Neuron, researchers from Bristol’s School of Physiology and Pharmacology have shown that acetylcholine facilitates NMDA receptors by inhibiting the activity of other proteins called SK channels whose normal role is to restrict the activity of NMDA receptors.

This discovery of a role for SK channels provides new insight into the mechanisms underlying learning and memory. SK channels normally act as a barrier to NMDA receptor function, inhibiting changes in the strength of connections between nerve cells and therefore restricting the brain’s ability to encode memories. Findings from this latest research show that the SK channel barrier can be removed by the release of acetylcholine in the brain in order to enhance our ability to learn and remember information.

Lead researcher Dr Jack Mellor, from the University of Bristol’s Medical School, said: “These findings are not going to revolutionise the treatment of Alzheimer’s disease or other forms of cognitive impairment overnight. However, national and international funding bodies have recently made research into aging and dementia a top priority so we expect many more advances in our understanding of the mechanisms underlying learning and memory in both health and disease.”

The team studied the effects of drugs that target acetylcholine receptors and SK channels on the strength of connections between nerve cells in animal brain tissue. They found that changes in connection strength were facilitated by the presence of drugs that activate acetylcholine receptors or block SK channels, revealing the link between the two proteins.

Dr Mellor added: “From a therapeutic point of view, this study suggests that certain drugs that act on specific acetylcholine receptors may be highly attractive as potential treatments for cognitive disorders. Currently, the only effective treatments for patients with Alzheimer’s disease are drugs that boost the effectiveness of naturally released acetylcholine. We have shown that mimicking the effect of acetylcholine at specific receptors facilitates changes in the strength of connections between nerve cells. This could potentially be beneficial for patients suffering from Alzheimer’s disease or schizophrenia.”

(Photo: Bristol U.)

University of Bristol
