Tuesday, December 7, 2010

JET LAGGED AND FORGETFUL? IT'S NO COINCIDENCE

Chronic jet lag alters the brain in ways that cause memory and learning problems long after one's return to a regular 24-hour schedule, according to research by University of California, Berkeley, psychologists.

Twice a week for four weeks, the researchers subjected female Syrian hamsters to six-hour time shifts – the equivalent of a New York-to-Paris airplane flight. During the last two weeks of jet lag and a month after recovery from it, the hamsters' performance on learning and memory tasks was measured.

As expected, during the jet lag period, the hamsters had trouble learning simple tasks that the hamsters in the control group aced. What surprised the researchers was that these deficits persisted for a month after the hamsters returned to a regular day-night schedule.

What's more, the researchers discovered persistent changes in the brain, specifically within the hippocampus, a part of the brain that plays an integral role in memory processing. They found that, compared to the hamsters in the control group, the jet-lagged hamsters had only half the number of new neurons in the hippocampus following the month-long exposure to jet lag. New neurons are constantly being added to the adult hippocampus and are thought to be important for hippocampal-dependent learning, Kriegsfeld said, while memory problems are associated with a drop in cell maturation in this brain structure.

"This is the first time anyone has done a controlled trial of the effects of jet lag on brain and memory function, and not only do we find that cognitive function is impaired during the jet lag, but we see an impact up to a month afterward," said Lance Kriegsfeld, UC Berkeley associate professor of psychology and a member of the Helen Wills Neuroscience Institute. "What this says is that, whether you are a flight attendant, medical resident, or rotating shift worker, repeated disruption of circadian rhythms is likely going to have a long-term impact on your cognitive behavior and function."

Kriegsfeld, graduate student Erin M. Gibson and their colleagues reported their findings this week in the online, open-access journal PLoS ONE.

"Other studies have shown that chronic transmeridian flights increase deficits in memory and learning along with atrophy in the brain's temporal lobe, suggesting a possible hippocampal deficit," said Gibson. "Our study shows directly that jet lag decreases neurogenesis in the hippocampus."

Jet lag is a result of crossing several time zones in a short period of time, with the worst effects occurring during eastward travel. Each of us has an internal, 24-hour clock that drives our so-called circadian rhythm, which is reset every day by small amounts. When a person enters a time zone that is not synched with his or her internal clock, it takes much longer to reset this daily rhythm, causing jet lag until the internal clock gets re-synched.

This acute disruption of circadian rhythms can cause general malaise as well as gastrointestinal problems because the body's hunger cycle is out of sync with meal times, Kriegsfeld said.

For air travelers, jet lag is a minor annoyance from which most recover within a few days, perhaps with the help of a melatonin pill. For people who repeatedly cross time zones, such as flight attendants, the effects have been shown to be more serious. Flight attendants and rotating shift workers – people who regularly alternate between day and night shifts – have been found to have learning and memory problems, decreased reaction times, higher incidences of diabetes, heart disease, hypertension and cancer, and reduced fertility. The World Health Organization's cancer agency classifies shift work as a probable carcinogen.

To date, these effects have been documented only in jet-lagged subjects, not after recovery from jet lag, Gibson said. The UC Berkeley study is the first to look at long-term effects as well as changes in brain anatomy.

"The evidence is overwhelming that disruptions in circadian timing have a direct impact on human health and disease," Kriegsfeld said. "We've now shown that the effects are long-lasting, not only to brain function, but likely to brain structure."

The researchers used hamsters in their study because they are a classic model of circadian rhythms. Their bodily rhythms are so precise, Kriegsfeld said, that they will produce eggs, or ovulate, every 96 hours to within a window of a few minutes.

Because jet lag can increase stress hormones like cortisol and disrupt reproduction, the researchers controlled for the effects of these by removing adrenal glands or ovaries in some of the hamsters and injecting normal levels of hormone supplements of corticosterone and estrogen, respectively. These hamsters showed a similar reduction in new, mature hippocampal neurons in the brain.

"The change was really dramatic and shows that the effect on behavior and the brain is direct, not a secondary effect of increased stress hormones," Gibson said. "They are not due to increased cortisol concentrations."

The experiments also suggest that the low number of mature neurons in the hippocampus in jet-lagged hamsters was not due to decreased production of new cells, but rather, fewer new cells maturing into working cells, or perhaps new cells dying prematurely. Further studies are planned to determine the root cause of the reduction in mature neurons.

How do you avoid jet lag problems? Kriegsfeld said that, in general, people should allow one day of recovery for every one-hour time zone shift. Those who cannot return to a normal day-night cycle, such as night-shift workers, should sleep in a room with light-tight curtains that is shielded from outside noise in order to adjust properly to an altered sleep schedule.
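Kriegsfeld's rule of thumb is simple enough to state as a one-line function; a trivial sketch (the function name is ours, not from the study):

```python
def recovery_days(time_zone_shift_hours: int) -> int:
    """One day of recovery per one-hour time zone shift,
    per the rule of thumb quoted above."""
    return abs(time_zone_shift_hours)

# The six-hour New York-to-Paris shift used in the study
# would call for about six days of recovery:
print(recovery_days(6))  # 6
```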

(Photo: UC Berkeley)

University of California, Berkeley

CAN CACTI 'ESCAPE' UNDERGROUND IN HIGH TEMPERATURES?

In the scorching summer heat of the Chihuahuan Desert in southwest Texas, air temperatures can hover around 97°F (36°C) while at the surface of the soil temperatures can exceed 158°F (70°C). Encountering these extreme temperatures, plants must utilize creative methods to not only survive but thrive under these difficult and potentially lethal conditions.
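The two temperature scales quoted above convert as follows; note that an absolute temperature and a temperature *difference* convert differently, a distinction that matters later in this article (the helper functions are just a sketch):

```python
def c_to_f(temp_c: float) -> float:
    """Convert an absolute temperature from Celsius to Fahrenheit."""
    return temp_c * 9.0 / 5.0 + 32.0

def delta_c_to_f(delta_c: float) -> float:
    """Convert a temperature difference (no 32-degree offset)."""
    return delta_c * 9.0 / 5.0

print(c_to_f(36))        # 96.8  -- the air temperature above
print(c_to_f(70))        # 158.0 -- the soil-surface temperature above
print(delta_c_to_f(4))   # 7.2   -- a 4-degree-C drop is about 7 F, not 39 F
```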

This new work by Dr. Gretchen North and colleagues, published in the December issue of American Journal of Botany (http://www.amjbot.org/cgi/reprint/ajb.1000286v1), sheds light on how one desert resident, the cactus Ariocarpus fissuratus, copes with the effects of high temperatures.

"One crucial point is that small desert plants such as the 'living rock' cactus occupy one of the hottest habitats on earth, the surface of desert soils" stated North.

Ariocarpus fissuratus earned its nickname "living rock" because it blends into the rocky surroundings with its small stature that is level with the soil's surface. The researchers hypothesized that the cactus could "escape" high temperatures by moving more of itself below the soil surface where it is cooler.

Measuring changes in plant depth and root anatomy, North and co-workers determined that the cactus moves deeper into the soil by contracting its roots. But does root contraction play a protective role by modulating temperatures?

To find out, the researchers mimicked summer desert conditions by growing plants on a rooftop in Los Angeles, where the air temperature exceeded 99°F (37°C) for several days. All the cacti were grown in sandy soil, but half had rocks covering the surface of the soil, similar to their native habitats. For plants grown in rocky soils, the internal temperature of the stem was about 7°F (4°C) lower than in those grown in sandy soils alone. While this may seem like a small decrease, it had a significant effect on the health of the plants.

Unlike the cacti grown in sandy soil, which all died, those grown in rocky soil survived the intense heat. Root contraction helped lower the internal stem temperature, but only when combined with the cooling effect of the rocky surface. The opposite was true in sandy soil, where cacti planted higher above the surface had slightly lower stem temperatures than those planted close to the surface.

"Even in rocky soil, experimental plants attained nearly lethal temperatures during a summer heat wave in Los Angeles" said North. "Thus, root contraction and rocky microhabitats may not provide enough protection should desert temperatures get much higher due to global warming.

(Photo: Gretchen B. North, Occidental College, Los Angeles)

American Journal of Botany

MASSIVE GALAXIES FORMED WHEN UNIVERSE WAS YOUNG

Some of the universe's most massive galaxies may have formed billions of years earlier than current scientific models predict, according to surprising new research led by Tufts University. The findings appear in the Astrophysical Journal published online Nov. 24 in advance of print publication on Dec. 10, 2010.

"We have found a relatively large number of very massive, highly luminous galaxies that existed almost 12 billion years ago when the universe was still very young, about 1.5 billion years old. These results appear to disagree with the latest predictions from models of galaxy formation and evolution," said Tufts astrophysicist Danilo Marchesini, lead author on the paper and assistant professor of physics and astronomy at the Tufts School of Arts and Sciences. "Current understanding of the physical processes responsible in forming such massive galaxies has difficulty reproducing these observations."

Collaborating with Marchesini were researchers from Yale University, Carnegie Observatories, Leiden University, Princeton University, the University of Kansas and the University of California-Santa Cruz.

The newly identified galaxies were five to ten times more massive than our own Milky Way. They were among a sample studied at redshift 3≤z<4, when the universe was between 1.5 and 2 billion years old.

Redshift refers to the stretching of a light wave toward longer wavelengths (the red end of the spectrum) as the emitting object moves away from an observer (the Doppler effect), similar to the pitch of a siren dropping as the siren moves away. The redshift of distant galaxies, however, is due to the expansion of the universe. The larger the redshift, the more distant the galaxy and the farther back in time we are observing it; in other words, the younger the universe in which the galaxy is seen.
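The relationship between redshift and wavelength can be made concrete with a short sketch (the spectral line chosen here is a standard illustrative example, not taken from the study):

```python
def stretch_factor(z: float) -> float:
    """Observed wavelength = (1 + z) * emitted wavelength."""
    return 1.0 + z

# At z = 3, the near edge of the sampled range (3 <= z < 4),
# light arrives stretched to four times its emitted wavelength:
print(stretch_factor(3))  # 4.0

# Example: the hydrogen Lyman-alpha line, emitted in the
# ultraviolet at 121.6 nm, would be observed at:
print(121.6 * stretch_factor(3))  # 486.4 nm, in the visible range
```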

By complementing existing data with deep images obtained through a new system of five customized near-infrared filters, the researchers were able to get a more complete view of the galaxy population at this early stage and more accurately characterize the sampled galaxies.

The researchers made another surprising discovery: More than 80 percent of these massive galaxies show very high infrared luminosities, which indicate that these galaxies are extremely active and most likely in a phase of intense growth. Massive galaxies in the local universe are instead quiescent and do not form stars at all.

The researchers note that there are two likely causes of such luminosity: New stars may be forming in dust-enshrouded bursts at rates of a few thousand solar masses per year. This would be tens to several hundreds of times greater than the rates estimated by spectral energy distribution (SED) modeling. Alternatively, the high infrared luminosity could be due to highly-obscured active galactic nuclei (AGN) ferociously accreting matter onto rapidly growing super-massive black holes at the galaxies' centers.

There might be an explanation that would at least partially reconcile the observations with model-predicted densities. The redshifts of these massive galaxies, and hence their distances, were determined from SED modeling and have not yet been confirmed spectroscopically. Redshift measurements from SED modeling are inherently less accurate than spectroscopic ones. Such "systematic uncertainties" in the determination of the distances of these galaxies might still allow for approximate agreement between observations and model predictions.

If half of the massive galaxies are assumed to be slightly closer, at redshift z=2.6, when the universe was a bit older (2.5 billion years old) and very dusty (with dust absorbing much of the light emitted at ultra-violet and optical wavelengths), then the disagreement between observations and model predictions becomes only marginally significant.

However, the discovery of the existence of such massive, old and very dusty galaxies at redshift z=2.6 would itself be a notable discovery. Such a galaxy population has never before been observed.

"Either way, it is clear that our understanding of how massive galaxies form is still far from satisfactory," said Marchesini.

"The existence of these galaxies so early in the history of the universe, as well as their properties, can provide very important clues on how galaxies formed and evolved shortly after the Big Bang," he added.

Tufts University

METHANE-POWERED LAPTOPS MAY BE CLOSER THAN YOU THINK

Making fuel cells practical and affordable will not happen overnight. It may, however, not take much longer.

With advances in nanostructured devices, lower operating temperatures, and the use of an abundant fuel source and cheaper materials, a group of researchers led by Shriram Ramanathan at the Harvard School of Engineering and Applied Sciences (SEAS) are increasingly optimistic about the commercial viability of the technology.

Ramanathan, an expert and innovator in the development of solid-oxide fuel cells (SOFCs), says they may, in fact, soon become the go-to technology for those on the go.

Electrochemical fuel cells have long been viewed as a potential eco-friendly alternative to fossil fuels—especially as most SOFCs leave behind little more than water as waste.

The obstacles to using SOFCs to charge laptops and phones or drive the next generation of cars and trucks have remained reliability, temperature, and cost.

Fuel cells operate by converting chemical energy (from hydrogen or a hydrocarbon fuel such as methane) into an electric current. Oxygen ions travel from the cathode through the electrolyte toward the anode, where they oxidize the fuel to produce a current of electrons back toward the cathode.
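For a methane-fueled cell (one of the fuels named above), the charge flow just described can be written as half-reactions. This is the standard textbook balance for a solid-oxide cell, not taken from the papers themselves:

```latex
% Cathode: oxygen gains electrons to form oxide ions
O_2 + 4e^- \longrightarrow 2\,O^{2-}

% Anode: oxide ions oxidize the methane fuel, releasing electrons
CH_4 + 4\,O^{2-} \longrightarrow CO_2 + 2\,H_2O + 8e^-

% Overall reaction
CH_4 + 2\,O_2 \longrightarrow CO_2 + 2\,H_2O
```

The electrons released at the anode are what flow through the external circuit back to the cathode, producing the usable current.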

That may seem simple enough in principle, but until now, SOFCs have been more suited for the laboratory rather than the office or garage. In two studies appearing in the Journal of Power Sources this month, Ramanathan's team reported several critical advances in SOFC technology that may quicken their pace to market.

In the first paper, Ramanathan's group demonstrated stable and functional all-ceramic thin-film SOFCs that do not contain any platinum.

In thin-film SOFCs, the electrolyte is reduced to a hundredth or even a thousandth of its usual scale, using densely packed layers of special ceramic films, each just nanometers thick. These micro-SOFCs usually incorporate platinum electrodes, but platinum is expensive and the electrodes can be unreliable.

"If you use porous metal electrodes," explains Ramanathan, "they tend to be inherently unstable over long periods of time. They start to agglomerate and create open circuits in the fuel cells."

Ramanathan's platinum-free micro-SOFC eliminates this problem, resulting in a win-win: lower cost and higher reliability.

In a second paper published this month, the team demonstrated a methane-fueled micro-SOFC operating at less than 500°C, a feat that is relatively rare in the field.

Traditional SOFCs operate at about 800°C, but such high temperatures are practical only for stationary power generation. In short, using them to power up a smartphone mid-commute is not feasible.

In recent years, materials scientists have been working to reduce the required operating temperature to about 300°C, a range Ramanathan calls the "sweet spot."

Moreover, when fuel cells operate at lower temperatures, material reliability is less critical—allowing, for example, the use of less expensive ceramics and metallic interconnects—and the start-up time can be shorter.

"Low temperature is a holy grail in this field," says Ramanathan. "If you can realize high-performance solid-oxide fuel cells that operate in the 300°C range, you can use them in transportation vehicles and portable electronics, and with different types of fuels."

The use of methane, an abundant and cheap fuel that is the main component of natural gas, in the team's SOFC was also notable. Until recently, hydrogen has been the primary fuel for SOFCs. Pure hydrogen, however, requires a great deal of processing.

"It's expensive to make pure hydrogen," says Ramanathan, "and that severely limits the range of applications."

As methane begins to take over as the fuel of choice, the advances in temperature, reliability, and affordability should continue to reinforce each other.

"Future research at SEAS will explore new types of catalysts for methane SOFCs, with the goal of identifying affordable, earth-abundant materials that can help lower the operating temperature even further," adds Ramanathan.

(Photo: Juan Ignacio Sánchez Lara / Flickr)

Harvard School of Engineering and Applied Sciences

WHY DO PEOPLE BEHAVE BADLY? MAYBE IT'S JUST TOO EASY

Many people say they wouldn't cheat on a test, lie on a job application or refuse to help a person in need. But what if the test answers fell into your lap and cheating didn't require any work on your part? If you didn't have to face the person who needed your help and refuse them? Would that change your behaviour?

New research out of the University of Toronto Scarborough shows it might. In two studies that tested participants' willingness to behave immorally, the UTSC team discovered people will behave badly – if it doesn't involve too much work on their part.

"People are more likely to cheat and make immoral decisions when their transgressions don't involve an explicit action," says Rimma Teper, PhD student and lead author on the study, published online now in Social Psychological and Personality Science. "If they can lie by omission, cheat without doing much legwork, or bypass a person's request for help without expressly denying them, they are much more likely to do so."

In one study, participants took a math test on a computer after being warned there were glitches in the system. One group was told if they pressed the space bar, the answer to the question would appear on the screen. The second group was told if they didn't press the enter key within five seconds of seeing a question, the answer would appear.

"People in the second group – those who didn't have to physically press a button to get the answers – were much more likely to cheat," says Associate Psychology Professor Michael Inzlicht, second author on the study.

In another study, the team asked participants whether they would volunteer to help a student with a learning disability complete a component of the test. One group of participants had only the option of checking a 'yes' or 'no' box that popped up on the computer. The second group of people could follow a link at the bottom of the page to volunteer their help or simply press 'continue' to move on to the next page of their test. Participants were five times more likely to volunteer when they had to expressly pick either 'yes' or 'no.'

"It seems to be more difficult for people to explicitly deny their help, by clicking 'no,' than it is for them to simply click 'continue' and elude doing the right thing. We suspect that emotion plays an important role in driving this effect" says Teper.

"When people are confronted with actively doing the right thing or the wrong thing, there are a lot of emotions involved – such as guilt and shame – that guide them to make the moral choice. When the transgression is more passive, however, we saw more people doing the wrong thing, and we believe this is because the moral emotions in such situations are probably less intense," Teper says.

The team's research on moral behaviour is unique in that it looks at how people actually behave in certain situations rather than simply asking them to predict how they might behave, says Inzlicht. It also has critical implications for those in the business of soliciting people's goodwill, money or time.

"Forcing people to make an active, moral decision – a 'yes' or 'no' to donating, for example – is going to be much more effective than allowing them to passively skip over a request," he says.

University of Toronto

UNDERWATER ROBOTS ON COURSE TO THE DEEP SEA

Robots do not have to breathe. For this reason they can dive longer than any human. Equipped with the necessary sensor technology they inspect docks or venture down to the ocean floor to search for raw materials. At present, researchers are developing a model which will carry out routine tasks independently, without help from humans.

Even when equipped with compressed-air bottles and diving regulators, humans reach their limits very quickly under water. In contrast, unmanned submarine vehicles connected by cable to a control center permit long and deep dives. Today, remote-controlled diving robots are used for research, inspection and maintenance work. The possible applications of this technology are limited, however, by the length of the cable and the skill of the operator. No wonder researchers are working on autonomous underwater robots that orient themselves under water and carry out jobs without any help from humans.

There are already AUVs (autonomous underwater vehicles) that collect data independently or take samples before returning to their starting points. “For the time being, the technology is too expensive to carry out routine work, such as inspections of bulkheads, dams or ships’ bellies,” explains Dr. Thomas Rauschenbach, director of the Application Center System Technology AST in Ilmenau, Germany, part of the Fraunhofer Institute for Optronics, System Technologies and Image Exploitation IOSB. This may change soon. Together with researchers at four Fraunhofer Institutes, Rauschenbach’s team is working on a generation of autonomous underwater robots that will be smaller, more robust and cheaper than previous models. The AUVs will be able to find their bearings in clear mountain reservoirs as well as in turbid harbor water. They will be suitable both for work on the deep-sea floor and for inspecting the shallow concrete foundations that offshore wind power stations are mounted on.

The engineers from the Fraunhofer Institute for Optronics, System Technologies and Image Exploitation in Karlsruhe, Germany, are working on the “eyes” of the underwater robots. Optical perception is based on a special exposure and analysis technique that permits orientation even in turbid water. The system first determines the distance to the object; the camera then emits a laser pulse that is reflected by the object, such as a wall. Microseconds before the reflected flash arrives, the camera opens its aperture and the sensors capture the incoming light.

At the Ilmenau branch of the institute, Rauschenbach‘s team is developing the “brain“ of the robot: a control program that keeps the AUV on course in currents, for example at a fixed distance from the wall being examined.

The Fraunhofer Institute for Biomedical Engineering IBMT in St. Ingbert provides the silicone encapsulation for the pressure-tolerant construction of electronic circuits, as well as the “ears” of the new robot: ultrasound sensors that permit the inspection of objects. In contrast to conventional sonar technology, the researchers are using high-frequency sound waves that are reflected by obstacles and registered by the sensor.

The powerful but lightweight lithium batteries from the Fraunhofer ISIT in Itzehoe that supply the AUV with energy are likewise encapsulated in silicone. An energy management system developed by researchers at the Fraunhofer Institute for Environmental, Safety and Energy Technology UMSICHT in Oberhausen, Germany, saves power and ensures that, in an emergency, the data are saved before the robot runs out of energy and has to surface.
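The timing behind this gated exposure follows directly from the speed of light in water; a minimal sketch (the 5 m distance is an illustrative value, not from the project):

```python
C_WATER = 2.25e8  # approximate speed of light in water, m/s (c / 1.33)

def round_trip_time(distance_m: float) -> float:
    """Time for a laser pulse to reach an object and reflect back."""
    return 2.0 * distance_m / C_WATER

# For a wall 5 m away, the camera must open its aperture roughly
# this long after the pulse is emitted, in nanoseconds:
print(round(round_trip_time(5.0) * 1e9, 2))  # 44.44
```

Gating the sensor to this window rejects light scattered by particles closer than the target, which is why the technique works in turbid water.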

A torpedo-shaped prototype two meters long that is equipped with eyes, ears, a brain, a motor and batteries will go on its maiden voyage this year in a new tank in Ilmenau. The tank is only three meters deep, but “that’s enough to test the decisive functions,” affirms Dr. Rauschenbach. In autumn 2011, the autonomous diving robot will put to sea for the first time from the research vessel POSEIDON: Several dives up to a depth of 6,000 meters have been planned.

(Photo: Fraunhofer AST)

Fraunhofer Institute

WELL-KNOWN MOLECULE MAY BE BEHIND ALCOHOL'S BENEFITS TO HEART HEALTH

Many studies support the assertion that moderate drinking is beneficial when it comes to cardiovascular health, and for the first time scientists have discovered that a well-known molecule, called Notch, may be behind alcohol’s protective effects. Down the road, this finding could help scientists create a new treatment for heart disease that mimics the beneficial influence of modest alcohol consumption.

“Any understanding of a socially acceptable, modifiable activity that many people engage in, like drinking, is useful as we continue to search for new ways to improve health,” said Eileen M. Redmond, Ph.D., lead study author and associate professor in the Department of Surgery, Basic and Translational Research Division, at the University of Rochester Medical Center. “If we can figure out at the basic science level how alcohol is beneficial it wouldn’t translate to doctors prescribing people to drink, but hopefully will lead to the development of a new therapy for the millions of people with coronary heart disease.”

Population studies looking at patterns of health and illness and associated factors have shown that heart disease and cardiac-related death are 20 to 40 percent lower in light to moderate drinkers than in people who don't drink. Redmond notes that even if the reduction is only 20 percent, that still translates to a considerable benefit that warrants further investigation to better understand how alcohol works its protective magic.

In the study, published in Arteriosclerosis, Thrombosis and Vascular Biology, scientists found that alcohol at moderate levels of consumption – generally considered one to three drinks per day – inhibits Notch, and subsequently prevents the buildup of smooth muscle cells in blood vessels, which contributes to narrowing of the arteries and can lead to a heart attack or stroke.

In trying to uncover the molecular players involved when it comes to alcohol and improved cardiovascular health, Redmond and her team focused in on Notch because research has shown it influences the fate – growth, migration or death – of vascular smooth muscle cells. In blood vessels, the growth and movement of smooth muscle cells plays a key role in the development of atherosclerosis, the hardening and narrowing of arteries, and in restenosis, the re-narrowing of arteries after they have been treated to remove buildups of plaque: Both are risk factors for heart attack and stroke.

The team studied the effects of moderate amounts of alcohol in human coronary artery smooth muscle cells and in the carotid arteries of mice. In both scenarios, regular, limited amounts of alcohol decreased Notch, which in turn decreased the production and growth of smooth muscle cells, leaving vessels open and relatively free of blockages or build-up – a desirable state for a healthy heart.

Specifically, in human smooth muscle cells, treatment with moderate levels of alcohol significantly decreased the expression of the Notch 1 receptor and inhibited Notch signaling, leading to decreased growth of smooth muscle cells. The inhibitory effect of moderate alcohol on smooth muscle cell growth was reversed if the Notch pathway was artificially switched on in these cells.

In a mouse model of vessel remodeling, daily feeding of alcohol – equivalent to two drinks per day, adjusted for body weight – inhibited Notch in the vessel wall and markedly reduced vessel thickening, compared to the control, no alcohol group. Vessel remodeling occurs when vessels change shape and thickness in response to different injurious stimuli.

“At the molecular level, this is the first time anyone has linked the benefits of moderate drinking on cardiovascular disease with Notch,” said David Morrow, Ph.D., an instructor in the Department of Surgery at the Medical Center, first author of the study and an expert on Notch. “Now that we’ve identified Notch as a cell signaling pathway regulated by alcohol, we’re going to delve deeper into the nuts and bolts of the process to try to find out exactly how alcohol inhibits Notch in smooth muscle cells.”

Researchers admit that uncovering how alcohol inhibits Notch signaling in these cells will not be an easy task. According to Redmond, “The Notch pathway is complex, and there are multiple potential regulatory points which could be affected by alcohol.”

(Photo: U. Rochester)

University of Rochester Medical Center

ENGINEER PROVIDES NEW INSIGHT INTO PTERODACTYL FLIGHT

Giant pterosaurs – ancient reptiles that flew over the heads of dinosaurs – were at their best in gentle tropical breezes, soaring over hillsides and coastlines or floating over land and sea on thermally driven air currents, according to new research from the University of Bristol.

Pterosaurs (also referred to as pterodactyls) were too slow and flexible to use the stormy winds and waves of the Southern Ocean like the albatrosses of today, according to the research by Colin Palmer, an engineer turned paleontology PhD student in Bristol’s School of Earth Sciences.

Their slow flight and the variable geometry of their wings also enabled pterosaurs to land very gently, reducing the chance of breaking their paper-thin bones. This helps to explain how they were able to become the largest flying animals ever known.

Using his 40 years of experience in the engineering industry, Colin Palmer constructed models of pterosaur wing sections from thin, curved sheets of epoxy resin/carbon fibre composite and tested them in a wind tunnel. These tests quantified the two-dimensional characteristics of pterosaur wings for the first time, showing that such creatures were significantly less aerodynamically efficient and were capable of flying at lower speeds than previously thought.

Colin Palmer said: “Pterosaur wings were adapted to a low-speed flight regime that minimizes sink rate. This regime is unsuited to marine style dynamic soaring adopted by many seabirds which requires high flight speed coupled with high aerodynamic efficiency, but is well suited to thermal/slope soaring. The low sink rate would have allowed pterosaurs to use the relatively weak thermal lift found over the sea.

“Since the bones of pterosaurs were thin-walled and thus highly susceptible to impact damage, the low-speed landing capability would have made an important contribution to avoiding injury and so helped to enable pterosaurs to attain much larger sizes than extant birds. The trade-off would have been an extreme vulnerability to strong winds and turbulence, both in flight and on the ground, like that experienced by modern-day paragliders.”
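The sink-rate trade-off Palmer describes can be sketched with the standard steady-glide relation (the numbers below are illustrative, not Palmer's measurements):

```python
def sink_rate(airspeed: float, lift_to_drag: float) -> float:
    """Vertical descent speed of a steady glide. For small glide
    angles, sink rate = airspeed / (lift-to-drag ratio)."""
    return airspeed / lift_to_drag

# A slow flier with only modest aerodynamic efficiency can still
# sink slowly, staying aloft in weak thermal lift:
print(sink_rate(8.0, 10.0))   # 0.8 m/s
# A fast, efficient dynamic soarer sinks faster in absolute terms:
print(sink_rate(20.0, 20.0))  # 1.0 m/s
```

This is why a low-speed wing can have a lower sink rate than a more efficient high-speed one: halving the airspeed outweighs a lower lift-to-drag ratio.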

The research is published in Proceedings of the Royal Society B.

Colin Palmer is an engineer by training, originally in ship science. An interest in what makes sailing vessels go led him to study low speed aerodynamics and the performance of thin airfoils. He is now applying this background to the analysis of vertebrate flight, focussing on large pterosaurs for his PhD. Using a combination of wind tunnel and vortex-lattice theoretical modelling, he aims to understand how pterosaur wings performed. Working with students in the Department of Aerospace Engineering, he will also undertake more sophisticated aerodynamic analysis using computational fluid dynamics. He ultimately wants to put all this information together to create a free-flying model of a pterosaur.

(Photo: © Ontograph Studios Ltd)

University of Bristol

STUDY REVEALS NEURAL BASIS OF RAPID BRAIN ADAPTATION


You detect an object flying at your head. What do you do? You probably first move out of the way -- and then you try to determine what the object is. Your brain is able to quickly switch from detecting an object moving in your direction to determining what the object is through a phenomenon called adaptation.

A new study in the Nov. 21 advance online edition of the journal Nature Neuroscience details the biological basis of this ability for rapid adaptation: neurons located at the beginning of the brain's sensory information pathway that change their level of simultaneous firing. This modification in neuron firing alters the nature of the information being relayed, which enhances the brain's ability to discriminate between different sensations -- at the expense of degrading its ability to detect the sensations themselves.

"Previous studies have focused on how brain adaptation influences how much information from the outside world is being transmitted by the thalamus to the cortex, but we show that it is also important to focus on what information is being transmitted," said Garrett Stanley, an associate professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University.

In addition to Stanley, Coulter Department research scientist Qi Wang and Harvard Medical School Neurobiology Department research fellow Roxanna Webber contributed to this work, which is supported by the National Institutes of Health.

For the experiments, Stanley and Wang moved a rat's whisker to generate a sensory input. Moving whiskers at different speeds or at different angles produced sensory inputs that could be discriminated. This sensory experience is analogous to an individual moving a fingertip across a surface and perceiving the surface as smooth or rough. While the whiskers were being moved, the researchers recorded neural signals simultaneously from different parts of the animal's brain to determine what information was being transmitted.

"Neuroscientists know a lot about different parts of the brain, but we don't know a lot about how they talk to each other. Recording how neurons are simultaneously communicating with each other in different parts of the brain and studying how the communication changes in different situations is a big step in this field," said Stanley.

The results from the experiments showed that adaptation shifted neural activity from a state in which the animal was good at detecting the presence of a sensory input to a state in which the animal was better at discriminating between sensory inputs. In addition, adaptation enhanced the ability to discriminate between deflections of the whiskers in different angular directions, pointing to a general phenomenon.

"Adaptation differentially influences the thalamus and cortex in a manner that fundamentally changes the nature of information conveyed about whisker motion," explained Stanley. "Our results provide a direct link between the long-observed phenomenon of enhanced sensory performance with adaptation and the underlying neurophysiological representation in the primary sensory cortex."

The thalamus serves as a relay station between the outside world and the cortex. Areas of the cortex receive and process information related to vision, audition and touch from the thalamus.

The study also revealed that information the cortex receives from the thalamus is transformed as it travels through the pathway due to a change in the level of simultaneous firing of neurons in the thalamus. The researchers found that the effect of adaptation on the synchrony of neurons in the thalamus was the key element in the shift between sensory input detection and discrimination.

"There is a switching of the circuit to a different function. The same neurons do two different things and switch quickly, in a matter of seconds or milliseconds, through a change in the synchronization across neurons," explained Stanley. "If we think of the neurons firing like members of an audience clapping hands, then the sound of the clapping becomes louder when they all clap together."
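The clapping analogy can be illustrated with a toy simulation (this is a sketch of the general principle, not the study's analysis): each model neuron fires the same number of spikes, but when firing is tightly synchronized the pooled downstream signal has a much larger peak than when the same spikes are jittered in time.

```python
import random

random.seed(0)

N_NEURONS, N_BINS, SPIKES_EACH = 20, 100, 5

def pooled_peak(jitter_bins):
    """Sum spikes from all neurons into time bins and return the peak.

    Every neuron fires SPIKES_EACH spikes around shared event times;
    jitter_bins controls how loosely each neuron's spikes are scattered
    around those events (0 = perfectly synchronized).
    """
    event_times = random.sample(range(N_BINS), SPIKES_EACH)
    pooled = [0] * N_BINS
    for _ in range(N_NEURONS):
        for t in event_times:
            t_jittered = (t + random.randint(-jitter_bins, jitter_bins)) % N_BINS
            pooled[t_jittered] += 1
    return max(pooled)

sync_peak = pooled_peak(jitter_bins=0)     # everyone "claps together"
async_peak = pooled_peak(jitter_bins=40)   # same spike count, desynchronized
```

With zero jitter the peak equals the number of neurons, since every spike lands in the same bins; with large jitter the same total activity spreads out and the peak collapses. The total "energy" is unchanged; only its timing, and hence what a downstream reader can detect, differs.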

In the future, the techniques used in this study may be valuable for probing the effects of brain injury on this pathway and others, as a variety of different diseases and disorders act to change the degree of synchronization of neurons in the brain, resulting in harmful effects.

(Photo: GIT)

Georgia Institute of Technology
