Saturday, February 5, 2011

STRESS, ANXIETY BOTH BOON AND BANE TO BRAIN

A cold dose of fear lends an edge to the here-and-now — say, when things go bump in the night.

"That edge sounds good. It sounds adaptive. It sounds like perception is enhanced and that it can keep you safe in the face of danger," says Alexander Shackman, a researcher at the University of Wisconsin-Madison.

But it sounds like there's also a catch, one that Shackman and his coauthors — including Richard Davidson, UW-Madison psychology and psychiatry professor — described in the Jan. 19 Journal of Neuroscience.

"It makes us more sensitive to our external surroundings as a way of learning where or what a threat may be, but interferes with our ability to do more complex thinking," Davidson says.

Faced with the possibility of receiving an unpleasant electric shock, the study's subjects showed enhanced activity in brain circuits responsible for taking in visual information, but a muted signal in circuitry responsible for evaluating that information. Remove the threat of shock (and thus the stress and anxiety) and the effect is reversed: less power for vigilance, more power for strategic decision-making.

The shift in electrical activity in the brain, captured by a dense mesh of sensors placed on the scalp, may be the first biological description of a paradox in experimental psychology.

It has long been known that imminent danger can enhance the ability to detect faint stimuli in the environment, such as the crackle of a leaf signaling the approach of a predator. But it is equally clear that the stress and anxiety aroused by a threat can profoundly disrupt the ability to think clearly and perform more complex "executive" tasks.

"In the last few years, theorists have hypothesized that this paradox might reflect several systems working in conjunction: one responsible for the rapid detection of external stimuli, the other responsible for the slower, more reflective evaluation of that incoming information," Shackman says. "Stress upsets the balance of those systems."

In fact, as the senses go into overdrive, they are probably confounding the rest of the brain all the more.

"Your ability to do more complex tasks is disrupted just as the amount of information you're receiving through your eyes and ears is enhanced," Shackman says. "You're having trouble focusing on the information coming in, but your brain is taking in more and more potentially irrelevant information. You can have a viscous feedback loop, a sort of double-whammy effect."

The resulting confusion favors quick, reflexive actions, the "survival instincts" often mentioned by trauma survivors — Noise? RUN! — in a way that was likely adaptive in the dangerous environments in which the ancestors to modern humans evolved.

"In our evolutionary past, the dangers we faced were really survival-threatening," Davidson says. "That's not so much the case now. Because of the nature of our brains, we can use our neural capacity to create our own internal danger. We can worry about the future and ruminate about the past."

Either one is likely to present a real hurdle to effective decision-making under stress.

"This is part of a growing body of evidence showing that stress does have important consequences for the brain, not just something that arouses the body — tension in your muscles or butterflies in the stomach," says Davidson, who studies the effects of meditation as director of UW-Madison's Center for Investigating Healthy minds.

"One of the things we would expect is that if we use an antidote like systematic meditation training to learn to control stress it would not just calm the body, but improve our ability to engage in complex analytical activity," he says.

University of Wisconsin-Madison

WHY DO OUR EMOTIONS GET IN THE WAY OF RATIONAL DECISIONS ABOUT SAFETY PRODUCTS?

A new study in the Journal of Consumer Research explores why people reject things that can make them safer.

"People rely on airbags, smoke detectors, and vaccines to make them safe," write authors Andrew D. Gershoff (University of Texas at Austin) and Johnathan J. Koehler (Northwestern University School of Law). "Unfortunately, vaccines do sometimes cause disease and airbags sometimes injure or kill. But just because these devices aren't perfect doesn't mean consumers should reject them outright."

The authors found that people feel betrayed when they learn about the risks associated with safety products. Then their emotions get in the way of rational decision-making. The researchers studied the "betrayal effect" by looking at the example of airbags. They asked participants to choose between two cars: One was equipped with an airbag that was less likely to ultimately save a life in the event of a serious accident. The other car had an airbag that was more likely to save a life, but it also had a tiny chance of causing death due to the force of deploying it.

Most participants avoided the airbag that had just a minuscule chance of harming them, even though by doing so, they accepted a far greater chance of being harmed in an accident.
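
To see why rejecting the slightly "risky" airbag can leave a chooser worse off overall, here is a minimal back-of-the-envelope sketch in Python. All of the probabilities are hypothetical assumptions chosen only for illustration; the study did not report these figures.

```python
# Hypothetical illustration of the airbag trade-off described above.
# The probabilities are NOT from the study; they are invented numbers
# meant only to show why rejecting the "betraying" airbag can leave
# a chooser worse off overall.

p_serious_accident = 0.01              # chance of a serious accident (assumed)

# Airbag A: better at saving lives, but can itself cause a fatality.
p_death_given_accident_a = 0.10        # assumed
p_death_from_deployment_a = 0.00001    # tiny "betrayal" risk (assumed)

# Airbag B: never harms the occupant, but protects less well.
p_death_given_accident_b = 0.15        # assumed

risk_a = p_serious_accident * p_death_given_accident_a + p_death_from_deployment_a
risk_b = p_serious_accident * p_death_given_accident_b

print(f"Overall fatality risk with airbag A: {risk_a:.6f}")  # 0.001010
print(f"Overall fatality risk with airbag B: {risk_b:.6f}")  # 0.001500
# Despite the tiny chance of betrayal, airbag A is the safer choice here.
```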

"The findings show that people have strong emotional reactions when such safety devices have even a very small potential to betray them," the authors write. "So rather than weighing the costs and benefits, they will reject these options outright, even if it makes them worse off for doing so."

The authors found that two things helped people make safer choices: providing positive images, and presenting the information in a graphic format that facilitated rational thinking by letting consumers compare the options easily rather than overemphasize the small risk.

Finally, the authors found that people could be influenced to make safer choices by having them make their choices for strangers rather than for themselves. "Although this last method may seem contradictory, it makes sense when one considers that people tend to be less emotional about making choices that don't involve themselves or people they care for," the authors conclude.

The University of Chicago Press

RESEARCH DISCOVERS WHY FIRST IMPRESSIONS ARE SO PERSISTENT

New research by a team of psychologists from Canada, Belgium, and the United States shows there is more than a grain of truth to the saying that ‘you never get a second chance to make a first impression’. The findings suggest that new experiences that contradict a first impression become ‘bound’ to the context in which they were made. As a result, the new experiences influence people’s reactions only in that particular context, whereas first impressions still dominate in other contexts.

"Imagine you have a new colleague at work and your impression of that person is not very favourable” explains lead author Bertram Gawronski, Canada Research Chair at The University of Western Ontario. “A few weeks later, you meet your colleague at a party and you realize he is actually a very nice guy. Although you know your first impression was wrong, your gut response to your new colleague will be influenced by your new experience only in contexts that are similar to the party. However, your first impression will still dominate in all other contexts.”

According to Gawronski, our brain stores expectancy-violating experiences as exceptions-to-the-rule, such that the rule is treated as valid except for the specific context in which it has been violated.

To investigate the persistence of first impressions, Gawronski and his collaborators showed their study participants either positive or negative information about an unknown individual on a computer screen. Later in the study, participants were presented with new information about the same individual, which was inconsistent with the initial information. To study the influence of contexts, the researchers subtly changed the background color of the computer screen while participants formed an impression of the target person.

When the researchers subsequently measured participants’ spontaneous reactions to an image of the target person, they found the new information influenced participants’ reactions only when the person was presented against the background in which the new information had been learned. Otherwise, participants’ reactions were still dominated by the first information when the target person was presented against other backgrounds.

Although these results support the common observation that first impressions are notoriously persistent, Gawronski notes they can sometimes be changed. “What is necessary is for the first impression to be challenged in multiple different contexts. In that case, new experiences become decontextualized and the first impression will slowly lose its power. But, as long as a first impression is challenged only within the same context, you can do whatever you want. The first impression will dominate regardless of how often it is contradicted by new experiences.”

According to Gawronski, the research also has important implications for the treatment of clinical disorders. “If someone with phobic reactions to spiders is seeking help from a psychologist, the therapy will be much more successful if it occurs in multiple different contexts rather than just in the psychologist’s office.”

The University of Western Ontario

EFFECTIVE USE OF POWER IN THE BRONZE AGE SOCIETIES OF CENTRAL EUROPE

During the first part of the Bronze Age in the Carpathian Basin in Central Europe, a large proportion of the population lived in what are known as tell-building societies. A thesis in archaeology from the University of Gothenburg (Sweden) shows that the leaders of these societies had the ability to combine several sources of power in an effective way in order to dominate the rest of the population, which contributed towards creating a notably stable social system.

Tell-building societies are named after a distinct form of settlements with a high density of population and construction, which over the course of time accumulated such thick cultural layers that they took on the shape of low mounds.

On the basis of a discussion and analysis of previously published material from the Carpathian Basin and new findings from the tell settlement Százhalombatta-Földvár in Hungary, the author of the thesis, Claes Uhnér, describes the ways in which leaders could exercise power. Tell-building societies had relatively advanced economies. The subsistence economy, which was based on agricultural production and animal husbandry, produced a good return, and the societies were involved in regional and long-distance exchange of bronzes and other valuable craft products.

"By exercising a degree of control over these parts of the economy, it was possible for leaders to finance political activities and power-exerting organisations," says Uhnér. He shows in his thesis that, through military power, leaders were able to control surrounding settlements from fortified tells. As the majority of these settlements were situated next to rivers and other natural transport routes, they could demand tribute from passing trade expeditions and act as intermediaries in the exchange of goods that took place in the region. In addition, a large tell was a manifestation of a successful society with a long history. This situation made it possible for leaders to use the cultural traditions of the society in ideological power strategies.

"The tells served as physical manifestations of a social system that worked well, which legitimised the social position of the elites and their right to lead. An important conclusion drawn by Uhnér is that the sources of power could be used in strategies where they supported each other. Economic power made it possible to master military and ideological means of power. Military power was utilised to safeguard economic and ideological resources, while ideology legitimised the social system. This was largely possible because the tell settlements served as political power centres. Redistribution of staples and specialised production was attached to these sites, and they had key military and ideological significance. "By controlling tells and the activities carried out in them, leaders had an organisational advantage over the rest of the population, and others found it very difficult to build up competing power positions," says Uhnér.

University of Gothenburg (Sweden)

SULPHUR PROVES IMPORTANT IN THE FORMATION OF GOLD MINES

Collaborating with an international research team, an economic geologist from The University of Western Ontario has discovered how gold-rich magma is produced, unveiling an all-important step in the formation of gold mines.

The findings were published in the December issue of Nature Geoscience.

Robert Linnen, the Robert Hodder Chair in Economic Geology in Western's Department of Earth Sciences, conducts research near Kirkland Lake, Ontario, and says the results of the study could lead to a breakthrough in choosing geographic targets for gold exploration and making exploration more successful.

Noble metals, like gold, are transported by magma from deep within the mantle (below the surface) of the Earth to the shallow crust (the surface), where they form deposits. Through a series of experiments, Linnen and his colleagues from the University of Hannover (Germany), the University of Potsdam (Germany) and Laurentian University found that gold-rich magma can be generated in mantle that also contains high amounts of sulphur.

"Sulphur wasn't recognized as being that important, but we found it actually enhances gold solubility and solubility is a very important step in forming a gold deposit," explains Linnen. "In some cases, we were detecting eight times the amount of gold if sulphur was also present."

Citing the World Gold Council, Linnen says the best estimates available suggest the total volume of gold mined up to the end of 2009 was approximately 165,600 tonnes. Approximately 65 per cent of that total has been mined since 1950.

"All the easy stuff has been found," offers Linnen. "So when you project to the future, we're going to have to come up with different ways, different technologies and different philosophies for finding more resources because the demand for resources is ever-increasing."

The University of Western Ontario

WILDFLOWER COLORS TELL BUTTERFLIES HOW TO DO THEIR JOBS

The recipe for making one species into two requires time and some kind of separation, like being on different islands or something else that discourages gene flow between the two budding species.

In the case of common Texas wildflowers that share meadows and roadside ditches, color-coding apparently does the trick.

Duke University graduate student Robin Hopkins has found the first evidence of a specific genetic change that helps two closely related wildflowers avoid creating costly hybrids. The change gives one of the normally light blue flowers a reddish color, making it less appealing to the pollinating butterflies, which prefer blue.

"There are big questions about evolution that are addressed by flower color," said Hopkins, who successfully defended her doctoral dissertation just weeks before seeing the same work appear in the prestigious journal Nature.

What Hopkins found, with her thesis adviser, Duke biology professor Mark Rausher, is the first clear genetic evidence for something called reinforcement in plants. Reinforcement keeps two similar proto-species moving apart by discouraging hybrid matings. Flower color had been expected to aid reinforcement, but the genes had not been found.

In animals or insects, reinforcement might be accomplished by a small difference in scent, plumage or mating rituals. But plants don't dance or choose their mates. So they apparently exert some choice by using color to discourage the butterflies from mingling their pollen, Hopkins said.

Where Phlox drummondii lives by itself, it has a periwinkle blue blossom. But where its range overlaps with Phlox cuspidata, which is also light blue, drummondii flowers appear darker and more red. Some individual butterflies prefer light blue blossoms and will go from blue to blue, avoiding the dark reds. Other individual butterflies prefer the reds and will stick with those. This "constancy" prevents hybrid crosses.

Hybrid offspring between drummondii and cuspidata turn out to be nearly sterile, making the next generation a genetic dead-end. The persistent force of natural selection tends to push the plants toward avoiding those less fruitful crosses, and encourages breeding true to type. In this case, selection apparently worked upon floral color.

Hopkins was able to find the genes involved in the color change by crossing a light blue drummondii with a red one in greenhouse experiments. She found that the offspring occurred in four different colors in the exact 9-to-3-to-3-to-1 ratios of classical Mendelian inheritance. "It was 2 in the morning when I figured this out," she said. "I almost woke up my adviser."
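
A 9-to-3-to-3-to-1 ratio is the classic signature of two independently segregating genes. As a rough illustration only (the article does not give Hopkins' actual counts, so the numbers below are invented), here is a minimal Python sketch that checks observed offspring counts against the expected 9:3:3:1 proportions with a chi-square goodness-of-fit test.

```python
# Minimal sketch: test whether offspring color counts fit a 9:3:3:1 ratio.
# The observed counts below are invented for illustration; the article
# does not report Hopkins' actual numbers.
from scipy.stats import chisquare

observed = [92, 28, 31, 9]                          # four color classes (hypothetical)
total = sum(observed)
expected = [total * w / 16 for w in (9, 3, 3, 1)]   # 9:3:3:1 proportions

stat, p_value = chisquare(observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")
# A large p-value means the counts are consistent with two-gene
# Mendelian segregation, the pattern Hopkins observed.
```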

From there, she did standard genetics to find the exact genes. The change to red is caused by a recessive gene that knocks out the production of the plant's one blue pigment while allowing for the continued production of two red pigments.

Even where the red flowers are present, about 11 percent of each generation will be the nearly-sterile hybrids. But without color-coding, that figure would be more like 28 percent, Hopkins said. Why and how the butterflies make the distinction has yet to be discovered.

Hopkins will be continuing her research as a visiting scientist at the University of Texas, and the clear message from all of her advisers is "follow the butterflies. Everyone wants to know more about the butterflies!"

(Photo: Robin Hopkins)

Duke University

WIDESPREAD ANCIENT OCEAN "DEAD ZONES" CHALLENGED EARLY LIFE

The oceans became oxygen-rich, as they are today, about 600 million years ago, during Earth's Late Ediacaran Period. Until recently, most scientists believed that the ancient oceans had been relatively oxygen-poor for the preceding four billion years.

Now biogeochemists at the University of California-Riverside (UCR) have found evidence that the oceans went back to being "anoxic," or oxygen-poor, around 499 million years ago, soon after the first appearance of animals on the planet.

They remained anoxic for two to four million years.

The researchers suggest that such anoxic conditions may have been commonplace over a much broader interval of time.

"This work is important at many levels, from the steady growth of atmospheric oxygen in the last 600 million years, to the potential impact of oxygen level fluctuations on early evolution and diversification of life," said Enriqueta Barrera, program director in the National Science Foundation (NSF)'s Division of Earth Sciences, which funded the research.

The researchers argue that such fluctuations in the oceans' oxygen levels are the most likely explanation for what drove the explosive diversification of life forms and rapid evolutionary turnover that marked the Cambrian Period some 540 to 488 million years ago.

They report in the journal Nature that the transition from a generally oxygen-rich ocean during the Cambrian to the fully oxygenated ocean we have today was not a simple turn of the switch, as has been widely accepted until now.

"Our research shows that the ocean fluctuated between oxygenation states 499 million years ago," said paper co-author Timothy Lyons, a UCR biogeochemist and co-author of the paper.

"Such fluctuations played a major, perhaps dominant, role in shaping the early evolution of animals on the planet by driving extinction and clearing the way for new organisms to take their place."

Oxygen is necessary for animal survival, but not for the many bacteria that thrive in, and even require, oxygen-free conditions.

Understanding how the environment changed over the course of Earth's history can give scientists clues to how life evolved and flourished during the critical, very early stages of animal evolution.

"Life and the environment in which it lives are intimately linked," said Benjamin Gill, the first author of the paper, a biogeochemist at UCR, and currently a postdoctoral researcher at Harvard University.

When the ocean's oxygenation states changed rapidly in Earth's history, some organisms were not able to cope.

Oceanic oxygen affects cycles of other biologically important elements such as iron, phosphorus and nitrogen.

"Disruption of these cycles is another way to drive biological crises," Gill said. "A switch to an oxygen-poor state of the ocean can cause major extinction of species."

The researchers are now working to find an explanation for why the oceans became oxygen-poor about 499 million years ago.

"We have the 'effect,' but not the 'cause,'" said Gill.

"The oxygen-poor state persisted likely until the enhanced burial of organic matter, originally derived from oxygen-producing photosynthesis, resulted in the accumulation of more oxygen in the atmosphere and ocean

"As a kind of negative feedback, the abundant burial of organic material facilitated by anoxia may have bounced the ocean to a more oxygen-rich state."

Understanding past events in Earth's distant history can help refine our view of changes happening on the planet now, said Gill.

"Today, some sections of the world's oceans are becoming oxygen-poor--the Chesapeake Bay (surrounded by Maryland and Virginia) and the so-called 'dead zone' in the Gulf of Mexico are just two examples," he said.

"We know the Earth went through similar scenarios in the past. Understanding the ancient causes and consequences can provide essential clues to what the future has in store for our oceans."

The team examined the carbon, sulfur and molybdenum contents of rocks they collected from localities in the United States, Sweden, and Australia.

Combined, these analyses allowed the scientists to infer the amount of oxygen present in the ocean at the time the limestones and shales were deposited.

By looking at successive rock layers, they were able to compile the biogeochemical history of the ocean.

Lyons and Gill were joined in the research by Seth Young of Indiana University, Bloomington; Lee Kump of Pennsylvania State University; Andrew Knoll of Harvard University; and Matthew Saltzman of Ohio State University.

(Photo: Ben Gill, UC-Riverside and Harvard University)

National Science Foundation

NEW GLASS TOPS STEEL IN STRENGTH AND TOUGHNESS

Glass stronger and tougher than steel? A new type of damage-tolerant metallic glass, demonstrating a strength and toughness beyond that of any known material, has been developed and tested by a collaboration of researchers with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) and the California Institute of Technology. What’s more, even better versions of this new glass may be on the way.

“These results mark the first use of a new strategy for metallic glass fabrication and we believe we can use it to make glass that will be even stronger and more tough,” says Robert Ritchie, a materials scientist who led the Berkeley contribution to the research.

The new metallic glass is a microalloy featuring palladium, a metal with a high “bulk-to-shear” stiffness ratio that counteracts the intrinsic brittleness of glassy materials.

“Because of the high bulk-to-shear modulus ratio of palladium-containing material, the energy needed to form shear bands is much lower than the energy required to turn these shear bands into cracks,” Ritchie says. “The result is that glass undergoes extensive plasticity in response to stress, allowing it to bend rather than crack.”
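
As a rough quantitative aside (not taken from the paper): for isotropic solids, the bulk-to-shear ratio B/G, and the closely related Poisson's ratio, are commonly used indicators of whether a material tends to flow rather than crack. The sketch below uses assumed, illustrative moduli only.

```python
# Illustrative numbers only: rough, assumed elastic moduli for a Pd-rich
# metallic glass versus a stiffer-in-shear, more brittle glass. These are
# not values reported in the study.
def pugh_ratio(bulk_modulus, shear_modulus):
    """Bulk-to-shear stiffness ratio B/G; higher values favour plastic flow."""
    return bulk_modulus / shear_modulus

def poisson_ratio(bulk_modulus, shear_modulus):
    """Poisson's ratio for an isotropic solid: nu = (3B - 2G) / (2(3B + G))."""
    return (3 * bulk_modulus - 2 * shear_modulus) / (2 * (3 * bulk_modulus + shear_modulus))

for name, B, G in [("Pd-based glass (assumed moduli, GPa)", 180.0, 35.0),
                   ("more brittle glass (assumed moduli, GPa)", 180.0, 75.0)]:
    print(f"{name}: B/G = {pugh_ratio(B, G):.1f}, nu = {poisson_ratio(B, G):.2f}")
# A higher B/G (and Poisson's ratio) means shear flow is energetically
# cheaper than cracking, which is the mechanism Ritchie describes.
```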

Ritchie, who holds joint appointments with Berkeley Lab’s Materials Sciences Division and the University of California (UC) Berkeley’s Materials Science and Engineering Department, is one of the co-authors of a paper describing this research published in the journal Nature Materials under the title “A Damage-Tolerant Glass.”

Co-authoring the Nature Materials paper were Marios Demetriou (who actually made the new glass), Maximilien Launey, Glenn Garrett, Joseph Schramm, Douglas Hofmann and William Johnson of Caltech, one of the pioneers in the field of metallic glass fabrication.

Glassy materials have a non-crystalline, amorphous structure that makes them inherently strong but invariably brittle. Whereas the crystalline structure of metals can provide microstructural obstacles (inclusions, grain boundaries, etc.) that inhibit cracks from propagating, there’s nothing in the amorphous structure of a glass to stop crack propagation. The problem is especially acute in metallic glasses, where single shear bands can form and extend throughout the material, leading to catastrophic failures at vanishingly small strains.

In earlier work, the Berkeley-Caltech collaboration fabricated a metallic glass, dubbed “DH3,” in which the propagation of cracks was blocked by the introduction of a second, crystalline phase of the metal. This crystalline phase, which took the form of dendritic patterns permeating the amorphous structure of the glass, erected microstructural barriers to prevent an opened crack from spreading. In this new work, the collaboration has produced a pure glass material whose unique chemical composition acts to promote extensive plasticity through the formation of multiple shear bands before the bands turn into cracks.

“Our game now is to try and extend this approach of inducing extensive plasticity prior to fracture to other metallic glasses through changes in composition,” Ritchie says. “The addition of the palladium provides our amorphous material with an unusual capacity for extensive plastic shielding ahead of an opening crack. This promotes a fracture toughness comparable to those of the toughest materials known. The rare combination of toughness and strength, or damage tolerance, extends beyond the benchmark ranges established by the toughest and strongest materials known.”

The initial samples of the new metallic glass were microalloys of palladium with phosphorus, silicon and germanium that yielded glass rods approximately one millimeter in diameter. Adding silver to the mix enabled the Caltech researchers to expand the thickness of the glass rods to six millimeters. The size of the metallic glass is limited by the need to rapidly cool or “quench” the liquid metals to achieve the final amorphous structure.

“The rule of thumb is that to make a metallic glass we need to have at least five elements so that when we quench the material, it doesn’t know what crystal structure to form and defaults to amorphous,” Ritchie says.

The new metallic glass was fabricated by co-author Demetriou at Caltech in the laboratory of co-author Johnson. Characterization and testing was done at Berkeley Lab by Ritchie’s group.

“Traditionally strength and toughness have been mutually exclusive properties in materials, which makes these new metallic glasses so intellectually exciting,” Ritchie says. “We’re bucking the trend here and pushing the envelope of the damage tolerance that’s accessible to a structural metal.”

(Photo: Roy Kaltschmidt, Berkeley Lab Public Affairs)

Lawrence Berkeley National Laboratory

“TIMING IS EVERYTHING” IN ENSURING HEALTHY BRAIN DEVELOPMENT

Connections made in our brains during the early years of our life could be the key to healthy mental development, Newcastle University scientists have found.

Work published recently shows that brain cells need to create links early on in their existence, when they are physically close together, to ensure successful connections across the brain throughout life.

In people, these long-distance connections enable the left and right side of the brain to communicate and integrate different kinds of information such as sound and vision. A change in the number of these connections has been found in many developmental brain disorders including autism, epilepsy and schizophrenia.

The Newcastle University researchers Dr Marcus Kaiser and Mrs Sreedevi Varier carried out a sophisticated computer analysis relating birth-time associated data to connectivity patterns of nerve cells in the roundworm, Caenorhabditis elegans. They demonstrated that when two nerve cells develop close together, they form a connection which then stretches out when the two nerve cells move apart as the organism grows. This creates a link across the brain known as a long-distance connection.

Publishing in PLoS Computational Biology, the researchers have demonstrated for the first time that this is the most frequent successful mechanism by which long distance connections are made.

Dr Marcus Kaiser, from the Institute of Neuroscience and the School of Computing Science at Newcastle University, says: “You can draw parallels with childhood friendships carrying on into adulthood. For example, two children living close to each other could become friends through common activities like school or playing at the park. The friendship can last even if one of them moves further away, while, beginning a lasting friendship with someone already far away, is much more difficult.”

Mrs Sreedevi Varier adds: “Although it’s too early for this research to have direct clinical applications, it adds to our understanding of the structural changes in the brain and raises some interesting questions as to how these connections can become faulty. In further studying this mechanism, we may eventually contribute towards insights into the diagnosis and possibly the treatment of patients with epilepsy and autism.”

It has long been understood that the first connections in the brain created in the early days of development can be formed over long distances using guidance signals to direct nerve fibres to their correct positions – known as axonal guidance. Subsequently, other connections can follow those pioneer fibres to a target location creating connections between distant parts of the brain. Through these long-distance connections different kinds of information, such as sound and vision, can be integrated.

This EPSRC-funded research showed that most neurons formed a connection early on in their development, when they were physically close together, potentially giving them more time to host and establish connections. These developed into long-distance connections, the two cells pulling apart as the organism grew larger. Studying the connections in the neuronal network of the roundworm Caenorhabditis elegans, the Newcastle scientists - who are also affiliated with Seoul National University, Korea - found that most neurons with a long-distance connection had developed in this way.
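
The flavour of such an analysis can be sketched as follows. This is a simplified, hypothetical reconstruction, not the authors' code: the neuron positions, wiring and thresholds below are invented, whereas the published study used real C. elegans birth times, positions and connectivity data.

```python
# Simplified, hypothetical sketch of the kind of analysis described above.
# Positions, wiring and thresholds are assumptions for illustration only.
import math

# (x, y) positions at around the birth time of the later-born partner, and in the adult
neurons = {
    "ASHL": {"birth_pos": (0.10, 0.02), "adult_pos": (0.12, 0.05)},
    "AVAL": {"birth_pos": (0.11, 0.03), "adult_pos": (0.60, 0.10)},
    "PVCL": {"birth_pos": (0.13, 0.02), "adult_pos": (0.95, 0.08)},
}
connections = [("ASHL", "AVAL"), ("AVAL", "PVCL")]   # hypothetical wiring

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

LONG = 0.3    # adult separation counted as "long-distance" (assumed)
CLOSE = 0.1   # birth-time separation counted as "close" (assumed)

for pre, post in connections:
    d_birth = dist(neurons[pre]["birth_pos"], neurons[post]["birth_pos"])
    d_adult = dist(neurons[pre]["adult_pos"], neurons[post]["adult_pos"])
    if d_adult > LONG:
        origin = ("formed while close, then stretched" if d_birth < CLOSE
                  else "formed at a distance")
        print(f"{pre}->{post}: adult span {d_adult:.2f}, {origin}")
```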

This new mechanism differs from the previous model for long-distance connectivity. An axon is a fibre that extends from one nerve cell and, after travelling through the tissue, can contact several other nerve cells. Normally, axons would grow in a straight line. For several targets, however, the axon has to travel around obstacles, as a straight connection is not possible. In such cases, cells along the way can release guidance cues that either attract or repel the travelling axon. One example of bent fibres is the visual pathway, which at several points takes a sharp 90-degree turn to arrive at the correct target position.

Instead, establishing potential links early on, when neurons are spatially nearby, might reduce the need for such guidance cues. This lowers the cost of producing guidance cues and, potentially, of genetically encoding a wider range of them. An early mechanism also opens up the possibility that the changes in long-distance brain connectivity observed in children and young adults with brain disorders arise earlier during brain development than previously thought. These are questions that the team continues to work on through data analysis and computer simulations of brain development.

(Photo: Newcastle U.)

Newcastle University

GENE HELPS PLANTS USE LESS WATER WITHOUT BIOMASS LOSS

Purdue University researchers have found a genetic mutation that allows a plant to better endure drought without losing biomass, a discovery that could reduce the amount of water required for growing plants and help plants survive and thrive in adverse conditions.

Plants can naturally control the opening and closing of stomata, pores that take in carbon dioxide and release water. During drought conditions, a plant might close its stomata to conserve water. By doing so, however, the plant also reduces the amount of carbon dioxide it can take in, which limits photosynthesis and growth.

Mike Mickelbart, an assistant professor of horticulture; Mike Hasegawa, a professor of horticulture; and Chal Yul Yoo, a horticulture graduate student, found that a genetic mutation in the research plant Arabidopsis thaliana reduces the number of stomata. But instead of limiting carbon dioxide intake, the gene creates a beneficial equilibrium.

"The plant can only fix so much carbon dioxide. The fewer stomata still allow for the same amount of carbon dioxide intake as a wild type while conserving water," said Mickelbart, whose results were published in the early online version of the journal The Plant Cell. "This shows there is potential to reduce transpiration without a yield penalty."

Mickelbart and Yoo used an infrared gas analyzer to determine the amount of carbon dioxide taken in and water lost in the Arabidopsis mutant. Carbon dioxide is pumped into a chamber with the plant and the analyzer measures the amount left after a plant has started to take up the gas. A similar process measures water lost through transpiration, in which water is released from a plant's leaves.

Analysis showed that the plant, which has a mutant form of the gene GTL1, did not reduce carbon dioxide intake but did have a 20 percent reduction in transpiration. The plant had the same biomass as a wild type of Arabidopsis when its shoot dry weight was measured.
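
Water-use efficiency is commonly defined as carbon assimilated per unit of water transpired (A/E). A rough back-of-the-envelope sketch of what unchanged carbon intake plus 20 percent less transpiration implies is given below; the absolute rates are invented placeholders, and only the 20 percent figure comes from the article.

```python
# Back-of-the-envelope illustration of what a 20% drop in transpiration
# with unchanged CO2 uptake means for water-use efficiency (WUE = A / E).
# The absolute rates are invented placeholders; only the 20% reduction
# comes from the article.

A_wild = 10.0          # CO2 assimilation rate, arbitrary units (assumed)
E_wild = 4.0           # transpiration rate, arbitrary units (assumed)

A_mutant = A_wild            # same carbon intake as the wild type
E_mutant = E_wild * 0.80     # 20 percent less water lost

wue_wild = A_wild / E_wild
wue_mutant = A_mutant / E_mutant

print(f"WUE wild type:   {wue_wild:.2f}")
print(f"WUE GTL1 mutant: {wue_mutant:.2f}")
print(f"Improvement: {100 * (wue_mutant / wue_wild - 1):.0f}%")   # ~25%
```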

"The decrease in transpiration leads to increased drought tolerance in the mutant plants," Yoo said. "They will hold more water in their leaves during drought stress."

Of the 20 genes known to control stomata, SDD1 was highly expressed in the mutant. SDD1 is a gene that is responsible for regulating the number of stomata on leaves. In the mutant, with GTL1 not functioning, SDD1 is highly expressed, which results in the development of fewer stomata.

Mickelbart said the finding is important because it opens the possibility that there is a natural way to improve crop drought tolerance without decreasing biomass or yield. He said the next step in the research is to determine the role of GTL1 in a crop plant.

Purdue University

STUDY FINDS ENERGY LIMITS GLOBAL ECONOMIC GROWTH

A study that relates global energy use to economic growth, published in the January issue of BioScience, finds strong correlations between these two measures both among countries and within countries over time. The research leads the study's authors to infer that energy use limits economic activity directly. They conclude that an "enormous" increase in energy supply will be required to meet the demands of projected world population growth and lift the developing world out of poverty without jeopardizing standards of living in most developed countries.

The study, which used a macroecological approach, was based on data from the International Energy Agency and the World Resources Institute. It was conducted by a team of ecologists led by James H. Brown of the University of New Mexico. The team found the same sort of relationship between energy consumption per person and gross domestic product per person as is found between metabolism and body weight in animals. Brown's group suggests the similarity is real: Cities and countries, like animals, have metabolisms that must burn fuel to sustain themselves and grow. This analogy, together with the data and theory, persuades the BioScience authors that the linkage between energy use and economic activity is causal, although other factors must also be in play to explain the variability in the data.
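
In animals, the metabolism-body mass relationship is a power law, and the analogous country-level comparison amounts to fitting a straight line on log-log axes. Here is a minimal sketch of that kind of fit; the data points are synthetic stand-ins, not the International Energy Agency and World Resources Institute figures the authors analysed, and the paper's exact statistical procedure is not reproduced here.

```python
# Minimal sketch of the power-law (log-log) fit behind the metabolic analogy.
# The data points are synthetic stand-ins, not the IEA/WRI figures used in
# the BioScience study.
import numpy as np

gdp_per_capita = np.array([1_000, 5_000, 15_000, 30_000, 45_000])   # USD (synthetic)
energy_per_capita = np.array([15, 55, 130, 230, 320])               # GJ/yr (synthetic)

# Fit log(E) = b * log(G) + log(c)  =>  E = c * G**b
b, log_c = np.polyfit(np.log(gdp_per_capita), np.log(energy_per_capita), 1)
print(f"scaling exponent b = {b:.2f}, prefactor c = {np.exp(log_c):.3f}")

# The exponent b plays the same role as the metabolic scaling exponent
# relating an animal's metabolic rate to its body mass.
```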

The study goes on to show that variables relating to standard of living, such as the proportion of doctors in a population, the number of televisions per person, and the infant mortality rate, are also correlated with both energy consumption per person and gross domestic product per person. These correlations lead the authors to their conclusions about the increases in energy production necessary to sustain a still-growing world population without drops in living standards. To support the world population expected in 2050 at the current US standard of living would require 16 times the current global energy use, for example. Noting that 85 percent of humankind's energy now comes from fossil fuels, the BioScience authors point out that efforts to develop alternative energy sources face economic problems of diminishing returns, and they reject the view of many economists that technological innovation can circumvent resource shortages.

American Institute of Biological Sciences

EAT YOUR GREENS TO IMPROVE YOUR LOOKS

Getting your five a day will do more for your looks than a suntan, according to scientists who have found that our appearance really does show that you are what you eat.

Describing their findings in the journal Evolution and Human Behavior, the team of researchers show that eating plenty of fruit and vegetables is by far the most effective way to achieve a healthy, golden glow.

“Most people in the West think that the best way to improve your skin colour is to get a suntan,” said Dr Ian Stephen, lead researcher on the project and an ESRC post-doctoral fellow in the Department of Experimental Psychology at the University of Bristol, “but our research shows that eating lots of fruit and vegetables is actually more effective.”

The team, working at the Perception Lab at the University of St Andrews in Scotland, first assessed the skin colour of people in relation to their diet. Those who ate more portions of fruit and vegetables a day were found to have a more golden, yellow skin colour. Further analyses using a scientific instrument called a spectrophotometer measured the way that light in different parts of the spectrum is absorbed by the skin, revealing that those with a healthy glow had a higher presence of carotenoids, which are yellow and red antioxidants thought to play a role in the immune system and fertility. Carotenoids are commonly found in fruit and vegetables such as yellow and red peppers, spinach, apricots and melons.

In the second part of the study, the team used specialist computer software to manipulate the skin colour on the images of 51 faces to simulate more and less carotenoids and more and less suntan. Participants were then asked to adjust the skin colour to make the faces look as healthy as possible. Given the choice between skin colour enhanced by suntan and skin colour enhanced by carotenoids, participants preferred the carotenoid skin colour.

“Our study shows that not only do people use colour cues to judge how healthy other individuals are, but they are accurate when they make those judgements,” said Prof Perrett, who heads the Perception Lab. “This is important because evolution would favour individuals who choose to form alliances or mate with healthier individuals over unhealthy individuals.”

The study is the first to reveal such striking similarities between humans and many other species. For example, the bright yellow beaks and feathers of many birds can be thought of as adverts showing how healthy a male bird is. Females of these species prefer to mate with more brightly coloured males. This bright colouration in birds is caused by the same antioxidant carotenoids that drive the effect in humans.

“The bright yellow ornaments of birds demonstrate that the bearer has such a strong immune system and healthy reproductive system that he has plenty of these valuable antioxidant carotenoids left over to use in ornaments to advertise himself to females,” said Dr Stephen. “Our work suggests that the carotenoid colouration of human skin may represent a similar advertisement of health and fertility.”

While this study describes work in Caucasian faces, the paper also describes a study that suggests this phenomenon may exist across cultures, since similar preferences for skin yellowness were found in an African population.

(Photo: Bristol U.)

University of Bristol

CLIMATE CHANGE TO CONTINUE TO THE YEAR 3000 IN BEST CASE SCENARIOS

New research indicates that rising CO2 levels in the Earth’s atmosphere will have effects on the climate that cannot be stopped for at least the next 1000 years, leading researchers to predict a collapse of the West Antarctic ice sheet by the year 3000 and an eventual rise in the global sea level of at least four metres.

The study, published in the Jan. 9 Advanced Online Publication of the journal Nature Geoscience, is the first full climate model simulation to make predictions out to 1000 years from now. It is based on best-case, ‘zero-emissions’ scenarios constructed by a team of researchers from the Canadian Centre for Climate Modelling and Analysis (an Environment Canada research lab at the University of Victoria) and the University of Calgary.

“We created ‘what if’ scenarios,” says Dr. Shawn Marshall, Canada Research Chair in Climate Change and University of Calgary geography professor. “What if we completely stopped using fossil fuels and put no more CO2 in the atmosphere? How long would it then take to reverse current climate change trends and will things first become worse?” The research team explored zero-emissions scenarios beginning in 2010 and in 2100.

The Northern Hemisphere fares better than the south in the computer simulations, with patterns of climate change reversing within the 1000-year timeframe in places like Canada. At the same time parts of North Africa experience desertification as land dries out by up to 30 percent, and ocean warming of up to 5°C off of Antarctica is likely to trigger widespread collapse of the West Antarctic ice sheet, a region the size of the Canadian prairies.

Researchers hypothesize that one reason for the variability between the North and South is the slow movement of ocean water from the North Atlantic into the South Atlantic. “The global ocean and parts of the Southern Hemisphere have much more inertia, such that change occurs more slowly,” says Marshall. “The inertia in intermediate and deep ocean currents driving into the Southern Atlantic means those oceans are only now beginning to warm as a result of CO2 emissions from the last century. The simulation showed that warming will continue rather than stop or reverse on the 1000-year time scale.”

Wind currents in the Southern Hemisphere may also have an impact. Marshall says that winds in the global south tend to strengthen and stay strong without reversing. “This increases the mixing in the ocean, bringing more heat from the atmosphere down and warming the ocean.”

Researchers will next begin to investigate more deeply the impact of atmosphere temperature on ocean temperature to help determine the rate at which West Antarctica could destabilize and how long it may take to fully collapse into the water.

University of Calgary

DWARF GALAXY HARBORS SUPERMASSIVE BLACK HOLE

The surprising discovery of a supermassive black hole in a small nearby galaxy has given astronomers a tantalizing look at how black holes and galaxies may have grown in the early history of the Universe. Finding a black hole a million times more massive than the Sun in a star-forming dwarf galaxy is a strong indication that supermassive black holes formed before the buildup of galaxies, the astronomers said.

The galaxy, called Henize 2-10 and located 30 million light-years from Earth, has been studied for years and is forming stars very rapidly. Irregularly shaped and about 3,000 light-years across (compared to 100,000 for our own Milky Way), it resembles what scientists think were some of the first galaxies to form in the early Universe.

"This galaxy gives us important clues about a very early phase of galaxy evolution that has not been observed before," said Amy Reines, a Ph.D. candidate at the University of Virginia.

Supermassive black holes lie at the cores of all "full-sized" galaxies. In the nearby Universe, there is a direct relationship -- a constant ratio -- between the masses of the black holes and those of the central "bulges" of the galaxies, leading astronomers to conclude that the black holes and bulges affected each other's growth.

Two years ago, an international team of astronomers found that black holes in young galaxies in the early Universe were more massive than this ratio would indicate. This, they said, was strong evidence that black holes developed before their surrounding galaxies.

"Now, we have found a dwarf galaxy with no bulge at all, yet it has a supermassive black hole. This greatly strengthens the case for the black holes developing first, before the galaxy's bulge is formed," Reines said.

Reines, along with Gregory Sivakoff and Kelsey Johnson of the University of Virginia and the National Radio Astronomy Observatory (NRAO), and Crystal Brogan of the NRAO, observed Henize 2-10 with the National Science Foundation's Very Large Array radio telescope and with the Hubble Space Telescope. They found a region near the center of the galaxy that strongly emits radio waves with characteristics of those emitted by super-fast "jets" of material spewed outward from areas close to a black hole.

They then searched images from the Chandra X-Ray Observatory that showed this same, radio-bright region to be strongly emitting energetic X-rays. This combination, they said, indicates an active, black-hole-powered, galactic nucleus.

"Not many dwarf galaxies are known to have massive black holes," Sivakoff said.

While central black holes of roughly the same mass as the one in Henize 2-10 have been found in other galaxies, those galaxies all have much more regular shapes. Henize 2-10 differs not only in its irregular shape and small size but also in its furious star formation, concentrated in numerous, very dense "super star clusters."

"This galaxy probably resembles those in the very young Universe, when galaxies were just starting to form and were colliding frequently. All its properties, including the supermassive black hole, are giving us important new clues about how these black holes and galaxies formed at that time," Johnson said.

The astronomers reported their findings in the January 9 online edition of Nature, and at the American Astronomical Society's meeting in Seattle, WA.

(Photo: Reines, et al., David Nidever, NRAO/AUI/NSF, NASA)

National Radio Astronomy Observatory

CONSUMERS PREFER PRODUCTS WITH FEW, AND MOSTLY MATCHING, COLORS

Most people like to play it safe when combining colors for an article of clothing or outfit, a new study suggests.

When consumers were asked to choose colors for seven different parts of an athletic shoe, they tended to pick identical or similar colors for nearly every element.

They usually avoided contrasting or even moderately different color combinations.

A red and yellow athletic shoe? Not going to happen. Blue and grey? That’s more like it.

This is one of the first studies to show how consumers would choose to combine colors in a realistic shopping situation, said Xiaoyan Deng, lead author of the study and assistant professor of marketing at Ohio State University’s Fisher College of Business.

The results support the theory that people like their color combinations to be relatively simple and coherent, rather than complex and distinct.

“Most people like to match colors very closely,” Deng said. “The further the distance between two colors, the less likely people are to choose them together.”

However, there was one exception. A large minority of people chose to highlight a relatively small signature part of the shoe with a contrasting color far from the colors used in other elements.

Overall, though, the study showed that people prefer a simple design with few colors. While participants could choose from up to 16 colors for different parts of the shoe, the average person only used about four colors on the entire shoe they designed.

“Using a small number of colors simplifies the final design and reduces the effort it takes to design the shoe,” Deng said.

Deng conducted the study with Sam Hui of the Stern School of Business at New York University and J. Wesley Hutchinson of the Wharton School at the University of Pennsylvania. It was published in a recent issue of the Journal of Consumer Psychology.

The study is important, Deng said, because it is one of the first to show, from a marketing perspective, people’s preferences for color combinations. Most other research on color preferences has taken a psychological perspective and simply asked people whether they thought two color chips would go well together.

“We had a very realistic situation in the study where consumers could clearly show how they would combine colors in real life,” Deng said.

The study involved 142 participants who agreed to go to the publicly available NIKEiD website and create a Nike “shox” shoe for themselves. At the site, they chose colors for seven elements of the shoe: the base, secondary, swoosh, accent, lace, lining and shox. For each element, they could choose from six to 12 colors.

The researchers analyzed the color choices made by the participants and measured the similarity of chosen colors based on a widely accepted “color space” model.
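
In practice, similarity in a perceptual color space is usually expressed as the straight-line (Euclidean) distance between two colors' coordinates. The article does not name the specific model used, so the sketch below assumes CIELAB coordinates, and the particular colors and values are illustrative guesses rather than data from the study.

```python
# Minimal sketch: color "distance" as Euclidean distance in CIELAB space.
# The L*, a*, b* coordinates are illustrative guesses for an "ice blue",
# a "twilight blue" and a red, not values from the study.
import math

def delta_e(lab1, lab2):
    """Euclidean distance (Delta E*ab) between two CIELAB colors."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

ice_blue = (80.0, -5.0, -15.0)        # assumed coordinates
twilight_blue = (55.0, 0.0, -30.0)    # assumed coordinates
red = (50.0, 65.0, 45.0)              # assumed coordinates

print(f"ice blue vs twilight blue: {delta_e(ice_blue, twilight_blue):.1f}")
print(f"ice blue vs red:           {delta_e(ice_blue, red):.1f}")
# The small first distance and the much larger second one mirror the
# finding that participants mostly paired closely related colors.
```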

Results showed there was a strong tendency to use identical colors in more than one of the seven different elements of the shoe, Deng said. When the participants did use different colors, they were almost always very closely related. For instance, “ice blue” might be combined with “twilight blue.”

But a large minority of people did choose to highlight one element of the shoe by making it a color that was unrelated to the others used, offering a strong contrast. Often, people chose this contrasting color for the “shox” element – columns in the heel and mid-section of the shoe that provide cushioning while running.

These shox are a unique component of athletic shoes and a signature component of this Nike product line.

“It seems that some consumers wanted this signature part of the shoe to really stand out from the rest,” Deng said. “It may be that they saw the rest of the shoe as a background for this one contrasting color. But we need to study that more.”

Deng said it was significant that consumers used only about four different colors in the shoe. The researchers calculated that they would expect consumers to use 5.48 colors per shoe, based on the conditions in this study.

“We found that consumers preferred to use just a small palette of colors in their shoe and closely matched colors within this palette,” she said.

But does this study really capture the participants’ general feelings about color combinations, or are the results only applicable to these self-designed shoes?

To test this, the researchers asked participants to rate how much they liked four Nike-designed shoes available on the website.

The researchers then created a “color coordination index” for each Nike-designed shoe that allowed them to relate the level of similarity between colors of a specific Nike-designed shoe to participants’ shoe preferences.

The results showed that there was a strong association between the color coordination index and the liking for Nike-designed shoes. This suggests the study really did reveal how participants liked to combine colors, Deng said.

Deng said the findings suggest that Nike may be offering more color combinations for each element of the shoe than consumers really need.

“If a consumer chooses a reddish color for one element of the shoe, he or she will probably only use colors closely related to red for the rest of the shoe,” she said.

“However, it is not the case that you can offer the same small palette of colors for all consumers. Each consumer may have a different idea of what color they want to emphasize. But once they make that choice, their palette tends to be restricted.”

(Photo: OSU)

Ohio State University
