Thursday, December 2, 2010



The National Nuclear Security Administration's National Ignition Facility (NIF) has set world records for neutron yield from laser-driven fusion fuel capsules and laser energy delivered to inertial confinement fusion (ICF) targets.

The neutron yield record was set on Sunday, Oct. 31, when the NIF team fired 121 kilojoules of ultraviolet laser light into a glass target filled with deuterium and tritium (DT) gas. The shot produced approximately 3 × 10^14 (300 trillion) neutrons, the highest neutron yield to date by an inertial confinement fusion facility. Neutrons are produced when the nuclei of deuterium and tritium (isotopes of hydrogen) fuse, creating a helium nucleus and releasing a high-energy neutron.
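As a rough scale check (not an official NIF figure), each D-T reaction releases about 17.6 MeV; assuming one detected neutron per reaction, the reported yield corresponds to far less fusion energy than the laser delivered:

```python
# Back-of-envelope energy bookkeeping for the Oct. 31 shot (illustrative only).
# Assumes each neutron tags one D-T reaction releasing 17.6 MeV, the standard
# value for D + T -> He-4 (3.5 MeV) + n (14.1 MeV).
MEV_TO_JOULES = 1.602176634e-13  # joules per MeV

neutron_yield = 3e14      # reported neutron count
fusion_energy_j = neutron_yield * 17.6 * MEV_TO_JOULES
laser_energy_j = 121e3    # 121 kilojoules of ultraviolet laser light

print(f"fusion energy: {fusion_energy_j:.0f} J")        # roughly 850 J
print(f"gain: {fusion_energy_j / laser_energy_j:.1e}")  # well below 1
```

The ratio of fusion energy out to laser energy in is far below the "energy gain" threshold of 1 that the ignition campaign is aiming for, which is consistent with this shot being a calibration rather than an ignition attempt.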

On Tuesday, Nov. 2, the team fired 1.3 megajoules of ultraviolet light into a cryogenically cooled cylinder called a hohlraum containing a surrogate fusion target known as a symmetry capsule, or symcap. This was the highest-energy laser shot to date and the first test of hohlraum temperature and capsule symmetry under conditions designed to produce fusion ignition and energy gain. Preliminary analysis indicated that the hohlraum absorbed nearly 90 percent of the laser power and reached a peak radiation temperature of 300 electron volts (about six million degrees Fahrenheit) -- making this the highest X-ray drive energy ever achieved in an indirect drive ignition target.

The experiments followed closely on the heels of NIF's first integrated ignition experiment on Sept. 29, which demonstrated the integration of the complex systems required for an ignition campaign including a target physics design, the laser, target fabrication, cryogenic fuel layering and target positioning, target diagnostics, control and data systems, and tritium handling and personnel and environmental protection systems. In that shot, one megajoule of ultraviolet laser energy was fired into a cryogenically layered capsule filled with a mixture of tritium, hydrogen and deuterium (THD), tailored to enable the most comprehensive physics results.

"The results of all of these experiments are extremely encouraging," said NIF Director Ed Moses, "and they give us great confidence that we will be able to achieve ignition conditions in deuterium-tritium fusion targets."

NIF, the world's largest and highest-energy laser system, is located at Lawrence Livermore National Laboratory (LLNL) in California. NIF researchers are currently conducting a series of "tuning" shots to determine the optimal target design and laser parameters for high-energy ignition experiments with fusion fuel in the coming months.

When NIF's 192 powerful lasers fire, more than one million joules of ultraviolet energy are focused into the ends of the pencil-eraser-sized hohlraum, a technique known as "indirect drive." The laser irradiation generates a uniform bath of X-rays inside the hohlraum that causes the hydrogen fuel in the target capsule to implode symmetrically, resulting in a controlled thermonuclear fusion reaction. The reaction happens so quickly, in just a few billionths of a second, that the fuel's inertia prevents it from blowing apart before fusion "burn" spreads through the capsule -- hence the term inertial confinement fusion. In ignition experiments, more energy will be released than the amount of laser energy required to initiate the reaction, a condition known as energy gain. NIF researchers expect to achieve a self-sustaining fusion burn reaction with energy gain within the next two years.

The experimental program to achieve fusion and energy gain, known as the National Ignition Campaign, is a partnership among LLNL, Los Alamos National Laboratory, the Laboratory for Laser Energetics at the University of Rochester, General Atomics of San Diego, Calif., Sandia National Laboratories, Massachusetts Institute of Technology, and other national and international partners.

In the Oct. 31 "neutron calibration" shot, NIF's lasers were fired directly onto a DT-filled glass target, as opposed to the indirect-drive geometry used in NIF's DT and THD experiments. The purpose of the shot was to calibrate and test the performance of NIF's extensive neutron diagnostic equipment.

The capsule used in the Nov. 2 symcap experiment has the same two-millimeter outer diameter doped shell as an ignition capsule, but replaces the DT fuel layer with an equivalent mass of material from the outer shell to mimic the capsule's hydrodynamic behavior. Achieving a highly symmetrical compression of the fuel capsule is a key requirement for NIF to achieve its goal of fusion ignition.

(Photo: LLNL)

Lawrence Livermore National Laboratory (LLNL)


The eerie music in the movie theater swells; the roller coaster crests and begins its descent; something goes bump in the night. Suddenly, you're scared: your heart thumps, your stomach clenches, your throat tightens, your muscles freeze you in place. But fear doesn't come from your heart, your stomach, your throat, or your muscles. Fear begins in your brain, and it is there—specifically in an almond-shaped structure called the amygdala—that it is controlled, processed, and let out of the gate to kick off the rest of the fear response.

In this week's issue of the journal Nature, a research team led by scientists at the California Institute of Technology (Caltech) has taken an important step toward understanding just how this kickoff occurs by beginning to dissect the neural circuitry of fear. In their paper, these scientists—led by David J. Anderson, the Benzer Professor of Biology at Caltech and a Howard Hughes Medical Institute investigator—describe a microcircuit in the amygdala that controls, or "gates," the outflow of fear from that region of the brain.

The microcircuit in question, Anderson explains, contains two subtypes of neurons that are antagonistic—have opposing functions—and that control the level of fear output from the amygdala by acting like a seesaw.

"Imagine that one end of a seesaw is weighted and normally sits on a garden hose, preventing water—in this analogy, the fear impulse—from flowing through it," says Anderson. "When a signal that triggers a fear response arrives, it presses down on the opposite end of the seesaw, lifting the first end off the hose and allowing fear, like water, to flow." Once the flow of fear has begun, that impulse can be transmitted to other regions of the brain that control fearful behavior, such as freezing in place.

"Now that we know about this 'seesaw' mechanism," he adds, "it may someday provide a new target for developing more specific drugs for treating fear-based psychiatric illnesses like post-traumatic stress disorder, phobias, or anxiety disorders."

The key to understanding this delicate mechanism, Anderson says, was in uncovering "markers"—genes that would identify, and allow the scientists to discriminate between, the different neuronal cell types in the amygdala. Anderson's group, led by postdoctoral fellow Wulf Haubensak, found its marker in a gene that encodes an enzyme known as protein kinase C-delta (PKCδ). PKCδ is expressed in about half the neurons within a subdivision of the amygdala's central nucleus, the part of the amygdala that controls fear output.

Along with fellow postdocs Prabhat Kunwar and Haijiang Cai, Haubensak was able to fluorescently tag neurons in which the protein kinase is expressed; this allowed the researchers to map the connections of these neurons, as well as to monitor and manipulate their electrical activity.

The studies, Anderson says, "revealed that PKCδ+ neurons form one end of a seesaw, by making connections with another population of neurons in the central nucleus that do not express the enzyme, which are called PKCδ− neurons." They also showed that the kinase-positive neurons inhibit outflow from the amygdala—proving that they act as the end of the seesaw that rests on the garden hose.
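The seesaw can be caricatured as a toy rate model; the parameters below are invented for illustration and are not from the paper:

```python
def fear_output(stimulus, baseline=1.0, w_inhibition=1.0):
    """Toy disinhibition ("seesaw") model with invented parameters.

    PKCd+ cells tonically inhibit amygdala output; a fear stimulus
    suppresses the PKCd+ cells, lifting that brake on the output.
    """
    # The stimulus pushes this end of the seesaw down (silences PKCd+ cells).
    pkcd_plus = max(0.0, baseline - stimulus)
    # Output flows only to the extent the inhibitory brake has been released.
    return max(0.0, stimulus - w_inhibition * pkcd_plus)

# No stimulus: the brake is fully on, so no fear flows.
print(fear_output(0.0))  # 0.0
# Strong stimulus: PKCd+ cells silenced, output flows freely.
print(fear_output(1.0))  # 1.0
```

In this sketch the "garden hose" stays crimped until the stimulus both excites the output neurons and suppresses their PKCδ+ inhibitors, mirroring the gating behavior described above.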

Still, a key question remained: What happens to the seesaw during exposure to a fear-eliciting signal? Anderson and his colleagues hypothesized that the fear signal would push down on the opposite end of the seesaw from the one formed by the PKCδ+ neurons, removing the crimp from the garden hose and allowing the fear signal to flow. But how to test this idea?

Enter neurophysiologist Andreas Lüthi and his student Stephane Ciocchi, from the Friedrich Miescher Institute in Basel, Switzerland. In work done independently from that of the Anderson lab, Lüthi and Ciocchi had managed to record electrical signals from the amygdala during exposure to fear-inducing stimuli. Interestingly, they had found two types of neurons that responded in opposite ways to the fear-inducing stimulus: one type increased its activity, while the other type decreased its activity. Like Anderson, they had begun to think that these neurons formed a seesaw that controls fear output from the amygdala.

And so the two teams joined forces to determine whether the cells Lüthi had been studying corresponded to the PKCδ+ and PKCδ− cells Anderson's lab had isolated. In what Anderson refers to as a "sophisticated experiment," the two teams performed electrophysiological recordings while simultaneously turning the PKCδ+ neurons on or off using a genetic method developed by Henry Lester, Caltech's Bren Professor of Biology.

The results of the experiment were "gratifyingly clear," says Anderson. The cells that decreased their activity in the face of fear-inducing stimuli clearly corresponded to the PKCδ+ neurons Anderson's lab had isolated, while those that increased their activity corresponded to the PKCδ− neurons.

"These results supported the hypothesis that PKCδ+ neurons were indeed at the opposite end of the seesaw from the one that the fear signal 'presses down' on, consistent with the finding that PKCδ+ neurons crimp the 'fear hose,'" says Anderson.

The marriage of molecular biology and electrophysiology created by the collaboration between Anderson's and Lüthi's laboratories has revealed properties of the fear circuit that could not have been discovered in any other way, Anderson says. "The functional geography of the brain is organized like that of the world," he notes. "It's divided into continents, countries, states, towns and cities, neighborhoods and houses; the houses are analogous to the different types of neurons. Previously, it had only been possible to dissect the amygdala at the level of different towns, or of neighborhoods at best. Now, using these new genetic techniques, we are finally down to the level of the houses."

And that, he adds, is what will make it possible for us to fully understand the networks of communication that exist between neurons within a subdivision of the brain, as well as between subdivisions and different areas. "While these studies shed light on only a small part of the picture, they are an important step in that direction," Anderson says.

California Institute of Technology (Caltech)



University of California, Berkeley, scientists have found a way to overcome one of the main limitations of ultrasound imaging – the poor resolution of the picture.

Everyone who has had an ultrasound, including most pregnant women, is familiar with the impressionistic nature of the images. One of the limits to the detail obtainable with sonography is the frequency of the sound. The basic laws of physics dictate that the smallest objects you can "see" are about the size of the wavelength of the sound waves. For ultrasound of deep tissues in the body, for example, the sound waves typically have frequencies of 1-5 megahertz – far higher than what humans can hear – which imposes a resolution limit of about a millimeter.
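That millimeter figure follows directly from wavelength = speed / frequency; a quick sketch, assuming a typical soft-tissue sound speed of about 1540 m/s (a common textbook value, not a number from this study):

```python
# Diffraction-limited resolution is roughly one wavelength: lambda = c / f.
C_TISSUE = 1540.0  # m/s, assumed typical speed of sound in soft tissue

for f_hz in (1e6, 5e6):  # the 1-5 MHz range mentioned above
    wavelength_mm = C_TISSUE / f_hz * 1e3
    print(f"{f_hz / 1e6:.0f} MHz -> {wavelength_mm:.2f} mm")
# 1 MHz -> 1.54 mm, 5 MHz -> 0.31 mm: about a millimeter, as stated.
```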

In a paper appearing online this week in the journal Nature Physics, physicists at UC Berkeley and Universidad Autonoma de Madrid in Spain demonstrate how to capture the evanescent waves bouncing off an object to reconstruct detail as small as one-fiftieth of the wavelength of the sound waves. Evanescent sound waves are vibrations near the object that damp out within a very short distance, as opposed to propagating waves, which can travel over a long distance.

"With our device, we can pick up and transmit the evanescent waves, which contain a substantial fraction of the ultra-subwavelength information from the object, so that we can realize super-resolution acoustic imaging," said first author Jie Zhu, a post-doctoral fellow in the Center for Scalable and Integrated NanoManufacturing (SINAM), a National Science Foundation-funded Nano-scale Science and Engineering Center at UC Berkeley.

The researchers refer to their device for capturing evanescent waves as a three-dimensional, holey-structured metamaterial. It consists of 1,600 hollow copper tubes bundled into a 16 centimeter (6 inch) bar with a square cross-section of 6.3 cm (2.5 inches). Placed close to an object, the structure captures the evanescent waves and pipes them through to the opposite end.

In a practical device, Zhu said, the metamaterial could be mounted on the end of an ultrasound probe to vastly improve the image resolution. The device would also improve underwater sonography, or sonar, as well as non-destructive evaluation in industry applications.

"For ultrasound detection, the image resolution is generally in the millimeter range," said co-author Xiaobo Yin. "With this device, resolution is only limited by the size of the holes."

In the researchers' experiments, the holes in the copper tubes were about a millimeter in diameter. Using acoustic waves of about 2 kHz, the resolution of an image would normally be limited to the wavelength, or 200 millimeters. With their holey-structured metamaterial, they can resolve the feature size as small as 4 mm, or one-fiftieth of a wavelength.
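Those figures are easy to check. In the sketch below, the speed of sound in air (~343 m/s at room temperature) is my assumption; the experiment's exact conditions may differ, which is why the computed wavelength comes out near 170 mm rather than the ~200 mm quoted:

```python
# Checking the numbers quoted above against lambda = c / f.
C_AIR = 343.0        # m/s, assumed room-temperature speed of sound in air
f_hz = 2e3           # ~2 kHz acoustic waves
wavelength_mm = C_AIR / f_hz * 1e3
ratio = 200.0 / 4.0  # quoted wavelength over resolved feature size

print(wavelength_mm)  # ~171.5 mm, same order as the quoted ~200 mm
print(ratio)          # 50.0: the 4 mm feature is one-fiftieth of a wavelength
```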

"Without the metamaterial, it would be impossible to detect such a deep sub-wavelength object at all," Yin said.

The work was performed in the laboratory of Xiang Zhang, the Ernest S. Kuh Endowed Chaired Professor in the Department of Mechanical Engineering at UC Berkeley and the director of SINAM. The experiments were based on theoretical predictions of the group led by Professor Francisco J. García-Vidal of the Universidad Autonoma de Madrid. Other co-authors of the paper are J. Christensen of the Universidad Autonoma de Madrid, L. Martin-Moreno of CSIC-Universidad de Zaragoza in Spain, J. Jung from the Aalborg University in Denmark, and L. Fok of SINAM.

The work was funded by the U.S. Office of Naval Research and the Spanish Ministry of Science.

(Photo: Jie Zhu/UC Berkeley, Johan Christensen/Universidad Autonoma de Madrid)

University of California, Berkeley



Scientists have developed a recipe for manipulating the speed of light as it passes over an object, making it theoretically possible to ‘cloak’ the object’s movement so that an observer doesn’t notice, according to a paper in the Journal of Optics.

The study, by researchers from Imperial College London, involves a new class of materials called metamaterials, which can be artificially engineered to distort light or sound waves. With conventional materials, light typically travels along a straight line, but with metamaterials, scientists can exploit a wealth of additional flexibility to create undetectable blind spots. By deflecting certain parts of the electromagnetic spectrum, an image can be altered or made to look like it has disappeared.

Previously, a team led by Professor Sir John Pendry at Imperial College London showed that metamaterials could be used to make an optical invisibility cloak. Now, a team led by Professor Martin McCall has mathematically extended the idea of a cloak that conceals objects to one that conceals events.

“Light normally slows down as it enters a material, but it is theoretically possible to manipulate the light rays so that some parts speed up and others slow down,” says McCall, from the Department of Physics at Imperial College London. When light is ‘opened up’ in this way, rather than being curved in space, the leading half of the light speeds up and arrives before an event, whilst the trailing half is made to lag behind and arrives too late. The result is that for a brief period the event is not illuminated, and escapes detection. Once the concealed passage has been used, the cloak can then be ‘closed’ seamlessly.

Such a space-time cloak would open up a temporary corridor through which energy, information and matter could be manipulated or transported undetected. “If you had someone moving along the corridor, it would appear to a distant observer as if they had relocated instantaneously, creating the illusion of a Star-Trek transporter,” says McCall. “So, theoretically, this person might be able to do something and you wouldn’t notice!”

While using the spacetime cloak to make people move undetected is still science fiction, there are many serious applications for the new research, which was funded by the Engineering and Physical Sciences Research Council (EPSRC) and the Leverhulme Trust. Co-author Dr Paul Kinsler developed a proof of concept design using customised optical fibres, which would enable researchers to use the event cloak in signal processing and computing. A given data channel could for example be interrupted to perform a priority calculation on a parallel channel during the cloak operation. Afterwards, it would appear to external parts of the circuit as though the original channel had processed information continuously, so as to achieve ‘interrupt-without-interrupt’.

Alberto Favaro, who also worked on the project, explains: “Imagine computer data moving down a channel to be like a highway full of cars. You want to have a pedestrian crossing without interrupting the traffic, so you slow down the cars that haven’t reached the crossing, while the cars that are at or beyond the crossing get sped up, which creates a gap in the middle for the pedestrian to cross. Meanwhile an observer down the road would only see a steady stream of traffic.” One issue that cropped up during their calculations was how to speed up the transmitted data without violating the laws of relativity. Favaro solved this by devising a clever material whose properties varied in both space and time, allowing the cloak to be formed.
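The highway analogy is simple to simulate; a minimal sketch with invented speeds, in which "cars" short of the crossing slow down while those at or past it speed up, opening a gap a distant observer never sees as a stoppage:

```python
def advance(positions, crossing, cloak_open, slow=0.8, fast=1.2, normal=1.0):
    """Move each 'car' (data packet) one step along the channel.

    While the cloak is open, traffic approaching the crossing is slowed
    and traffic at or beyond it is sped up, opening a gap at the crossing.
    """
    out = []
    for x in positions:
        if cloak_open:
            out.append(x + (slow if x < crossing else fast))
        else:
            out.append(x + normal)
    return out

cars = [0.0, 1.0, 2.0, 3.0]  # evenly spaced packets, spacing 1.0
cars = advance(cars, crossing=2.0, cloak_open=True)
print(cars)  # [0.8, 1.8, 3.2, 4.2]: a gap has opened at the crossing
```

Running the same step with `cloak_open=False` after the "pedestrian" has crossed restores uniform motion, so the downstream observer sees only a steady stream.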

“We’re sure that there are many other possibilities opened up by our introduction of the concept of the spacetime cloak,” says McCall, “but as it’s still theoretical at this stage we still need to work out the concrete details for our proposed applications.”

Metamaterials is an expanding field of science, with a vast array of potential uses, spanning defence, security, medicine, data transfer and computing. Many ordinary household devices that work using electromagnetic fields could be made more cheaply or to work at higher speeds. Metamaterials could also be used to control other types of waves as well as light, such as sound or water waves, opening up potential applications for protecting coastal or offshore installations, or even engineering buildings to withstand earthquake waves.

(Photo: ICL)

Imperial College London


People spend 46.9 percent of their waking hours thinking about something other than what they're doing, and this mind-wandering typically makes them unhappy. So says a study that used an iPhone web app to gather 250,000 data points on subjects' thoughts, feelings, and actions as they went about their lives.

The research, by psychologists Matthew A. Killingsworth and Daniel T. Gilbert of Harvard University, is described in the journal Science.

"A human mind is a wandering mind, and a wandering mind is an unhappy mind," Killingsworth and Gilbert write. "The ability to think about what is not happening is a cognitive achievement that comes at an emotional cost."

Unlike other animals, humans spend a lot of time thinking about what isn't going on around them: contemplating events that happened in the past, might happen in the future, or may never happen at all. Indeed, mind-wandering appears to be the human brain's default mode of operation.

To track this behavior, Killingsworth developed an iPhone web app that contacted 2,250 volunteers at random intervals to ask how happy they were, what they were currently doing, and whether they were thinking about their current activity or about something else that was pleasant, neutral, or unpleasant.
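The random-interval sampling scheme itself is easy to mimic; in the sketch below the waking-hours window, ping count, and minimum spacing are invented parameters, not the study's actual settings:

```python
import random

def schedule_pings(n_pings=3, wake_start=9 * 60, wake_end=22 * 60, min_gap=30):
    """Pick random ping times (minutes after midnight) within waking hours,
    rejecting draws whose pings fall too close together."""
    while True:
        times = sorted(random.sample(range(wake_start, wake_end), n_pings))
        if all(b - a >= min_gap for a, b in zip(times, times[1:])):
            return times

print(schedule_pings())  # e.g. three sorted times such as [612, 845, 1190]
```

Each ping would then prompt the subject for their happiness rating, current activity, and whether their mind was on that activity or elsewhere.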

Subjects could choose from 22 general activities, such as walking, eating, shopping, and watching television. On average, respondents reported that their minds were wandering 46.9 percent of the time, and no less than 30 percent of the time during every activity except making love.

"Mind-wandering appears ubiquitous across all activities," says Killingsworth, a doctoral student in psychology at Harvard. "This study shows that our mental lives are pervaded, to a remarkable degree, by the non-present."

Killingsworth and Gilbert, a professor of psychology at Harvard, found that people were happiest when making love, exercising, or engaging in conversation. They were least happy when resting, working, or using a home computer.

"Mind-wandering is an excellent predictor of people's happiness," Killingsworth says. "In fact, how often our minds leave the present and where they tend to go is a better predictor of our happiness than the activities in which we are engaged."

The researchers estimated that only 4.6 percent of a person's happiness in a given moment was attributable to the specific activity he or she was doing, whereas a person's mind-wandering status accounted for about 10.8 percent of his or her happiness.

Time-lag analyses conducted by the researchers suggested that their subjects' mind-wandering was generally the cause, not the consequence, of their unhappiness.

"Many philosophical and religious traditions teach that happiness is to be found by living in the moment, and practitioners are trained to resist mind wandering and to 'be here now,'" Killingsworth and Gilbert note in Science. "These traditions suggest that a wandering mind is an unhappy mind."

This new research, the authors say, suggests that these traditions are right.

Killingsworth and Gilbert's 2,250 subjects in this study ranged in age from 18 to 88, representing a wide range of socioeconomic backgrounds and occupations. Seventy-four percent of study participants were American.

More than 5,000 people are now using the iPhone web app the researchers have developed to study happiness, which can be found at

Harvard University


How you think about your goals—whether it’s to improve yourself or to do better than others—can affect whether you reach those goals. Different kinds of goals can also have distinct effects on your relationships with people around you, according to the authors of a paper published in Current Directions in Psychological Science, a journal of the Association for Psychological Science.

People with “mastery goals” want to improve themselves. Maybe they want to get better grades, make more sales, or land that triple toe loop. On the other hand, people with what psychologists call “performance goals” are trying to outperform others—to get a better grade than a friend or be Employee of the Year. Both kinds of goals can be useful in different contexts. But P. Marijn Poortvliet, of Tilburg University in the Netherlands, and Céline Darnon, of France’s Clermont University, are interested in the social context of these goals—what they do to your relationships.

Poortvliet’s work focuses on information exchange—whether people are open and honest when they are working together. “People with performance goals are more deceitful” and less likely to share information with coworkers, both in the laboratory and in real-world offices he has studied, Poortvliet says. “The reason is fairly obvious—when you want to outperform others, it doesn’t make sense to be honest about information.”

On the other hand, people who are trying to improve themselves are quite open, he says. “If the ultimate goal is to improve yourself, one way to do it is to be very cooperative with other people.” This can help improve the work environment, even though the people with these goals aren’t necessarily thinking about social relations. “They’re not really altruists, per se. They see the social exchange as a means toward the ends of self improvement.” Other research has found that people with these self-improvement goals are more open to hearing different perspectives, while people with a performance goal “would rather just say, ‘I’m just right and you are wrong.’”

It’s not always bad to be competitive, Poortvliet says. “For example, if you want to be the Olympic champion, of course it’s nice to have mastery goals and you should probably have mastery goals, but you definitely need performance goals because you want to be the winner and not the runner-up.”

But it’s important to think about how goals affect the social environment. “If you really want to establish constructive and long-lasting working relationships, then you should really balance the different levels of goals,” Poortvliet says—thinking not only about each person’s achievement, but also about the team as a whole.

Some people are naturally more competitive than others. But it’s also possible for managers to shift the kinds of goals people have by, for example, giving a bonus for the best employee. That might encourage people to set performance goals and compete against each other. On the other hand, it would also be possible to structure a bonus program to give people rewards based on their individual improvement over time.

Association for Psychological Science



Catastrophic drought is on the near-term horizon for the capital city of Bolivia, according to new research into the historical ecology of the Andes.

If temperatures rise more than 1.5 to 2 degrees Celsius (2.7 to 3.6 degrees Fahrenheit) above those of modern times, parts of Peru and Bolivia will become a desert-like setting.

The change would be disastrous for the water supply and agricultural capacity of the two million inhabitants of La Paz, Bolivia's capital city, scientists say.

The results, derived from research funded by the National Science Foundation (NSF) and conducted by scientists affiliated with the Florida Institute of Technology (FIT), appear in the November issue of the journal Global Change Biology.

Climatologist Mark Bush of FIT led a research team investigating a 370,000-year record of climate and vegetation change in Andean ecosystems.

The scientists used fossilized pollen trapped in the sediments of Lake Titicaca, which sits on the border of Peru and Bolivia.

They found that during two of the last three interglacial periods, which occurred 130,000-115,000 years ago and 330,000-320,000 years ago, Lake Titicaca shrank by as much as 85 percent.

Adjacent shrubby grasslands were replaced by desert.

In each case, a steady warming occurred that caused trees to migrate upslope, just as they are doing today.

However, as the climate kept warming, the system suddenly flipped from woodland to desert.

"The evidence is clear that there was a sudden change to a much drier state," said Bush.

Scientist Sherilyn Fritz at the University of Nebraska-Lincoln showed that during these warm episodes the algae living in Lake Titicaca shifted from freshwater species to ones tolerant of salty water. Paul Baker of Duke University identified peaks of carbonate deposition.

Both point to a sudden shallowing of the lake due to evaporative loss.

An environmental reconstruction demonstrates that with moderate warming, forests moved upslope. But as that warming continued, a climate tipping point was reached.

The system was thrown into a new, drier state that halted forest expansion.

The tipping point is caused by increased evaporative loss from Lake Titicaca.

As the lake contracts, the local climate effects attributable to a large lake--doubling of rainfall, among the most important--would be lost, says Bush.

Such tipping points have been postulated by other studies, but this work allowed the researchers to state when the system will change.

Based on the growth limits of Andean forests, they defined a tipping point that is exceeded with 1.5 to 2 degrees Celsius of warming above modern conditions.

Given a rate of warming in the Peruvian Andes of about 0.3-0.5 degrees Celsius per decade, the tipping point ahead would be reached between 2040 and 2050.
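The arithmetic behind that window can be laid out explicitly; taking 2010, the date of this report, as the baseline year is my assumption:

```python
# Years until the tipping point = 10 * (warming threshold) / (rate per decade).
START_YEAR = 2010  # assumed baseline: the date of this report

for threshold_c in (1.5, 2.0):
    for rate_c_per_decade in (0.3, 0.5):
        year = START_YEAR + 10 * threshold_c / rate_c_per_decade
        print(f"{threshold_c} C at {rate_c_per_decade} C/decade -> {year:.0f}")
# The quoted 2040-2050 window corresponds to the faster (0.5 C/decade) rate;
# the slower 0.3 C/decade rate pushes the crossing out toward 2060-2077.
```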

"The implications would be profound for some two million people," says Paul Filmer, program director in NSF's Division of Earth Sciences. Severe drought, and a loss of stored water in lakes in the region, would reduce yields from important agricultural regions and threaten drinking water supplies.

The research suggests that limiting wildfires would help delay the worst effects of the drought.

(Photo: Mark Bush/FIT)

National Science Foundation



Chemical reactions are happening all over the place all the time--on the sun, on the Earth and in our bodies. In many cases, enzymes help make these reactions occur. One family of enzymes, called cytochrome P450s (P450), is important because they help us eliminate toxins.

We know P450s are important to life of all kinds because they have been found in animals, plants, fungi and bacteria, but they are of special interest to humans because they are responsible for metabolism of about 75 percent of known pharmaceuticals.

"The reactions that P450s perform to detoxify a compound are interesting because they activate chemical bonds that are usually not reactive. Chemically speaking, this is a very difficult thing to do in a controlled way," said Michael Green of the department of chemistry at Penn State University. Green and a former student are authors of a paper describing a breakthrough in isolation of P450 compound I, an important chemical intermediate in the process of drug metabolism. This research, supported by the National Science Foundation, appears in the November 12 issue of the journal Science.

In terms of P450s, we humans aren't all the same. One person can be more susceptible than another to poisoning by a toxin or drug, based on the levels of the different P450s in their liver, lungs and other organs. With such obvious medical and biological importance, these enzymes have been studied in great detail for many years, but exactly how they are able to catalyze these chemical reactions remains to be determined.

At the heart of the problem is P450 compound I. It is a highly reactive chemical species produced by the P450 enzyme to help metabolize a toxin or drug. Because of this extreme reactivity, compound I turns into something else before scientists have a chance to capture it. This has remained a problem for more than 40 years, prompting some to question its very existence.

"This work confirms the existence of compound I, and demonstrates that it can perform the type of chemical reactions for which P450s are known," stated Green. "Now that we can make this chemical species, we can begin to do larger scale studies to understand just how it performs this chemistry."

The research has impact on two major fronts: medicine and basic chemistry. A better understanding of both the biology and chemistry of this family of enzymes will drive research in both fields. Eventually, Green hopes these insights will help chemists understand how to better control the specificity of a given chemical reaction. This knowledge could make production of pharmaceuticals and a variety of commodity chemicals cheaper and more efficient.

(Photo: © 2010 JupiterImages Corporation)

National Science Foundation



The nine-month pregnancy in humans is influenced by the structure of the placenta, according to new research into the evolution of reproduction in mammals that solves a 100-year mystery.

The study, by Durham and Reading universities was funded by BBSRC, the Natural Environment Research Council (NERC) and the Leverhulme Trust. It shows that babies grow twice as fast in the wombs of some mammals compared to others - a difference that has arisen through evolution of species. The difference in growth rates appears to be due to the structure of the placenta and the way it connects mother and baby.

The research has found that the more intimate the connection is between the tissues of the mother and the foetus, the faster the growth of the baby and the shorter the pregnancy. The findings help to explain why humans, whose placentas do not form the complex web-like structure seen in animals such as dogs and leopards, have relatively lengthy pregnancies.

The structure of the placenta is surprisingly different amongst mammal species although it serves the same basic function in all of them. The scientists say that, despite speculation, the reasons for this variation have been a mystery for more than 100 years, until now.

The researchers, whose findings are published in the academic journal American Naturalist this week, analysed data on 109 mammal species, showing for the first time that the structure of the placenta influences pregnancy duration in mammals. The scientists say that the placenta in some mammals is highly 'folded', creating a larger surface area and increasing the rate at which nutrients are passed from mother to baby.

This sort of folding is a common way in which evolution has solved the problem of increasing surface area in animal bodies. It is seen in many tissues where a large surface area needs to be packed into a small space, including the lungs, intestines and cortex of the brain.

Females of all mammal species develop placentas when they conceive, including bats, whales, and elephants. The placenta connects the developing foetus to the lining of the womb to allow nutrient uptake, waste elimination, and gas exchange via the mother's blood supply.

The research team studied the length of pregnancy, structure of placenta, and size of offspring in mammals, and examined how these characteristics have changed during the evolution of mammals. They found that, despite the placenta essentially having the same function in all mammals, there were some striking structural differences.

Previously, the extent to which there is direct contact between the mother's blood and the placenta was thought to reflect an evolutionary arms-race between mother and baby with both battling for 'control' over how much food is given and received. In this conflict, it is believed that the mother needs to reserve some of her resources for her other offspring but the foetus 'demands' more to fuel its growth.

Lead author Dr Isabella Capellini says: "This study shows that it is not necessarily the contact with maternal blood which determines speed of growth, but the extent to which the tissues of mother and baby are 'interlocked', or folded, with one another.

"In humans, the placenta has simple finger-like branches with a relatively limited connection between the mother's tissues and those of the foetus, whereas in leopards, for example, it forms a complex web of interconnections that create a larger surface area for the exchange of nutrients."

Co-author Professor Robert Barton from Durham University explained: "Parent-offspring conflict is universal. From the moment of conception, the physiologies of mother and baby are adapted to achieve slightly different goals in terms of how fast the baby grows.

"Because we found no differences in the size of the babies when they are born, it seems that the outcome of this conflict is a kind of equilibrium in which faster growth is offset by a shorter pregnancy."

Understanding how differences between species evolve and what combination of pressures and conflicts gives rise to certain physiological features can help us to appreciate issues as diverse as economics, farming and biodiversity.

(Photo: PNAS)





Selected Science News. Copyright 2008. All Rights Reserved.