Wednesday, May 5, 2010

NEW GENE FOR HAIR LOSS IDENTIFIED


A newly identified gene connected to hair growth may inform future treatments for male pattern baldness, says a team of researchers from Rockefeller, Columbia and Stanford Universities. The scientists found that the gene, called APCDD1, causes a type of progressive hair loss known as hereditary hypotrichosis simplex, which miniaturizes hair follicles, replacing thick hair with thinner, finer strands akin to peach fuzz. The findings were published in Nature.

The team made their discovery by analyzing genetic data from Pakistani and Italian families with hereditary hypotrichosis simplex. They found a common mutation in the APCDD1 gene, which lies in a region of chromosome 18 that previous studies have implicated in other forms of hair loss, including androgenetic alopecia and alopecia areata, hinting at a broader role in hair follicle biology.

The researchers, including Ali H. Brivanlou, Robert and Harriet Heilbrunn Professor and head of Rockefeller’s Laboratory of Molecular Vertebrate Embryology, and postdoc Alin Vonica, found that APCDD1 inhibits a signaling pathway that has been shown to control hair growth in mice. But, until now, the pathway, known as the Wnt signaling pathway, did not appear to be involved in human hair loss. The discovery that manipulating the Wnt pathway may have an effect on human hair follicle growth provides evidence that hair growth patterns in humans and in mice are more similar than previously believed.

“The identification of this gene underlying hereditary hypotrichosis simplex has afforded us an opportunity to gain insight into the process of hair follicle miniaturization, which is most commonly observed in male pattern hair loss or androgenetic alopecia,” says Angela M. Christiano, professor of dermatology and genetics and development at Columbia University Medical Center, and lead author of the study. “It is important to note that while these two conditions share the same physiologic process, the gene we discovered for hereditary hypotrichosis does not explain the complex process of male pattern baldness.”

“After Angela landed on a molecular component that they identified as causal to a state of disease, Alin and I determined that the appropriate biochemical test could be easily accomplished in as simple a biological system as the frog,” says Brivanlou. “It was the molecular dissection of APCDD1 function in early Xenopus embryogenesis that hinted at the first molecular function within the context of the Wnt pathway. The collaborative team then confirmed it.”

The discovery that manipulating the Wnt pathway may have an effect on human hair follicle growth suggests a new approach that could be more broadly applicable than therapies on the market today. “Unlike commonly available treatments for hair loss that involve blocking hormonal pathways, treatments involving the Wnt pathway would be non-hormonal, which may enable many more people suffering from hair loss to receive such therapies,” Christiano says.

The researchers are now working to understand the complex genetic causes of other forms of hair loss including alopecia areata, with the hope of eventually developing new, effective treatments for these conditions. “This work represents what molecular genetics and developmental biology in combination can contribute to the resolution of a molecular mechanism of this dramatic developmental disease,” Brivanlou says.

(Photo: Rockefeller U.)

Rockefeller University

"MISSING" HEAT MAY AFFECT FUTURE CLIMATE CHANGE


Current observational tools cannot account for roughly half of the heat that is believed to have built up on Earth in recent years, according to a "Perspectives" article in the journal Science.

Scientists at the National Center for Atmospheric Research (NCAR) in Boulder, Colo., warn that satellite sensors, ocean floats, and other instruments are inadequate to track this "missing" heat, which may be building up in the deep oceans or elsewhere in the climate system.

"The heat will come back to haunt us sooner or later," says NCAR scientist Kevin Trenberth, the article's lead author.

"The reprieve we've had from warming temperatures in the last few years will not continue. It is critical to track the build-up of energy in our climate system so we can understand what is happening and predict our future climate."

The authors suggest that last year's rapid onset of El Niño, the periodic event in which upper ocean waters across much of the tropical Pacific Ocean become significantly warmer, may be one way in which some of the missing heat has reappeared.

The research was supported by the National Science Foundation (NSF), NCAR's sponsor, and by NASA.

"The flow of energy through the climate system is a key issue in understanding climate change," says Eric DeWeaver, program director in NSF's Division of Atmospheric and Geospace Sciences, which funds NCAR. "It poses a major challenge to our observing systems."

Trenberth and his co-author, NCAR scientist John Fasullo, focused on a central mystery of climate change.

Whereas satellite instruments indicate that greenhouse gases are continuing to trap more solar energy, or heat, scientists since 2003 have been unable to determine where much of that heat is going.

Either the satellite observations are incorrect, says Trenberth, or, more likely, large amounts of heat are penetrating to regions that are not adequately measured, such as the deepest parts of the oceans.

Compounding the problem, Earth's surface temperatures have largely leveled off in recent years. Yet melting glaciers and Arctic sea ice, along with rising sea levels, indicate that heat is continuing to have profound effects on the planet.

Trenberth and Fasullo explain that it is imperative to better measure the flow of energy through Earth's climate system.

For example, any geoengineering plan to artificially alter the world's climate to counter global warming could have inadvertent consequences, which may be difficult to analyze unless scientists can track heat around the globe.

Improved analysis of energy in the atmosphere and oceans can also help researchers better understand and possibly even anticipate unusual weather patterns, such as the cold outbreaks across much of the United States, Europe, and Asia over the past winter.

As greenhouse gases accumulate in the atmosphere, satellite instruments show a growing imbalance between energy entering the atmosphere from the sun and energy leaving from Earth's surface. This imbalance is the source of long-term global warming.

But tracking the growing amount of heat on Earth is far more complicated than measuring temperatures at the planet's surface.

The oceans absorb about 90 percent of the solar energy that is trapped by greenhouse gases. Additional amounts of heat go toward melting glaciers and sea ice, as well as warming the land and parts of the atmosphere.

Only a tiny fraction warms the air at the planet's surface.

Satellite measurements indicate that the amount of greenhouse-trapped solar energy has risen over recent years while the increase in heat measured in the top 3,000 feet of the ocean has stalled.

Although it is difficult to quantify the amount of solar energy with precision, Trenberth and Fasullo estimate, based on satellite data, that the energy build-up is about 1.0 watts per square meter or higher, while ocean instruments indicate a build-up of about 0.5 watts per square meter.

That means about half the total amount of heat is unaccounted for.
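
As a quick sanity check of the arithmetic behind that statement, the sketch below works through the rounded figures quoted above; the 1.0 and 0.5 watts per square meter are approximate estimates from the article, not precise measurements.

```python
# Rough energy-budget check using the article's rounded, approximate figures.
satellite_imbalance = 1.0  # W/m^2: build-up inferred from satellite data (approximate)
ocean_measured = 0.5       # W/m^2: build-up measured by ocean instruments (approximate)

missing = satellite_imbalance - ocean_measured
fraction_missing = missing / satellite_imbalance

print(f"Unaccounted heat: {missing:.1f} W/m^2 "
      f"(~{fraction_missing:.0%} of the satellite-estimated build-up)")
```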

A percentage of the missing heat could be illusory, the result of imprecise measurements by satellites and surface sensors or incorrect processing of data from those sensors, the authors say.

Until 2003, the measured heat increase was consistent with computer model expectations. But a new set of ocean monitors since then has shown a steady decrease in the rate of oceanic heating, even as the satellite-measured imbalance between incoming and outgoing energy continues to grow.

Some of the missing heat appears to be going into the observed melting of ice sheets in Greenland and Antarctica, as well as Arctic sea ice.

Much of the missing heat may be in the ocean. Some heat increase can be detected between depths of 3,000 and 6,500 feet (about 1,000 to 2,000 meters), but more heat may be deeper still beyond the reach of ocean sensors.

Trenberth and Fasullo call for additional ocean sensors, along with more systematic data analysis and new approaches to calibrating satellite instruments, to help resolve the mystery.

The Argo profiling floats that researchers began deploying in 2000 to measure ocean temperatures, for example, are separated by about 185 miles (300 kilometers) and take readings only about once every 10 days from a depth of about 6,500 feet (2,000 meters) up to the surface.

Plans are underway to have a subset of these floats go to greater depths.

"Global warming at its heart is driven by an imbalance of energy: more solar energy is entering the atmosphere than leaving it," Fasullo says.

"Our concern is that we aren't able to entirely monitor or understand the imbalance. This reveals a glaring hole in our ability to observe the build-up of heat in our climate system."

(Photo: NASA)

National Science Foundation

NEUTRINOS: CLUES TO THE MOST ENERGETIC COSMIC RAYS


We’re constantly being peppered by showers of debris from cosmic rays colliding with atoms in the atmosphere. Cosmic rays aren’t actually rays, of course; they’re particles: ninety percent are protons, the nuclei of hydrogen atoms, and most of the rest are heavier nuclei like iron. Some originate from our own sun, but most come from farther off, from the Milky Way or beyond.

“The most energetic cosmic rays are the rarest, and they pose the biggest mystery,” says Spencer Klein of Berkeley Lab’s Nuclear Science Division. He compares the energy of an ultra-high-energy (UHE) cosmic ray to a well-hit tennis ball or a boxer’s punch – all packed into a single atomic nucleus.

“If they’re protons, they have about 40 million times the energy of the protons accelerated at the Large Hadron Collider,” Klein says. “With present technology we’d need to build an accelerator around the sun to produce protons that energetic. Not only do we not know how these cosmic accelerators work, we don’t even know where they are.”
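
A back-of-the-envelope conversion makes the comparison concrete. The sketch below assumes the LHC design energy of 7 TeV per proton and a roughly 57-gram tennis ball hit at about 40 meters per second; those specific values are illustrative assumptions, since the article gives only the 40-million-fold ratio and the tennis-ball analogy.

```python
# Back-of-the-envelope energy comparison (illustrative assumptions, see text above).
EV_TO_JOULE = 1.602e-19

lhc_proton_ev = 7e12                 # assumed LHC design energy: 7 TeV per proton
uhe_proton_ev = 4e7 * lhc_proton_ev  # "about 40 million times" the LHC energy
uhe_proton_joule = uhe_proton_ev * EV_TO_JOULE

# A well-hit tennis ball: ~57 g at ~40 m/s (assumed values for the analogy).
tennis_ball_joule = 0.5 * 0.057 * 40**2

print(f"UHE cosmic-ray proton: ~{uhe_proton_ev:.1e} eV, or ~{uhe_proton_joule:.0f} joules")
print(f"Well-hit tennis ball:  ~{tennis_ball_joule:.0f} joules")
```

Under those assumptions both come out to roughly 45 joules, consistent with Klein's comparison.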

Being electrically charged, even the most energetic cosmic rays are forced to bend when they traverse interstellar magnetic fields, so it’s not possible to extrapolate where they came from by looking back along their paths when they arrive on Earth.

Yet they can’t come from too far away. Klein explains that because cosmic rays lose energy by plowing into the photons of the cosmic microwave background as they travel, “the ones that we observe must come from the ‘local’ universe, within about 225 million light years of Earth. This sounds like a long distance, but, on cosmic scales, it isn’t very far.”

In all that volume of “nearby” space, sources capable of producing such high-energy nuclei have not been clearly identified. One clue to the origin of the highest-energy cosmic rays is the neutrinos they produce when they interact with the very cosmic microwave photons that slow them down.

“Neutrinos have important advantages as observational tools,” says Klein. “The only way they interact is through the weak interaction, so they aren’t deflected by magnetic fields in flight, and they easily slip through dense matter like stars that would stop the cosmic rays themselves.”

The flip side is that it’s quite a trick to catch neutrinos, especially those produced by rare events. Locating neutrinos produced by UHE cosmic rays needs a detector covering a huge area.

Which is how Klein came to find himself tent-camping on the Ross Ice Shelf last December (the middle of summer in Antarctica), along with his colleague Thorsten Stezelberger of the Lab’s Engineering Division and camp manager Martha Story from the Berg Field Center, a support service at McMurdo Station, the main U.S. base in Antarctica. Klein and Stezelberger were setting up a prototype station for the proposed ARIANNA array of neutrino detectors (ARIANNA stands for the Antarctic Ross Ice Shelf Antenna Neutrino Array).

Unlike such neutrino detectors as SNO in Canada, Daya Bay in China, Super-Kamiokande in Japan, or IceCube, the huge neutrino telescope under construction deep in the ice at the South Pole, ARIANNA doesn’t need miles of rock or the Earth itself to filter out background events. That’s because ARIANNA will be looking for an unusual kind of neutrino signal known as the Askaryan effect.

ARIANNA will observe the shower of electrons, positrons, and other particles produced when a neutrino interacts in the ice below the ARIANNA detectors. In 1962, Gurgen Askaryan, an Armenian physicist, pointed out that these showers contain more electrons than positrons, so have a net electric charge. When a shower develops in ice, this moving charge is an electrical current which produces a powerful pulse of radio waves, emitted in a cone around the neutrino direction.

The energy shed by particles moving faster than the speed of light in a medium like glass or water (light moves through water at only three-quarters of its speed in vacuum) is called Cherenkov radiation, and is perhaps most familiar as the blue glow made by fast-moving electrons in a pool surrounding a nuclear reactor. The same visible-light-wavelength Cherenkov radiation is used to detect charged-particle events created by neutrinos in detectors like IceCube.

Instead of optical wavelengths, ARIANNA observes Cherenkov radiation at radio wavelengths; the strength of the radio signal is proportional to the square of the energy of the neutrino that gave rise to it. To capture these signals, ARIANNA will use radio antennas buried in the snow on top of the ice.
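
To illustrate that scaling, here is a minimal sketch; the function name and reference energy are hypothetical placeholders for illustration, not part of the ARIANNA design.

```python
def relative_signal_strength(e_nu, e_ref=1.0):
    """Relative radio signal strength, assuming it scales as the square of the
    neutrino energy (both energies in the same arbitrary units)."""
    return (e_nu / e_ref) ** 2

# A neutrino with 10x the reference energy gives ~100x the radio signal strength.
print(relative_signal_strength(10.0))  # 100.0
```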

The Ross Ice Shelf makes an ideal component of the ARIANNA detector – not least because the interface where the ice, hundreds of meters thick, meets the liquid water below is an excellent mirror for reflecting radio waves. Signals from neutrino events overhead can be detected by looking for radio waves that have been reflected from this mirror. For neutrinos arriving horizontally, some of the radio waves will be directly detected, and some will be detected after being reflected.

As envisaged by its principal investigator, Steven Barwick of UC Irvine – who visited the Ross Ice Shelf in 2008 – ARIANNA would eventually comprise up to 10,000 stations covering a square expanse of ice 30 kilometers on a side.

Ten thousand stations is the eventual goal, but the first step is to see whether just one station can work. During the Antarctic summer, solar panels will provide power for the radio antennas under the snow and the internet tower that sends data back to McMurdo Station, via a repeater tower on nearby Mt. Discovery. During the long, dark winter, it’s hoped that the power will come from wind turbines or a generator.

When the temperature is mostly below freezing, even summer camping is a challenge, as Klein and Stezelberger found. With all supplies brought in by helicopter, the team set up three tents for sleeping, a larger (10 foot by 20 foot) tent as a kitchen, dining room, laboratory and office, and a small tent for a toilet. Instead of tent pegs, the tents are held down by guy ropes tied to “deadman anchors.”

“For each rope, we dug a two-foot-deep hole and buried a long bamboo stake with the rope tied to it,” Stezelberger explains. “When it was taut, we refilled the hole with snow – a fair bit of work.”

On the second day the team unpacked and assembled the six-foot tall station tower, made of metal pipes anchored to plywood feet under the snow. The tower holds four solar panels, a wind turbine, and antennas for receiving time signals from global positioning satellites, and for communicating via Iridium communications satellites.

Klein, Stezelberger and Story spent the third day assembling, testing, and burying the neutrino-detecting antennas in six-foot-deep trenches in the snow. On the fourth day an internet tower – network communications were invaluable for sending data north, and for allowing people to work remotely on the station computer – was brought in by helicopter and erected by a four-person crew, who stayed for lunch. “Fortunately they brought their own,” Klein remarks. “We were wondering how we’d feed everyone with only four forks, four spoons, and four knives.”

After another week, which was mostly spent testing instruments, including bouncing radio signals off the water-ice interface, plus two days waiting for the weather to clear so that helicopters could pick them up, the team finally struck camp. After packaging their gear in slings to be picked up by subsequent flights, they climbed aboard a chopper and returned to base, leaving behind a functioning station intended to survive the oncoming winter.

Klein and Stezelberger made it back to Berkeley Lab by the last day of December. Klein, aided by UC Irvine’s Barwick and graduate student Jordan Hanson, neutrino physicist Ryan Nichol of University College London, and Lisa Gerhardt of Berkeley Lab’s Nuclear Science Division (herself recently returned from work on IceCube at the South Pole), spent the next weeks analyzing the data from the ARIANNA prototype station on the ice, as it continued to report via the internet. The stream of information included housekeeping data and scientific data in the form of antenna signals.

“Wind had generally been so calm during the week and a half we spent on the ice, we were afraid the wind generator wasn’t going to be sufficient for the station’s power needs during the winter,” Klein says. “But after we left, the wind picked up and the wind turbine started functioning, which encouraged us.”

The antenna data was also instructive, and there was a lot of it – signals from natural background noise and from man-made sources. An event every 60 seconds was the “heartbeat” pulse emitted by the station itself, which the team had set up to check the detector.

“But there were other, unexpected periodic signals, pairs separated by almost exactly six seconds, their rate varying over 24 hours,” Gerhardt says. Periodic signals strongly hint at man-made sources. “We think they’re probably from the switching of the power supplies for the internet hardware.”

Other events, aperiodic, were part of the irreducible background, including thermal noise due to molecular motion in the equipment. This noise sets a natural limit on the detector's performance, although better equipment should reduce it.

One thing the prototype station hasn’t seen is an energetic neutrino, and Klein doesn’t expect it to catch one. If the prototype survives the winter, the next step will be a group of five to seven such stations with equipment custom-designed to do the job. The full array is far in the future.

“One real event would be an accomplishment,” says Klein, “and it might take a hundred stations to achieve even that. UHE cosmic rays are extremely rare. If we can track just one back to its origin, we’ll have made a tremendous advance in neutrino astronomy.”

(Photo: LBNL)

Lawrence Berkeley National Laboratory

LONG-DISTANCE JOURNEYS ARE OUT OF FASHION


The results of genetic studies on migratory birds support the theory that, under continued global warming and within only a few generations, migratory birds subject to strong selection and microevolution will at first fly shorter distances, later stop migrating altogether, and thus become so-called "residents".

In a selection experiment with blackcaps from southwest Germany, Francisco Pulido and Peter Berthold at the Max Planck Institute for Ornithology in Radolfzell showed that the first non-migratory birds appear in a completely migratory population after only two generations of directional selection for lower migratory activity. The strong evolutionary reduction in migration distance found in this study is in line with the adaptive changes in bird migration expected in response to environmental alterations caused by climate change.

For generations, humans have been watching flocks of migrating birds flying to their winter quarters in the autumn, and awaiting their loud songs announcing their happy return in the spring. The timing of their migration is adjusted to the availability of resources, such as food and habitats, in the stopover areas as well as in the non-breeding and breeding areas. For migratory birds it is essential to be in the right place at the right time.

For some years, data collected in the wild have shown that some species of migratory birds are responding to rising temperatures and the resulting changes in the environment. The blackcap is one of the species in which changes in migratory behaviour have been most consistent. Today, blackcaps return to their breeding sites earlier, lay their eggs earlier, and leave their breeding grounds later in the autumn. One population has even established a new wintering area in the British Isles, instead of flying all the way to Spain. Because of its large genetic variation, the researchers expected rapid adaptation to altered environmental conditions in this species, which is a model for investigating the evolution of bird migration.

The scientists at the Max Planck Institute for Ornithology wanted to identify the mechanisms for adjusting to global warming: whether there were measurable changes in migratory behaviour within a period of strong temperature increase, and whether these changes, above all the reduced migration distance, reflected individual adjustment to altered environmental conditions or a change in the genetic composition of the populations.

During the period 1988–2001, years with particularly high temperatures, blackcap nestlings were taken from their nests each year (757 birds in total) and reared by hand in the lab. The seasonal changes in the light-dark cycle were simulated, and the migratory restlessness of the inexperienced young birds was measured in autumn. The duration of their restless behaviour during the night, i.e. the fluttering and hopping along the perch, corresponds approximately to the duration of the flight to their winter quarters.

The birds that were taken from their natural habitat during these 14 years showed a significant reduction in their migratory activity. In their natural habitat this would be equivalent to a shortening of flying distance. This reduction, as the researchers were able to prove, was based on a change in the genetic composition of the population, i.e. evolution.

In a second experiment, the scientists simulated in the laboratory, in "time lapse", the selection process they had observed in nature. The birds with the least migratory activity and their offspring were paired over four generations. To avoid inbreeding, the researchers paired 50% of this line with wild birds that showed particularly weak migratory restlessness. After two generations, the first "resident" birds already appeared in this population. Hence, directional selection for lower migratory activity leads to the evolution of partially migratory populations and, finally, to populations that do not leave their breeding areas at all.

The advantages for the birds are obvious: The shortening of migration distance saves energy and time. Moreover, because shorter days, as experienced in more northern wintering areas, induce an advancement of migratory activity and reproduction, birds migrating shorter distances will occupy the best breeding territories and may produce multiple broods in a year. "We assume that the reduction in migration distance is the first and most significant evolutionary mechanism that migratory birds have for adapting to changed climatic conditions," explains Francisco Pulido. "For birds that migrate short to average distances of approximately 1,000 km, and in which migratory behaviour is genetically determined, as is the case with most songbirds, this can be a successful strategy for survival. However, for long-distance migrants, for which successful migration will depend on overcoming ecological barriers such as desert or sea, this mechanism of adaptation cannot work, as a reduction of migration distance would mean spending the winter in a hostile environment, in which they cannot survive."

(Photo: Max Planck Institute for Ornithology)

Max Planck Institute

BIZARRE MATTER COULD FIND USE IN QUANTUM COMPUTERS


There are enticing new findings in the worldwide search for materials that support fault-tolerant quantum computing. New results from Rice University and Princeton University indicate that a bizarre state of matter that acts like a particle with one-quarter electron charge also has a "quantum registry" that is immune to information loss from external perturbations.

The research appeared online April 21 in Physical Review Letters. The team of physicists found that ultracold mixes of electrons caught in magnetic traps could have the necessary properties for constructing fault-tolerant quantum computers -- future computers that could be far more powerful than today's computers. The mixes of electrons are dubbed "5/2 quantum Hall liquids" in reference to the unusual quantum properties that describe their makeup.

"The big goal, the whole driving force, besides deep academic curiosity, is to build a quantum computer out of this," said the study's lead author Rui-Rui Du, professor of physics at Rice. "The key for that is whether these 5/2 liquids have 'topological' properties that would render them immune to the sorts of quantum perturbations that could cause information degradation in a quantum computer."

Du said the team's results indicate the 5/2 liquids have the desired properties. In the parlance of condensed-matter physics, they are said to represent a "non-Abelian" state of matter.

Non-Abelian is a mathematical term for a system with "noncommutative" properties. In math, commutative operations, like addition, are those that have the same outcome regardless of the order in which they are carried out. So, one plus two equals three, just as two plus one equals three. In daily life, commutative and noncommutative tasks are commonplace. For example, when doing the laundry, it doesn't matter if the detergent is added before the water or the water before the detergent, but it does matter if the clothes are washed before they're placed in the dryer.
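
A small numerical illustration of that distinction, using 2x2 matrices as a standard example of a noncommutative operation (this is a generic math example, not a model of the 5/2 liquids themselves):

```python
import numpy as np

a, b = 1, 2
print(a + b == b + a)                 # True: addition is commutative

A = np.array([[1, 2], [0, 1]])
B = np.array([[1, 0], [3, 1]])
print(np.array_equal(A @ B, B @ A))   # False: matrix multiplication is noncommutative
```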

"It will take a while to fully understand the complete implications of our results, but it is clear that we have nailed down the evidence for 'spin polarization,' which is one of the two necessary conditions that must be proved to show that the 5/2 liquids are non-Abelian," Du said. "Other research teams have been tackling the second condition, the one-quarter charge, in previous experiments."

The importance of the noncommutative quantum properties is best understood within the context of fault-tolerant quantum computers, a fundamentally new type of computer that hasn't been built yet.

Computers today are binary. Their electrical circuits, which can be open or closed, represent the ones and zeros in binary bits of information. In quantum computers, scientists hope to use "quantum bits," or qubits. Unlike binary ones and zeros, the qubits can be thought of as little arrows that represent the position of a bit of quantum matter. The arrow might represent a one if it points straight up or a zero if it points straight down, but it could also represent any number in between. In physics parlance, these arrows are called quantum "states." And for certain complex calculations, being able to represent information in many different states would present a great advantage over binary computing.
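
As a rough illustration of that "little arrow" picture, the sketch below writes a single qubit as a normalized two-component vector; the equal-superposition state chosen here is just an arbitrary example.

```python
import numpy as np

# A qubit state: alpha*|0> + beta*|1>, stored as a two-component complex vector.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)   # arbitrary example: an equal superposition
state = np.array([alpha, beta], dtype=complex)

print(np.isclose(np.linalg.norm(state), 1.0))  # True: valid states are normalized
print(np.abs(state) ** 2)                      # measurement probabilities: [0.5 0.5]
```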

The upshot of the 5/2 liquids being non-Abelian is that they have a sort of "quantum registry," where information doesn't change due to external quantum perturbations.

"In a way, they have internal memory of their previous state," Du said.

The conditions needed to create the 5/2 liquids are extreme. At Rice, Tauno Knuuttila, a former postdoctoral research scientist in Du's group, spent several years building the "demagnetization refrigerator" needed to cool 5-millimeter squares of ultrapure semiconductors to within one-10,000th of a degree of absolute zero. It took a week for Knuuttila to simply cool the nearly one-ton instrument to the necessary temperature for the Rice experiments.

The gallium arsenide semiconductors used in the tests are the purest on the planet. They were created by Loren Pfeiffer, Du's longtime collaborator at Princeton and Bell Labs. Rice graduate student Chi Zhang conducted additional tests at the National High Magnetic Field Laboratory in Tallahassee, Fla., to verify that the 5/2 liquid was spin-polarized.

Study co-authors include Zhang, Knuuttila, Pfeiffer, Princeton's Ken West and Rice's Yanhua Dai. The research is supported by the Department of Energy, the National Science Foundation and the Keck Foundation.

(Photo: Jeff Fitlow/Rice University)

Rice University

LINK DISCOVERED BETWEEN CARBON, NITROGEN MAY PROVIDE NEW WAYS TO MITIGATE POLLUTION PROBLEMS


A new study exploring the growing worldwide problem of nitrogen pollution from soils to the sea shows that the global ratios of nitrogen and carbon in the environment are inextricably linked, a finding that may lead to new strategies to help mitigate regional problems ranging from contaminated waterways to human health risks.

The University of Colorado at Boulder study found the ratio between nitrates -- a naturally occurring form of nitrogen found in soils, streams, lakes and oceans -- and organic carbon is closely governed by ongoing microbial processes that occur in virtually all ecosystems. The team combed exhaustive databases containing millions of sample points from tropical, temperate, boreal and polar sites, including well-known, nitrogen-polluted areas like Chesapeake Bay, the Baltic Sea and the Gulf of Mexico.

"We have developed a new framework to explain how and why carbon and nitrogen appear to be so tightly linked," said CU-Boulder doctoral student Philip Taylor, lead author on the new study. "The findings are helping us to explain why nitrate can become so high in some water bodies but remain low in others."

A paper by Taylor and CU-Boulder ecology and evolutionary biology Professor Alan Townsend was published in the April 22 issue of Nature. The study was funded in part by the National Science Foundation. Both Taylor and Townsend also are affiliated with CU-Boulder's Institute of Arctic and Alpine Research.

While nitrogen gas is abundant in the atmosphere, it is nonreactive and unavailable to most life, said Townsend. But in 1909 a process was developed to transform the nonreactive gas into ammonia, the active ingredient of synthetic fertilizer. Humans now manufacture more than 400 billion pounds of fertilizer each year -- much of which migrates from croplands into the atmosphere, waterways and oceans -- creating a suite of environmental problems ranging from coastal "dead zones" and toxic algal blooms to ozone pollution and human health issues.

Taylor said the new study indicates that in virtually every area of Earth's environment where there is substantially more dissolved organic carbon than nitrates, the nitrogen is sucked up by microbial communities. "But most of these nitrates are probably not locked away forever," said Taylor. "Instead, they are passed on to other ecosystems, essentially just moving pollution problems elsewhere in the environment."

The consistent relationship between nitrogen and carbon detected in the study was surprising, said Taylor, a doctoral student in CU-Boulder's ecology and evolutionary biology department. "The microbial communities that are controlling this link are found across the globe, whether in pristine environments or in areas of heavy pollution."

Taylor said the CU-Boulder team looked at available data from virtually every ecosystem type, ranging from high-altitude tundra and tropical forests to riparian areas and estuaries. "We looked at a large number of data sets, from sites as small as an office table to as large as entire oceans," said Taylor. "We saw the same correlation between nitrogen and carbon wherever we looked."

"The bottom line is that if there is sufficient organic carbon present, it keeps the nitrates at a low level," said Townsend. "By using available data, we can now make more accurate evaluations of when and where nitrate pollution may pop up." In the February 2010 issue of Scientific American, Townsend and co-author Robert Howarth of Cornell University wrote that "a single new atom of reactive nitrogen can bounce its way around these widespread environments, like a felon on a crime spree."

Nitrogen pollution is increasing globally in part because of fertilizer-intensive activities like biofuel synthesis and meat production that relies on the growth and cultivation of grains used to feed animals. In addition, the burning of fossil fuels -- which releases nitric oxide and nitrogen dioxide -- causes ground-level ozone pollution. Some scientists have ranked nitrogen pollution as one of the top threats to global biodiversity, Townsend said.

High nitrate concentrations in drinking water are also a potential hazard to human health and may cause several types of cancer and elevate risks for Alzheimer's disease and diabetes, while atmospheric nitrogen pollution can aggravate cardiopulmonary ailments, said Townsend. In addition, studies have shown that elevated nitrogen concentrations may increase the risks of several other human and wildlife diseases.

Taylor said the new study showed that "downscaling" from a global analysis of the carbon-nitrogen link to system-specific scenarios indicates the relationship between the elements typically becomes even stronger. "Analyzing the problem using these methods at smaller scales could allow ecosystem management teams to better predict and influence the fate of nitrates in the environment," Taylor said.

(Photo: Casey A. Cass/University of Colorado)

University of Colorado at Boulder
