Tuesday, May 11, 2010



The seasonal monsoon rains in Asia feed nearly half the world’s population, and when the rains fail to come, people can go hungry, or worse. A new study of tree rings provides the most detailed record yet of at least four epic droughts that have shaken Asia over the last thousand years, from one that may have helped bring down China’s Ming Dynasty in 1644 to another that caused tens of millions of people to starve to death in the late 1870s. The study, published this week in the journal Science, is expected not only to help historians understand how the environment has shaped the past, but also to aid scientists trying to understand the potential for large-scale disruptions of weather in a changing climate.

By sampling the wood of thousands of ancient trees across Asia, scientists at Columbia University’s Lamont-Doherty Earth Observatory assembled an atlas of past droughts, gauging their relative severity across vast expanses of time and space. “Global climate models fail to accurately simulate the Asian monsoon, and these limitations have hampered our ability to plan for future, potentially rapid and heretofore unexpected shifts in a warming world,” said Edward Cook, head of Lamont’s Tree Ring Lab, who led the study. “Reliable instrumental data goes back only until 1950. This reconstruction gives climate modelers an enormous dataset that may produce some deep insights into the causes of Asian monsoon variability.” There is some evidence that changes in the monsoon are driven at least in part by cyclical changes in sea-surface temperatures. Some scientists have speculated that warming global temperatures could alter these cycles and possibly make some of them more intense, but at this point there is no consensus on whether or how they might change.

For some tree species, rainfall determines the width of their annual growth rings, and these rings are what the scientists were able to read. The researchers spent more than 15 years traveling across Asia to locate trees old enough to provide long-term records. The hunt took them to more than 300 sites, from Siberia down to Indonesia and northern Australia, as far west as Pakistan and as far east as Japan. The project involved collaborations with numerous national governments, local villages and other university scientists. “It’s everything from low-land rain forests to high in the Himalayas,” said study coauthor Kevin Anchukaitis, a Lamont tree ring scientist. “You have a tremendous diversity of environment, climate influences and species.”
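The reconstruction method described above — calibrating ring widths against instrumental rainfall, then reading rainfall off older rings — can be sketched in a few lines. This is an illustrative toy only; the ring widths, rainfall values, and the simple linear fit are assumptions supplied here, not the study's actual data or statistical model.

```python
import numpy as np

# Toy dendroclimatic calibration: all numbers below are synthetic
# illustrations, not data from the Lamont study.

# Overlap period: years with both a measured ring width (mm) and
# instrumental monsoon-season rainfall (mm).
ring_width = np.array([0.8, 1.2, 1.0, 1.5, 0.6, 1.1])
rainfall = np.array([420.0, 610.0, 500.0, 720.0, 330.0, 560.0])

# Fit rainfall ~ a * ring_width + b over the instrumental overlap.
a, b = np.polyfit(ring_width, rainfall, 1)

# Apply the calibration to rings from before the instrumental record
# to estimate past rainfall; unusually narrow rings flag drought years.
old_rings = np.array([0.5, 0.9, 1.4])
reconstructed = a * old_rings + b
print(reconstructed.round(0))
```

Real reconstructions combine hundreds of site chronologies, detrend ring widths for tree age, and verify the calibration against withheld years, but the core idea is this regression from ring width to moisture.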

The tree-ring records in the study reveal at least four great droughts that are linked to catastrophic events in history. For starters, the study suggests that climate may have played a powerful role in the 1644 fall of China’s Ming dynasty. The tree rings provide additional evidence of a severe drought in China referenced in some historical texts as the worst in five centuries. This study narrows it down to a three-year period, 1638-1641. The drought was most sharply expressed in northeastern China, near Beijing, and is thought to have influenced peasant rebellions that hastened the demise of the Ming.

Another severe monsoon failure came in 1756-1768, coinciding with the collapse of kingdoms in what are now Vietnam, Myanmar and Thailand. The drought roiled political structures all the way to Siberia, and the tree rings also indicate that western India was severely affected. This drought is not documented in historical records; scientists first identified it in teak rings from Thailand, and later in Vietnamese cypress trees. Some historians have speculated that climate must have played a role for such sweeping political changes to have happened simultaneously; fragmentary accounts suggest that dry periods may have been punctuated with devastating floods. The study appears to provide an explanation for the so-called “strange parallels” that Victor Lieberman, a historian at the University of Michigan, has spent his career studying. “It provides confirmation that there are very strong climate links between monsoon regimes in India, Southeast Asia and southern China,” said Lieberman in an interview.

Then, the so-called East India drought hit in 1790-1796. This one appears to have been felt worldwide, spreading civil unrest and socioeconomic turmoil. For instance, in Mexico, water levels at Lake Pátzcuaro fell so much they gave rise to ownership disputes over the land that emerged. In Europe, drought led to crop failures that preceded the French Revolution. Famines hit India.

Perhaps the worst drought, the scientists found, was the Victorian-era “Great Drought” of 1876-1878. The effects were felt across the tropics; by some estimates, resulting famines killed up to 30 million people. According to the tree-ring evidence, the effects were especially acute in India, but extended as far away as China and present-day Indonesia. Colonial-era policies left regional societies ill-equipped to deal with the drought’s consequences, as historian Mike Davis details in his book Late Victorian Holocausts. Famine and cholera outbreaks at this time in colonial Vietnam fueled a peasant revolt against the French.

The study follows a related report last month by the Lamont tree-ring team suggesting that dramatic variations in the monsoon may have influenced the collapse of the ancient Khmer civilization at Angkor nearly 600 years ago, in what is now Cambodia. That paper, appearing in the Proceedings of the National Academy of Sciences, showed evidence of a mega-drought in the wider region around Angkor from the 1340s to the 1360s, followed by a more severe but shorter drought from the 1400s to 1420s. The droughts were interspersed with severe flooding, and the kingdom collapsed shortly after. The scientist who led that study, Brendan Buckley, coauthored the present drought atlas.

Scientists aren’t exactly sure how factors such as volcanic eruptions, greenhouse gases and variations in solar output combine to drive the many variations in the monsoon over the long term. Over shorter time periods, variations seem to be more closely linked to the El Niño-Southern Oscillation (ENSO), the warming and cooling of the tropical Pacific atmosphere-ocean system. Separate studies suggest that El Niño, the warm phase of ENSO, often coincides with a weak monsoon and droughts; it also seems linked to weather changes in Africa and parts of South America. The deadly 1876-1878 drought coincided with one of the most extreme El Niños of the last 150 years. However, the parallels are not perfect, so other factors may come into play at different times, including changes in snow cover over Asia and cycles of sea-surface temperature in the Indian Ocean. There is intense interest in how El Niño and other phenomena may be affected by a warming climate, and how monsoon extremes may affect the growing populations that depend on the rains. Southern China is currently suffering its worst drought in 80 to 100 years, bringing not only water shortages, but tensions with Southeast Asian nations downstream of its watersheds.

Data from the drought atlas is already providing information on particular regions, say the scientists. Using the Indonesia tree ring records, for example, Lamont scientist and study coauthor Rosanne D’Arrigo has reconstructed stream flow in Java’s Citarum river basin, a region that waters much of Indonesia’s rice. In a recent study in the journal Climate Dynamics, D’Arrigo found a close link between El Niños and weak monsoon rains or drought in Indonesia over the last 250 years.

The atlas is valuable to monsoon forecasters because the record is long enough and the spatial areas detailed enough that modelers can pick out short-term and long-term patterns, said Bin Wang, a meteorologist and monsoon modeler at the University of Hawaii who was not involved in the study. “It is extremely valuable for validating climate models’ simulation and understanding their origins in terms of model physics,” he said.

(Photo: Brendan Buckley)

Columbia University


We know more about distant galaxies than we do about the interior of our own planet. However, by observing distant earthquakes, researchers at the University of Calgary have revealed new clues about the top of the Earth’s core in a paper published in the May edition of the journal Physics of the Earth and Planetary Interiors.

Knowledge of the composition and state of this zone is key to unraveling the source of the Earth’s magnetic field and the formation of our planet.

“Some scientists have proposed a region of sediment accumulation at the top of the core, or even distinct liquid layers, but this study shows that the outer core is, in fact, well mixed,” says professor Dave Eaton, co-author of the paper. “This inaccessible region is composed of molten iron, nickel and other as-yet unknown lighter elements such as silicon, sulfur, carbon or oxygen.”

To help determine the materials that make up the Earth’s core, which lies 2,891 km below the surface, Eaton and co-author Catrina Alexandrakis, a University of Calgary PhD student, measured the seismic wave speed (the speed of sound) at the top of Earth’s core.

“Observation of distant earthquakes is one of the few tools that scientists have to investigate deep parts of the Earth,” says Alexandrakis. “This isn't the first time earthquake data has been used, but our research method is the most definitive to date.”

The researchers’ method is based on ‘listening’ to earthquakes on the other side of the planet using an approach that is akin to hearing a conversation across a whispering gallery, such as those in the domes of some large cathedrals.

Using a novel digital processing approach, they analyzed faint signals produced by 44 earthquakes and measured the sound speed at the top of Earth’s core with unprecedented accuracy.

Their results will help to guide research efforts at laboratories where core composition is studied by simulating extreme pressure and temperature conditions that exist in the Earth’s core.

University of Calgary



"Hot sounds" has one meaning to music fans and another to physicists. Count a team of researchers at Rice University among the latter, as they've discovered that acoustic waves traveling along ribbons of graphene might be just the ticket for removing heat from very tiny electronic devices.

A theoretical model by Rice physicist Boris Yakobson and his students has determined that graphene – a single-layer honeycomb of carbon atoms and the focus of much materials science and electronics research – can transmit thermal energy in waves. Given the elastic properties of graphene, long acoustic waves seem to work best. Because graphene scatters such waves only weakly, they can travel fast and far, unobstructed by one another or by imperfections in the material.

You'd never hear anything, no matter how close you put your ear to the nanoscale ribbon, Yakobson said. But to the researchers, the implications are clear as a bell.

"On this scale, graphene has promise for fundamental reasons," said Yakobson, a Rice professor in mechanical engineering and materials science and of chemistry and part of a program recently named No. 1 in the world for the quality of its materials science research. "The speed of sound is the speed with which energy can be carried away, because heat is transported, essentially, through vibrations."

Yakobson and his co-authors, former postdoctoral associate Enrique Muñoz, now an assistant professor in the Department of Mathematics and Physics at the University of Playa Ancha in Chile, and Jianxin Lu, a Rice graduate student, published their results last week in the online edition of the journal Nano Letters.

Muñoz, the paper's primary author, said the "nearly ballistic behavior" of phonons, quantum particles considered sound's equivalent to light's photons, makes the graphene material 10 times better than copper or gold at conducting heat.

The trick to making such graphene-enabled heat pipes effective will be to figure out where the heat goes when it gets to the end of the ribbon, an issue Lu continues to study for both nanoribbons and nanotubes. Without an effective interface, the propagating waves of phonons would simply bounce back.

"You need another medium," Yakobson said. "That's why I say this is more of a heat pipe than a heat sink, because at the far end of the graphene, you need contact with fluid, in a gas or liquid phase, so this wave energy can dissipate."

The power density of current microelectronics would, on a macro scale, be enough to heat a teapot to boiling in seconds. So it's becoming increasingly important to remove heat from sensitive instruments and release it to the air in a hurry.

"We're dealing with a very high heat density – maybe a kilowatt per centimeter square," Yakobson said. "When you want to barbecue, such heat is very useful. But in this case, you'd basically barbecue your device."
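The quoted numbers can be sanity-checked with a quick back-of-envelope calculation. Only the roughly 1 kW per square centimeter figure comes from the article; the teapot geometry and water volume below are assumptions for illustration.

```python
# Back-of-envelope check of the "boil a teapot in seconds" claim.
# Assumed for illustration: a 100 cm^2 teapot base and 1 kg of water.

power_density = 1000.0             # W/cm^2, the heat density quoted above
base_area = 100.0                  # cm^2, assumed teapot footprint
power = power_density * base_area  # 100 kW delivered to the pot

c_water = 4186.0                   # J/(kg*K), specific heat of water
mass = 1.0                         # kg of water, assumed
delta_t = 80.0                     # K, heating from 20 C to boiling at 100 C

energy = mass * c_water * delta_t  # ~335 kJ needed to reach boiling
seconds = energy / power
print(round(seconds, 1))           # a few seconds, consistent with the claim
```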

Finding a way to deal with transmitting heat away from ever-smaller devices is critical to sustaining Moore's Law, which accurately predicted (so far) that the number of transistors that could be placed on an integrated circuit would double about every two years.

"Another interesting application of these ribbons is in the construction of phonon waveguides," Muñoz added. "Graphene ribbons could be pieces in a nanoscale circuit where phonons, instead of electrons, serve as information carriers in a different computer architecture."

(Photo: Rice U.)

Rice University



New research indicates that one of the largest fresh-water floods in Earth's history happened about 17,000 years ago and inundated a large area of Alaska that is now occupied in part by the city of Wasilla, widely known because of the 2008 presidential campaign.

The event was one of at least four "megafloods" as Glacial Lake Atna breached ice dams and discharged water. The lake covered more than 3,500 square miles in the Copper River Basin northeast of Anchorage and Wasilla.

The megaflood that covered the Wasilla region released as much as 1,400 cubic kilometers, or 336 cubic miles, of water, enough to cover an area the size of Washington, D.C., to a depth of nearly 5 miles. That water volume drained from the lake in about a week and, at such great velocity, formed dunes higher than 110 feet, with at least a half-mile between crests. The dunes appear on topographical maps but today are covered by roads, buildings and other development.
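Those figures are internally consistent, as a rough calculation shows. The water volume and one-week drain time come from the article; the land area of Washington, D.C. (about 68 square miles, or ~177 km²) is an assumption supplied here.

```python
# Rough consistency check of the quoted megaflood figures.

volume_km3 = 1400.0    # water released, from the article
dc_area_km2 = 177.0    # ~68 sq mi, assumed area of Washington, D.C.

depth_km = volume_km3 / dc_area_km2
depth_miles = depth_km / 1.609
print(round(depth_miles, 1))          # ~4.9, matching "nearly 5 miles"

# Draining 1,400 km^3 in about a week implies a megaflood-scale flow.
seconds_per_week = 7 * 24 * 3600
flow_m3_per_s = volume_km3 * 1e9 / seconds_per_week
print(round(flow_m3_per_s / 1e6, 1))  # ~2.3 million m^3/s on average
```

That week-long average of roughly 2-3 million cubic meters per second comfortably exceeds the 1 million m³/s threshold that defines a megaflood.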

"Your mind doesn't get around dunes of that size. Obviously the water had to be very deep to form them," said Michael Wiedmer, an Anchorage native who is pursuing graduate studies in forest resources at the University of Washington.

Wiedmer is the lead author of a paper describing the Wasilla-area megaflood, published in the May edition of the journal Quaternary Research. Co-authors are David R. Montgomery and Alan Gillespie, UW professors of Earth and space sciences, and Harvey Greenberg, a computer specialist in that department.

By definition, a megaflood has a flow of at least 1 million cubic meters of water per second (a cubic meter is about 264 gallons). The largest known fresh-water flood, at about 17 million cubic meters per second, originated in Glacial Lake Missoula in Montana and was one of a series of cataclysmic floods that formed the Channeled Scablands of eastern Washington.

The megaflood from Glacial Lake Atna down what is now the Matanuska River to the Wasilla region might have had a flow of about 3 million cubic meters per second. Another suspected Atna megaflood along a different course to the Wasilla region, down the Susitna River, might have had a flow of about 11 million cubic meters per second. The researchers also found evidence for two smaller Atna megafloods, down the Tok and Copper rivers.

Wiedmer, who retired from the Alaska Department of Fish and Game in 2006, began the research in 2005 when he discovered pygmy whitefish living in Lake George, a glacial lake 50 miles from Anchorage. The lake has essentially emptied numerous times in its history and was not thought to support much life. Examination of physical traits indicates that those fish are more closely related to pygmy whitefish in three other mountain lakes, all remnants of Lake Atna, than they are to any others of that species. Their existence in Lake George, some distance from the other lakes, is one piece of evidence for a megaflood from Lake Atna.

"Lake Atna linked up with four distinct drainages, and we think that helped it act like a pump for freshwater organisms," he said.

The megaflood also could explain some of the catastrophic damage that occurred in the magnitude 9.2 Great Alaskan Earthquake of 1964. Wiedmer noted that much of Anchorage is built on marine sediments, and one layer of those sediments liquefied and collapsed, allowing the layer above to slide toward the sea. As the upper layer moved toward the water, structures built on top of it collapsed.

Though the marine sediments extend about 200 feet deep, the failure only occurred within a narrow 3-foot layer. Scientists later discovered that layer had been infused with fresh water, which was unexpected in sediments deposited under salt water. The ancient megaflood could account for the fresh water.

"We suspect that this is evidence of the flood that came down the Matanuska," Wiedmer said. "The location is right at the mouth of where the flood came down, and the time appears to be right."

(Photo: Michael Wiedmer)

University of Washington



A series of novel imaging agents could light up tumors as they begin to form – before they turn deadly – and signal their transition to aggressive cancers.

The compounds – fluorescent inhibitors of the enzyme cyclooxygenase-2 (COX-2) – could have broad applications for detecting tumors earlier, monitoring a tumor's transition from pre-malignancy to more aggressive growth, and defining tumor margins during surgical removal.

"We're very excited about these new agents and are moving forward to develop them for human clinical trials," said Lawrence Marnett, Ph.D., the leader of the Vanderbilt University team that developed the compounds, which are described in the May 1 issue of Cancer Research.

COX-2 is an attractive target for molecular imaging: it is not found in most normal tissues, but it is "turned on" in inflammatory lesions and tumors, Marnett explained.

"COX-2 is expressed at the earliest stages of pre-malignancy – in pre-malignant lesions, but not in surrounding normal tissue – and as a tumor grows and becomes increasingly malignant, COX-2 levels go up," Marnett said.

Compounds that bind selectively to COX-2 – and carry a fluorescent marker – should act as "beacons" for tumor cells and for inflammation.

Marnett and his colleagues previously demonstrated that fluorescent COX-2 inhibitors – which they have now dubbed "fluorocoxibs" – were useful probes for protein binding, but their early molecules were not appropriate for cellular or in vivo imaging.

"It was a real challenge to make a compound that is COX-2 selective (doesn't bind to the related COX-1 enzyme), has desirable fluorescence properties, and gets to the tissue in vivo," Marnett said.

To develop such compounds, Jashim Uddin, Ph.D., research assistant professor of Biochemistry, started with the "core" chemical structure of the anti-inflammatory medicines indomethacin and celecoxib. He then tethered various fluorescent parts to the core structure, ultimately synthesizing more than 200 compounds. The group tested each compound for its interaction with purified COX-2 and COX-1 proteins and then assessed promising compounds for COX-2 selectivity and fluorescence in cultured cells and in animals. Two compounds made the cut.

In studies led by senior research specialist Brenda Crews, the investigators evaluated the potential of these compounds for in vivo imaging using three different animal models: irritant-induced inflammation in the mouse foot pad; human tumors grafted into mice; and spontaneous tumors in mice.

In each case, the two fluorocoxibs – injected intravenously or into the abdominal cavity – accumulated in the inflamed or tumor tissue, giving it a fluorescent "glow."

To move the agents toward human clinical trials, the team will conduct additional toxicology and pharmacology testing and develop the tools for particular settings that are amenable to fluorescence imaging, such as skin or sites accessible by endoscope (e.g., esophagus and colon).

In the esophagus, for example, a pre-malignant lesion called Barrett's esophagus can transition to a low-grade dysplasia, then to a high-grade dysplasia, and finally to malignant cancer, which has a one-year survival of only 10 percent. For a patient with Barrett's esophagus, detecting the transition to dysplasia is critical. The problem is that dysplasia is not visibly different from the pre-malignant Barrett's lesion, so physicians collect random biopsy samples – which might miss areas of dysplasia.

"If instead, the physician could look through the endoscope and see a nest of cells lighting up with these fluorocoxibs – that is where they could biopsy," Marnett said.

"Because COX-2 levels increase during cancer progression in virtually all solid tumors, we think these imaging tools will have many, many different applications."

The investigators also are exploring using the compounds to target delivery of chemotherapeutic drugs directly to COX-2-expressing cells – by tethering an anti-cancer drug instead of a fluorescent marker to the COX-2 inhibitor core.

(Photo: Lawrence Marnett, Ph.D., and colleagues)

Vanderbilt University


Scientists have combined chemistry and biology research techniques to explain how certain bacteria grow structures on their surfaces that allow them to simultaneously cause illness and protect themselves from the body’s defenses.

The researchers are the first to reproduce a specific component of this natural process in a test tube – an essential step to fully understanding how these structures grow.

With the new method described, these and other researchers now can delve even deeper into the various interactions that must occur for these structures – called lipopolysaccharides – to form, potentially discovering new antibiotic targets along the way.

Lipopolysaccharides are composed primarily of polysaccharides – strings of sugars that are attached to bacterial cell surfaces. They help bacteria hide from the immune system and also serve as identifiers of a given type of bacteria, making them attractive targets for drugs. But before a drug can be designed to inhibit their growth, scientists must first understand how polysaccharides are developed in the first place.

“We were able to answer some of the questions about how components of this growth system do their jobs. This will allow us to more fully characterize lipopolysaccharide biosynthesis in vitro, a process which may shed light on useful targets for developing antibiotic agents,” said Robert Woodward, a graduate student in chemistry at Ohio State University and lead author of the study.

The study is published in the April 25 online edition of the journal Nature Chemical Biology.

The researchers used a harmless strain of Escherichia coli as a model for this work, which would apply to other E. coli strains and similar Gram-negative bacteria, a reference to how their cell walls are structured.

The surface of these bacteria houses the lipopolysaccharide, a three-part molecular structure embedded in the cell membrane. Two sections of this structure are well understood, but the third, called the O-polysaccharide, has to date been impossible to reproduce.

Two significant challenges have hindered research efforts in this area: The five sugars strung together to compose this section of the molecule are difficult to chemically prepare in the lab, and one of the key enzymes that initiates the structure’s growth process doesn’t easily function in a water-based solution in a test tube.

Ohio State synthetic chemists and biochemists put their heads together to solve these two problems, Woodward said.

To produce the five-sugar chain, the researchers started with a chemically prepared building block containing a single sugar and introduced enzymes that generated a five-sugar unit from that single carbohydrate.

“The first part was done chemically, and in the second part, we used the exact same enzymes that are normally present in a bacterial cell to transform the single sugar into a five-sugar string,” Woodward said.

Once these sugars join to make a five-sugar chain, a specific number of these chains are joined together to fully form the O-polysaccharide. A protein is required to connect those chains – the protein that doesn’t respond well to the test-tube environment.

Early attempts to produce this protein in the lab resulted in clumped structures that did not function, so Woodward and colleagues produced the protein in the presence of what are known as “chaperone” proteins.

“And basically what the chaperones do is help the protein fold into its correct state. We were able to produce the desired enzyme and also were able to verify that it was functional,” Woodward said.

This protein is called Wzy. It is a sugar polymerase, or an enzyme that interacts with the five-sugar chain to begin the process of linking several five-sugar units together.

Getting this far into the process was important, but the researchers also completed one additional step to define yet another protein’s role.

Wzy connected the five-sugar chains, but with no defined limit on the number of five-sugar units involved, a feature that does not match the natural process. On an actual bacterial cell wall, the number of chains connected to form the polysaccharide falls within a relatively narrow range.

So the scientists introduced another protein, called Wzz, to the mixture. This protein is known as a “chain length regulator.” With this protein in the mix, the lengths of the resulting polysaccharides were confined to a much narrower range.

“We were able to replicate the exact polysaccharide biosynthetic pathway in vitro, getting the correct lengths,” Woodward said. “This is important because now you can begin to look at a whole host of other properties in the system.”

The group has already started trying to answer one compelling question: whether the two proteins, Wzy and Wzz, must interact to fully achieve formation of the polysaccharide.

“We’ve shown in some preliminary results that they do interact, but we haven’t determined whether that interaction has any functional relevance,” Woodward said.

With this knowledge in hand, researchers now have access to information about how all three parts of the lipopolysaccharide – the large biomolecule on Gram-negative bacterial cell surfaces – are formed. One thing they already knew is that the entire assembly process takes place on an inner membrane; the finished structure is then exported to the outer membrane on the cell surface.

Now that scientists can reproduce formation of the lipopolysaccharide, they can more directly characterize the export process – a step in the pathway that serves as another potential antibiotic target, Woodward noted.

Ohio State University




Selected Science News. Copyright 2008.