Thursday, March 25, 2010

BREAKTHROUGH MAY LEAD TO ABUNDANT ADULT STEM CELLS FOR BONE MARROW TRANSPLANTS

In a leap toward making stem cell therapy widely available, researchers at the Ansary Stem Cell Institute at Weill Cornell Medical College (WCMC) have discovered that endothelial cells, the most basic building blocks of the vascular system, produce growth factors that can grow copious amounts of adult stem cells and their progeny over the course of weeks.

The finding will likely revolutionize the use of adult stem cells for bone marrow transplants, organ regeneration, and therapies for hearts, brains, skin and lungs, say the researchers.

Until now, adult stem cell cultures would die within four or five days despite best efforts to grow them.

"This is groundbreaking research with potential application for regeneration of organs and inhibition of cancer cell growth," said Dr. Antonio M. Gotto Jr., the Stephen and Suzanne Weiss Dean of WCMC and provost for medical affairs of Cornell.

This new finding sets forth the innovative concept that blood vessels are not just passive conduits to deliver oxygen and nutrients, but are also programmed to maintain and proliferate stem cells and their mature forms in adult organs. Using a novel approach to harness the potential of endothelial cells by "co-culturing" them with stem cells, the researchers discovered how to manufacture an unlimited supply of blood-related stem cells that may eventually ensure that anyone who needs a bone marrow transplant can get one.

The vascular-cell model established in this study could also be used to grow abundant functional stem cells from other organs, such as the brain, heart, skin and lungs.

The findings are published in the March 5 issue of the journal Cell Stem Cell.

In adults, organs have few naturally occurring stem cells, so using them for organ regeneration is impractical. Until now, strategies to expand cultures of adult stem cells, which invariably used animal-based growth factors, serum and genetically manipulated feeder cells, have only been marginally successful. This study, however, uses endothelial cells to propagate stem cells without added growth factors and serum.

"This study will have a major impact on the treatment of any blood-related disorder that requires a stem cell transplant," says senior author, Dr. Shahin Rafii, the Arthur B. Belfer Professor in Genetic Medicine and co-director of the Ansary Stem Cell Institute at WCMC.

Currently, stem cells derived from bone marrow or umbilical cord blood are used to treat patients who require bone marrow transplants. Most stem cell transplants are successful, but because of the shortage of genetically matched bone marrow and umbilical cord blood cells, many patients cannot benefit from the procedure.

If this vascular-based stem cell expansion strategy continues to be validated, physicians could use any source of hematopoietic (blood-producing) stem cells, propagate them exponentially, and bank the cells for transplantation into patients, Rafii said.

In a related study, researchers also discovered that endothelial cells not only could expand stem cells, but also instruct stem cells to generate mature differentiated progeny that could form immune cells, platelets, and red and white blood cells, all of which constitute functioning blood.

The "findings suggest that endothelial cells directly, through expression of stem-cell-active cytokines, promote stem cell reconstitution," said first author Dr. Jason Butler, a senior investigator at WCMC.

(Photo: Cornell U.)

Cornell University

BIRD WINGS MORPH QUICKLY TO ADAPT TO HUMAN-CREATED ENVIRONMENTAL CHANGES

Can species quickly evolve when humans rapidly change their habitats? The answer, in some cases, is yes.

A new study of North American songbirds finds that major changes in wing shape have occurred over the last 100 years in response to human-driven forest changes.

The study, based on measurements of 851 specimens from 21 songbird species at the Cornell Museum of Vertebrates and the Canadian Museum of Nature in Ottawa, Canada, was published online Feb. 1 in the journal Ecology.

Since 1900, the extent of mature coniferous forests in Québec has shrunk as a result of extensive clear-cutting. That has put some birds under greater pressure to travel further to find mates, food and habitat. To fly the greater distances, such northern songbirds as boreal chickadees, gray jays and Cape May warblers have developed pointier wings.

"It is better for birds to have pointy wings because it is more efficient for sustained flight," said the paper's author, André Desrochers, an animal ecologist at Université Laval in Québec City, Canada, who conducted the research while a visiting researcher at the Cornell Lab of Ornithology in 2009.

The forests of New England, on the other hand, have expanded since 1900 after heavy deforestation in the late 1800s. Over the past 100 years, that has prompted such songbirds as white-breasted nuthatches, hooded warblers and scarlet tanagers to develop shorter, less pointy wings, presumably because the birds no longer needed to travel as far in search of suitable habitat.

"If a bird is going to forage more in dense vegetation, they don't want cumbersome pointy wings," Desrochers added.

Desrochers pointed to several famous examples of selective pressure driving physical adaptation over short time scales: the beaks of Galapagos finches became shorter within a few years when an El Niño event changed precipitation patterns, which in turn altered the food supply in ways that favored finches with shorter beaks. And in England during the Industrial Revolution, soot darkened tree trunks, favoring darker moths that could blend in and putting pale moths at a disadvantage. More recently, as pollution has abated and tree trunks have lightened, paler moths have reappeared.

"We should not underestimate the ability of species to adapt" to changes to the environment by humans, said Desrochers, adding that conservation measures are still necessary. "Polar bears, for example, are not going to find some sort of breakthrough" as the Arctic ice floes that the bears rely on for traveling and hunting are melting, thereby threatening their survival, he added.

(Photo: A. Desrochers)

Cornell University

TALKING YOUR WAY TO HAPPINESS: WELL-BEING IS RELATED TO HAVING LESS SMALL TALK AND MORE SUBSTANTIVE CONVERSATIONS

Is a happy life filled with trivial chatter or with reflective and profound conversations? Psychological scientists Matthias R. Mehl, Shannon E. Holleran, and C. Shelby Clark from the University of Arizona, along with Simine Vazire of Washington University in St. Louis, investigated whether happy and unhappy people differ in the types of conversations they tend to engage in.

Volunteers wore an unobtrusive recording device called the Electronically Activated Recorder (EAR) over four days. This device periodically records snippets of sound as participants go about their lives. For this experiment, the EAR sampled 30 seconds of audio every 12.5 minutes, yielding a total of more than 20,000 recordings. Researchers then listened to the recordings and classified each conversation as trivial small talk or substantive discussion. In addition, the volunteers completed personality and well-being assessments.
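
A quick back-of-envelope calculation shows what that sampling protocol yields per person. The sketch below is plain arithmetic from the numbers reported above; the implied participant count is an inference, not a figure from the article.

```python
# Sampling yield of the EAR protocol described above.
SNIPPET_SECONDS = 30      # length of each recorded clip
INTERVAL_MINUTES = 12.5   # time between clips
DAYS = 4                  # how long participants wore the device

snippets_per_participant = DAYS * 24 * 60 / INTERVAL_MINUTES
hours_recorded = snippets_per_participant * SNIPPET_SECONDS / 3600
print(f"~{snippets_per_participant:.0f} snippets per participant")  # ~461
print(f"~{hours_recorded:.1f} hours of audio per participant")      # ~3.8

# The article reports more than 20,000 recordings in total, which
# implies on the order of 40-some participants at this rate.
print(f"~{20000 / snippets_per_participant:.0f} participants needed")  # ~43
```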

As reported in Psychological Science, a journal of the Association for Psychological Science, analysis of the recordings revealed clear differences between the groups. Greater well-being was related to spending less time alone and more time talking to others: the happiest participants spent 25% less time alone and 70% more time talking than the unhappiest participants. Beyond the amount of social interaction, the types of conversation also differed: the happiest participants had twice as many substantive conversations and one third as much small talk as the unhappiest participants.

These findings suggest that the happy life is social and conversationally deep rather than solitary and superficial. The researchers surmise that — though the current findings cannot identify the causal direction — deep conversations may have the potential to make people happier. They note, “Just as self-disclosure can instill a sense of intimacy in a relationship, deep conversations may instill a sense of meaning in the interaction partners.”

Psychological Science

LINK FOUND BETWEEN LIVER DISEASE AND FALLS IN OLD AGE

A study carried out at Newcastle University has discovered a link between liver disease and the rate of falls in elderly people.

The findings suggest that patients suffering from Primary Biliary Cirrhosis (PBC) are more than twice as likely to fall as non-sufferers, and could lead to targeted treatment that reduces physical and emotional harm as well as saving the NHS money.

The research, published in the current edition of the Quarterly Journal of Medicine, looked at patients suffering from PBC. Of those, 72% had had at least one fall, significantly more than in the control group. Fifty-five percent had suffered a fall in the past year, and 22% were regular fallers, having had more than one fall in the previous year.

In each case this was more than double the rate for people without PBC. Every year the NHS spends £1 billion treating falls injuries, about £7.8 million of it connected to PBC. Acting on this finding could save a proportion of that money, as well as spare patients the physical and emotional pain caused by falls.

Many of those who had fallen had suffered serious injuries, including fractures, and one in five of the PBC fallers had to be admitted to hospital as a result of their fall, compared with none in the control group.

PBC is a chronic condition which mainly affects women, with around 40,000 total sufferers in the UK. It often runs in families and is not caused by drinking too much alcohol.

The findings show that falls and the resulting injuries are prevalent in PBC and more common than previously recognized. Abnormalities in blood pressure control, poor balance and muscle weakness are believed to cause the falls, along with poor memory and a fear of falling that saps sufferers' confidence to leave the house.

Experts say that addressing postural dizziness, poor balance and lower-limb weakness using a multi-disciplinary approach has the potential to reduce falls, injuries and deaths, and as a result improve quality of life.

Dr James Frith, clinical research associate at Newcastle Biomedicine, a partnership between Newcastle University and the Newcastle Hospitals NHS Foundation Trust, who led the research, said: “Falls cause serious injuries to thousands of people every year. Now we have found this link we may be able to offer treatments to patients who are high risk and hopefully stop some of these falls from happening in the first place. A fall can cause huge emotional issues as well as the physical problems. People lose their confidence and independence after they fall.

“It appears that people with PBC are falling as a result of abnormal regulation of their blood pressure. There is a strong link between PBC and blood pressure regulation. In addition, falls also appear to be related to abnormal gait and balance, which is probably a result of abnormalities in the system which controls blood pressure.

“A recent piece of research suggested that many older women would prefer to die than suffer a broken hip, which would leave them feeling vulnerable and stop them living their life.”

Newcastle University

CARBON EMISSIONS OUTSOURCED TO DEVELOPING COUNTRIES

A new study by scientists at the Carnegie Institution for Science finds that over a third of the carbon dioxide emissions associated with consumption of goods and services in many developed countries are actually emitted outside their borders. Some countries, such as Switzerland, “outsource” over half of their carbon dioxide emissions, primarily to developing countries. The study finds that about 2.5 tons of carbon dioxide per person are associated with goods and services consumed in the U.S. but emitted somewhere else. For Europeans, the figure can exceed four tons per person. Most of these emissions are outsourced to developing countries, especially China.

“Instead of looking at carbon dioxide emissions only in terms of what is released inside our borders, we also looked at the amount of carbon dioxide released during the production of the things that we consume,” says co-author Ken Caldeira, a researcher in the Carnegie Institution’s Department of Global Ecology.

Caldeira and lead author Steven Davis, also at Carnegie, used published trade data from 2004 to create a global model of the flow of products across 57 industry sectors and 113 countries or regions. By allocating carbon emissions to particular products and sources, the researchers were able to calculate the net emissions “imported” or “exported” by specific countries.
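
The bookkeeping behind that calculation can be sketched in a few lines. The toy example below uses three regions and made-up numbers (loosely shaped to echo the study's qualitative findings) rather than the study's 113 regions and 57 sectors; it only illustrates how production-based and consumption-based totals diverge once emissions are traced to final consumers.

```python
# Toy consumption-based emissions accounting (illustrative numbers only).
# flows[a][b] = Mt CO2 emitted in region a while producing goods and
# services ultimately consumed in region b.
flows = {
    "China":  {"China": 4000, "US": 700,  "Europe": 500},
    "US":     {"China": 50,   "US": 5500, "Europe": 150},
    "Europe": {"China": 40,   "US": 120,  "Europe": 3800},
}

regions = list(flows)
for r in regions:
    production = sum(flows[r].values())              # emitted inside r
    consumption = sum(flows[a][r] for a in regions)  # driven by r's consumption
    net_imported = consumption - production          # > 0 means "outsourced"
    print(f"{r:6s}  production={production:5d}  "
          f"consumption={consumption:5d}  net imported={net_imported:+5d}")
```

With these numbers, China comes out a large net exporter of embodied emissions while the US and Europe are net importers, mirroring the pattern Davis and Caldeira describe.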

“Just like the electricity that you use in your home probably causes CO2 emissions at a coal-burning power plant somewhere else, we found that the products imported by the developed countries of western Europe, Japan, and the United States cause substantial emissions in other countries, especially China,” says Davis. “On the flip side, nearly a quarter of the emissions produced in China are ultimately exported.”

Over a third of the carbon dioxide emissions linked to goods and services consumed in many European countries actually occurred elsewhere, the researchers found. In Switzerland and several other small countries, outsourced emissions exceeded the amount of carbon dioxide emitted within national borders.

The United States is both a major importer and a major exporter of emissions embodied in trade. The net result is that the U.S. outsources about 11% of total consumption-based emissions, primarily to the developing world.

The researchers point out that regional climate policy needs to take into account emissions embodied in trade, not just domestic emissions.

“Our analysis of the carbon dioxide emissions associated with consumption in each country just states the facts,” says Caldeira. “This could be taken into consideration when developing emissions targets for these countries, but that’s a decision for policy-makers. One implication of emissions outsourcing is that a lot of the consumer products that we think of as being relatively carbon-free may in fact be associated with significant carbon dioxide emissions.”

“Where CO2 emissions occur doesn’t matter to the climate system,” adds Davis. “Effective policy must have global scope. To the extent that constraints on developing countries’ emissions are the major impediment to effective international climate policy, allocating responsibility for some portion of these emissions to final consumers elsewhere may represent an opportunity for compromise.”

The report is published online in the March 8, 2010 Proceedings of the National Academy of Sciences.

(Photo: Steven Davis/Carnegie Institution for Science)

Carnegie Institution for Science

ASTEROID KILLED OFF THE DINOSAURS, SAYS INTERNATIONAL SCIENTIFIC PANEL

The Cretaceous–Tertiary mass extinction, which wiped out the dinosaurs and more than half of species on Earth, was caused by an asteroid colliding with Earth and not massive volcanic activity, according to a comprehensive review of all the available evidence, published today in the journal Science.

A panel of 41 international experts, including UK researchers from Imperial College London, the University of Cambridge, University College London and the Open University, reviewed 20 years’ worth of research to determine the cause of the Cretaceous–Tertiary (KT) extinction, which happened around 65 million years ago. The extinction wiped out more than half of all species on the planet, including the dinosaurs, bird-like pterosaurs and large marine reptiles, clearing the way for mammals to become the dominant species on Earth.

Today’s review of the evidence shows that the extinction was caused by a massive asteroid slamming into Earth at Chicxulub (pronounced chick-shoo-loob) in Mexico. The asteroid, which was around 15 kilometres wide, is believed to have hit Earth with a force one billion times more powerful than the atomic bomb at Hiroshima. It would have blasted material at high velocity into the atmosphere, triggering a chain of events that caused a global winter, wiping out much of life on Earth in a matter of days.

Scientists have previously argued about whether the extinction was caused by the asteroid or by volcanic activity in the Deccan Traps in India, where there were a series of super volcanic eruptions that lasted approximately 1.5 million years. These eruptions spewed 1,100,000 km3 of basalt lava across the Deccan Traps, which would have been enough to fill the Black Sea twice, and were thought to have caused a cooling of the atmosphere and acid rain on a global scale.
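
Both comparisons are straightforward to sanity-check. The rough figures below (the Hiroshima yield and the Black Sea's volume) are commonly cited approximations, not numbers taken from the review itself.

```python
# Back-of-envelope checks on the two comparisons above.
HIROSHIMA_J = 6.3e13             # ~15 kilotons of TNT, in joules (approx.)
impact_energy = 1e9 * HIROSHIMA_J
print(f"Impact energy ~ {impact_energy:.1e} J")  # ~6.3e22 J

BLACK_SEA_KM3 = 5.5e5            # Black Sea volume, ~547,000 km^3 (approx.)
DECCAN_LAVA_KM3 = 1.1e6          # basalt volume quoted in the text
print(f"Lava / Black Sea = {DECCAN_LAVA_KM3 / BLACK_SEA_KM3:.1f}x")  # ~2.0x
```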

In the new study, scientists analysed the work of palaeontologists, geochemists, climate modellers, geophysicists and sedimentologists who have been collecting evidence about the KT extinction over the last 20 years. Geological records show that the event that triggered the extinction destroyed marine and land ecosystems rapidly, according to the researchers, who conclude that the Chicxulub asteroid impact is the only plausible explanation for this.

Despite evidence for relatively active volcanism in the Deccan Traps at the time, marine and land ecosystems showed only minor changes within the 500,000 years before the KT extinction. Furthermore, computer models and observational data suggest that the release of gases such as sulphur into the atmosphere after each volcanic eruption in the Deccan Traps would have had a short-lived effect on the planet, and would not have caused enough damage to create a rapid mass extinction of land and marine species.

Dr Joanna Morgan, co-author of the review from the Department of Earth Science and Engineering at Imperial College London, said:

“We now have great confidence that an asteroid was the cause of the KT extinction. This triggered large-scale fires, earthquakes measuring more than 10 on the Richter scale, and continental landslides, which created tsunamis. However, the final nail in the coffin for the dinosaurs happened when blasted material was ejected at high velocity into the atmosphere. This shrouded the planet in darkness and caused a global winter, killing off many species that couldn’t adapt to this hellish environment.”

Dr Gareth Collins, Natural Environment Research Council Fellow and another co-author from the Department of Earth Science and Engineering at Imperial College London, added:

“The asteroid was about the size of the Isle of Wight and hit Earth 20 times faster than a speeding bullet. The explosion of hot rock and gas would have looked like a huge ball of fire on the horizon, grilling any living creature in the immediate vicinity that couldn’t find shelter. Ironically, while this hellish day signalled the end of the 160 million year reign of the dinosaurs, it turned out to be a great day for mammals, who had lived in the shadow of the dinosaurs prior to this event. The KT extinction was a pivotal moment in Earth’s history, which ultimately paved the way for humans to become the dominant species on Earth.”

In the review, the panel sifted through past studies to analyse the evidence that linked the asteroid impact and volcanic activity with the KT extinction. One key piece of evidence was the abundance of iridium in geological samples around the world from the time of the extinction. Iridium is very rare in Earth’s crust and very common in asteroids. Immediately after the iridium layer, there is a dramatic decline in fossil abundance and species, indicating that the KT extinction followed very soon after the asteroid hit.

Another direct link between the asteroid impact and the extinction is evidence of ‘shocked’ quartz in geological records. Quartz becomes shocked when subjected to a sudden, massive force, and such minerals are found only at nuclear explosion sites and meteorite impact sites. The team say that an abundance of shocked quartz in rock layers all around the world at the KT boundary lends further weight to their conclusion that a massive meteorite impact happened at the time of the mass extinction.

The panel was able to discount previous studies that suggested that the Chicxulub impact occurred 300,000 years prior to the KT extinction. The researchers say that these studies had misinterpreted geological data that was gathered close to the Chicxulub impact site. This is because the rocks close to the impact zone underwent complex geological processes after the initial asteroid collision, which made it difficult to interpret the data correctly.

(Photo: ICL)

Imperial College London

NEW INSIGHT ON HOW FAST NICOTINE PEAKS IN THE BRAIN

Nicotine takes much longer than previously thought to reach peak levels in the brains of cigarette smokers, according to new research conducted at Duke University Medical Center.

Traditionally, scientists thought nicotine inhaled in a puff of cigarette smoke took a mere seven seconds to be taken up by the brain, and that each puff produced a spike of nicotine. Using positron emission tomography (PET) imaging, Duke investigators show, for the first time, that cigarette smokers actually experience a steady rise of brain nicotine levels over the course of smoking a whole cigarette.

The findings, online in the Early Edition of Proceedings of the National Academy of Sciences (PNAS), could lead to more effective treatments for smoking addiction.

“Previously it was thought that the puff-by-puff spikes of nicotine reaching the brain explained why cigarettes are so much more addictive than other forms of nicotine delivery, like the patch or gum,” says Jed Rose, PhD, director of the Duke Center for Nicotine and Smoking Cessation Research.

“Our work now calls into question whether addiction has to do with the puff-by-puff delivery of nicotine. It may actually depend in part on the overall rate at which nicotine reaches and accumulates in the brain, as well as the unique habit and sensory cues associated with smoking.”

Yet, when the researchers compared 13 dependent smokers to 10 non-dependent smokers, they were surprised to find the dependent smokers had a slower rate of nicotine accumulation in the brain. “This slower rate resulted from nicotine staying longer in the lungs of dependent smokers, which may be a result of the chronic effects of smoke on the lungs,” surmises Rose.

The difference in rate of nicotine accumulation in the brain doesn’t explain why some people become addicted to cigarettes and others don’t.

“Even if you correct for the speed of delivery, our study showed the non-dependent smokers eventually experienced the same high levels of nicotine in their brain as dependent smokers, yet they did so without becoming dependent. The real mystery is why.”

Rose says the absence of addiction in these smokers could be due to genetic differences, differences in the way they smoke, or differences in the psychological effects they derive. “We’re still not able to fully explain why these people are able to smoke without becoming addicted.”

Despite the questions raised, the study provides important insights into the role of the speed and level of brain nicotine levels, and which receptors in the brain are at work.

“Different receptors respond to nicotine at different levels of sensitivity,” says Rose. “Knowing the levels of nicotine that are really getting to the brain gives us clues as to which receptors are more likely to be important for the dependence-producing effects of cigarette smoking.”

Duke University Medical Center

UNSELFISH MOLECULES MAY HAVE HELPED GIVE BIRTH TO THE GENETIC MATERIAL OF LIFE

One of the biggest questions facing scientists today is how life began. How did non-living molecules come together in that primordial ooze to form the polymers of life? Scientists at the Georgia Institute of Technology have discovered that small molecules could have acted as “molecular midwives” in helping the building blocks of life’s genetic material form long chains and may have assisted in selecting the base pairs of the DNA double helix. The research appears in the online early edition of the Proceedings of the National Academy of Sciences beginning March 8, 2010.

“Our hypothesis is that before there were protein enzymes to make DNA and RNA, there were small molecules present on the pre-biotic Earth that helped make these polymers by promoting molecular self-assembly,” said Nicholas V. Hud, professor in the School of Chemistry and Biochemistry at the Georgia Institute of Technology. “We’ve found that the molecule ethidium can assist short oligonucleotides in forming long polymers and can also select the structure of the base pairs that hold together two strands of DNA.”

One of the biggest problems in getting a polymer to form is that, as it grows, its two ends often react with each other instead of forming longer chains, a problem known as strand cyclization. Hud and his team discovered that a molecule that binds between neighboring base pairs of DNA, known as an intercalator, can bring short pieces of DNA and RNA together in a manner that helps them join into much longer molecules.

“If you have the intercalator present, you can get polymers. With no intercalator, it doesn’t work, it’s that simple,” said Hud.

Hud and his team also tested how much influence a midwife molecule might have had on creating DNA’s Watson-Crick base pairs (A pairs with T, and G pairs with C). They found that the midwife used could determine the base pairing structure of the polymers that formed. Ethidium was most helpful for forming polymers with Watson-Crick base pairs. Another molecule that they call aza3 made polymers in which each A base is paired with another A.

“In our experiment, we found that the midwife molecules present had a direct effect on the kind of base pairs that formed. We’re not saying that ethidium was the original midwife, but we’ve shown that the principle of a small molecule working as a midwife is sound. In our lab, we’re now searching for the identity of a molecule that could have helped make the first genetic polymers, a sort of ‘unselfish’ molecule that was not part of the first genetic polymers, but was critical to their formation,” said Hud.

Georgia Institute of Technology

EXPLAINED: RADIATIVE FORCING

When people talk about global warming or the greenhouse effect, the main underlying scientific concept that describes the process is radiative forcing. And despite all the recent controversy over leaked emails and charges of poorly sourced references in the last Intergovernmental Panel on Climate Change report, the basic concept of radiative forcing is one on which scientists — whatever their views on global warming or the IPCC — all seem to agree. Disagreements come into play in determining the actual value of that number.

The concept of radiative forcing is fairly straightforward. Energy is constantly flowing into the atmosphere in the form of sunlight that always shines on half of the Earth’s surface. Some of this sunlight (about 30 percent) is reflected back to space and the rest is absorbed by the planet. And like any warm object sitting in cold surroundings — and space is a very cold place — some energy is always radiating back out into space as invisible infrared light. Subtract the energy flowing out from the energy flowing in, and if the number is anything other than zero, there has to be some warming (or cooling, if the number is negative) going on.
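
In standard textbook numbers (not figures from this article), that balance can be written out explicitly. Averaged over the whole sphere, incoming sunlight delivers a quarter of the solar constant per square meter, about 30 percent of which is reflected:

$$F_{\text{absorbed}} = \frac{S_0}{4}(1-\alpha) \approx \frac{1361}{4}(1-0.30) \approx 238\ \mathrm{W/m^2}, \qquad \Delta F = F_{\text{absorbed}} - F_{\text{outgoing}}.$$

Here $S_0 \approx 1361\ \mathrm{W/m^2}$ is the solar constant, $\alpha \approx 0.30$ is the reflected fraction mentioned above, and the factor of 4 is the ratio of the sphere's surface to the disc that actually intercepts sunlight. At equilibrium $\Delta F = 0$; a positive $\Delta F$ is a radiative forcing, and the planet warms until outgoing radiation catches up.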

It’s as if you have a kettle full of water, which is at room temperature. That means everything is at equilibrium, and nothing will change except as small random variations. But light a fire under that kettle, and suddenly there will be more energy flowing into that water than radiating out, and the water is going to start getting hotter.

In short, radiative forcing is a direct measure of the amount that the Earth’s energy budget is out of balance.

For the Earth’s climate system, it turns out that the level where this imbalance can most meaningfully be measured is the boundary between the troposphere (the lowest level of the atmosphere) and the stratosphere (the very thin upper layer). For all practical purposes, where weather and climate are concerned, this boundary marks the top of the atmosphere.

While the concept is simple, the analysis required to figure out the actual value of this number for the Earth right now is much more complicated and difficult. Many different factors have an effect on this balancing act, and each has its own level of uncertainty and its own difficulties in being precisely measured. And the individual contributions to radiative forcing cannot simply be added together to get the total, because some of the factors overlap — for example, some different greenhouse gases absorb and emit at the same infrared wavelengths of radiation, so their combined warming effect is less than the sum of their individual effects.

In its most recent report in 2007, the IPCC produced the most comprehensive estimate to date of the overall radiative forcing affecting the Earth today. Ronald Prinn, the TEPCO Professor of Atmospheric Science and director of MIT’s Center for Global Change Science, was one of the lead authors of that chapter of the IPCC’s Fourth Assessment Report. Radiative forcing “was very small in the past, when global average temperatures were not rising or falling substantially,” he explains. For convenience, most researchers choose a “baseline” year before the beginning of world industrialization — usually either 1750 or 1850 — as the zero point, and compute radiative forcing in relation to that base. The IPCC uses 1750 as its base year and it is the changes in the various radiative forcing agents since then that are counted.

Thus radiative forcing, measured in watts per square meter of surface, is a direct measure of the impact that recent human activities — including not just greenhouse gases added to the air, but also the impact of deforestation, which changes the reflectivity of the surface — are having on changing the planet’s climate. However, this number also includes any natural effects that may also have changed during that time, such as changes in the sun’s output (which has produced a slight warming effect) and particles spewed into the atmosphere from volcanoes (which generally produce a very short-lived cooling effect, or negative forcing).

Although all of the factors that influence radiative forcing have uncertainties associated with them, one factor overwhelmingly affects the uncertainty: the effects of aerosols (small airborne particles) in the atmosphere. That’s because these effects are highly complex and often contradictory. For example, bright aerosols (like sulfates from coal-burning) are a cooling mechanism, whereas dark aerosols (like black carbon from diesel exhausts) lead to warming. Also, adding sulfate aerosols to clouds leads to smaller but more abundant droplets that increase cloud reflectivity, thus cooling the planet.

“The error bars in the greenhouse gas forcing are very small,” Prinn says. “The biggest uncertainty in defining radiative forcing comes from aerosols.”

So, given all these factors and their ranges of error, what’s the answer? The current level of radiative forcing, according to the IPCC AR4, is 1.6 watts per square meter (with a range of uncertainty from 0.6 to 2.4). That may not sound like much, Prinn says, until you consider the total surface area of the Earth and multiply it out, which gives a total warming effect of about 800 terawatts, more than 50 times the world’s average rate of energy consumption, which is currently about 15 terawatts.
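
That multiplication is easy to reproduce. The short sketch below uses the standard mean Earth radius; it is plain arithmetic, not a calculation taken from the IPCC report.

```python
import math

# Scale the IPCC AR4 forcing estimate up to the whole planet.
RF = 1.6                 # W/m^2, best estimate of current radiative forcing
R_EARTH = 6.371e6        # mean Earth radius in meters
area = 4 * math.pi * R_EARTH ** 2          # ~5.1e14 m^2 of surface
total_watts = RF * area
print(f"Total surface area: {area:.2e} m^2")
print(f"Total forcing: {total_watts / 1e12:.0f} TW")        # ~816 TW
print(f"vs. 15 TW energy use: {total_watts / 15e12:.0f}x")  # ~54x
```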

(Photo: NASA Johnson Space Center (NASA-JSC))

Massachusetts Institute of Technology
