Wednesday, February 10, 2010

THE WARMEST DECADE

The decade 2000-2009 was the warmest since modern recordkeeping began, and 2009 was tied for the second warmest single year, a new analysis of global surface temperature shows. The analysis, conducted each year by the NASA Goddard Institute for Space Studies (GISS), an affiliate of the Earth Institute, also shows that in half the world, the Southern Hemisphere, 2009 was the warmest year yet recorded.

"There's always an interest in the annual temperature numbers and on a given year's ranking, but usually that misses the point," said James Hansen, the director of GISS. "There's substantial year-to-year variability of global temperature. But when we average temperature over five or ten years to minimize that variability, we find that global warming is continuing unabated."

2008 was the coolest year of the decade, due to strong cooling of the tropical Pacific Ocean, but 2009 saw a return to near-record global temperatures. The past year’s temperatures were only barely surpassed by 2005, the warmest recorded year, and tied with a cluster of others — 1998, 2002, 2003, 2006 and 2007 — as second warmest. Through the last three decades, the GISS surface temperature record shows an upward trend of about 0.2°C (0.36°F) per decade. The greatest increases in 2000-2009 came in the Arctic and part of Antarctica. Except for a leveling off between the 1940s and 1970s, Earth’s surface temperatures have increased steadily since 1880, when modern scientific instrumentation became available to monitor temperatures precisely. In total, average global temperatures have increased by about 0.8°C (1.5°F) since 1880.

At least in the United States, the near-record temperature of 2009 would seem all the more striking because it includes a notably frigid December in much of North America, with heavy snowstorms and freezing of crops in the South. High air pressure over the Arctic weakened the east-west flow of the jet stream while increasing its tendency to blow from north to south. The result was an unusual pattern in which cold Arctic air rushed into North America and warmer mid-latitude air shifted toward the north. "Of course, the contiguous 48 states cover only 1.5 percent of the world area, so the U.S. temperature does not affect the global temperature much," said Hansen.

The long-term rise since 1880 is “the important number to keep in mind," said Gavin Schmidt, another GISS climatologist. "In contrast, the difference between, say, the second and sixth warmest years is trivial since the known uncertainty, or noise, in the temperature measurement is larger than some of the differences between the warmest years."

Climate scientists agree that rising levels of carbon dioxide and other greenhouse gases trap incoming heat near the surface of the earth and are the key factors causing the rise in temperatures. However, these gases are not the only factors that can affect global temperatures. Other key factors, including changes in the sun's irradiance, oscillations of tropical sea-surface temperature such as the El Niño-La Niña cycle, and changes in the levels of airborne particles, can also cause slight increases or decreases in the planet's temperature. But overall, the evidence suggests that these effects are not enough to account for the global warming observed since 1880.

El Niño and La Niña are prime examples of how the oceans can affect global temperatures. These terms describe abnormally warm or cool sea surface temperatures in the tropical Pacific, caused by changing ocean currents. Global temperatures tend to decrease in the wake of La Niña, which occurs when upwelling cold water off the coast of Peru spreads westward in the equatorial Pacific Ocean. La Niña lingered during the early months of 2009 and gave way to the beginning of an El Niño phase in October that is expected to continue in 2010. An especially powerful El Niño cycle in 1998 is thought to have contributed to the unusually high temperatures that year, and Hansen's group estimates that there is a good chance 2010 will be the warmest year on record if the current El Niño persists. At most, scientists estimate that El Niño and La Niña can cause global temperatures to deviate by about 0.2°C (0.36°F).

Warmer surface temperatures also tend to occur during particularly active parts of the solar cycle, known as solar maximums, while slightly cooler temperatures occur during lulls in activity, called minimums. A deep solar minimum has made sunspots a rarity in the last few years. Such lulls in solar activity, which can reduce the total amount of energy given off by the sun by about a tenth of a percent, typically cause surface temperatures to dip slightly. Overall, solar minimums and maximums are thought to produce no more than 0.1°C (0.18°F) of cooling or warming.
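
As a rough check on that figure (a back-of-envelope estimate, not part of the GISS analysis), treat the Earth as being in simple radiative equilibrium, where emitted power scales as the fourth power of temperature. A fractional change in absorbed sunlight then produces a fractional temperature change about a quarter as large:

\[
\frac{\Delta T}{T} \approx \frac{1}{4}\,\frac{\Delta S}{S}
\quad\Rightarrow\quad
\Delta T \approx \frac{288\ \mathrm{K} \times 0.001}{4} \approx 0.07\ \mathrm{K},
\]

which, ignoring feedbacks, is consistent in order of magnitude with the roughly 0.1°C attributed to the solar cycle.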

"In 2009, it was clear that even the deepest solar minimum in the period of satellite data hasn't stopped global warming from continuing," said Hansen.

Small particles in the atmosphere, called aerosols, can also affect the climate. Volcanoes are powerful sources of sulfate aerosols that counteract global warming by reflecting incoming solar radiation back into space. Large eruptions in recent decades, at El Chichón in Mexico in 1982 and Mount Pinatubo in the Philippines in 1991, caused global dips in surface temperature of as much as 0.3°C (0.54°F). But volcanic eruptions in 2009 did not have a significant impact. Meanwhile, other types of aerosols, often produced by burning fossil fuels, can change surface temperatures by either reflecting or absorbing incoming sunlight. Hansen's group estimates that aerosols probably counteract about half of the warming produced by manmade greenhouse gases, but he cautions that better measurements of these elusive particles are needed.

To conduct its analysis, GISS uses publicly available information from three sources: weather data from more than 1,000 meteorological stations around the world; satellite observations of sea-surface temperatures; and Antarctic research station measurements. These three data sets are loaded into a computer program (available for public download from the GISS website). The program calculates trends in temperature anomalies — not absolute temperatures, but rather changes relative to the average temperature for the same month during the period of 1951-1980. Other research groups also track global temperature trends, but use different analysis techniques. The Met Office Hadley Centre, based in the United Kingdom, uses similar input measurements as GISS, for example, but omits large areas of the Arctic and Antarctic, where monitoring stations are sparse. Although the two methods produce slightly different results in the annual rankings, the decade-long trends in the two records are essentially identical. Previously, the World Meteorological Organization issued a similar report in December 2009, saying that 2000-2009 would rank as the warmest decade on record, and 2009 would probably be among the warmest years.
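
The heart of that bookkeeping is simple: subtract each calendar month's 1951-1980 average from every observation. The sketch below is a minimal illustration, not the actual GISTEMP program, and the station record in it is invented; the real code also combines stations, adjusts for urban effects, and grids and area-weights the data.

from collections import defaultdict

def monthly_anomalies(records):
    """records: list of (year, month, temperature_C) tuples for one station."""
    # 1. Compute the 1951-1980 mean for each calendar month.
    by_month = defaultdict(list)
    for year, month, temp in records:
        if 1951 <= year <= 1980:
            by_month[month].append(temp)
    baseline = {m: sum(v) / len(v) for m, v in by_month.items()}

    # 2. Express every observation as a departure from that month's baseline.
    return [(year, month, temp - baseline[month])
            for year, month, temp in records if month in baseline]

# Example with an invented station record (January temperatures in deg C).
obs = [(1960, 1, -3.1), (1970, 1, -2.8), (2009, 1, -1.2)]
for year, month, anomaly in monthly_anomalies(obs):
    print(year, month, round(anomaly, 2))

Averaging such anomalies over many stations with area weighting, and then over five- or ten-year windows, yields the smoothed global trends Hansen describes.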

"There's a contradiction between the results shown here and popular perceptions about climate trends," Hansen said. "In the last decade, global warming has not stopped."

(Photo: NASA)

The Earth Institute, Columbia University

POWER FROM DOWN UNDER

Grants recently awarded to MIT researchers by the U.S. Department of Energy (DoE) could help to pave the way for a method of generating electricity that produces no greenhouse gas emissions, and that could become a major contributor to meeting the world’s energy needs.

Most energy analysts agree that geothermal energy — tapping the heat of bedrock deep underground to generate electricity — has enormous potential because it is available all the time, almost anywhere on Earth, and there is enough of it available, in theory, to supply all of the world’s energy needs for many centuries. But there are still some unanswered questions about it that require further research. DoE last year awarded $336 million in grants to help resolve the remaining uncertainties, and three of those grants, totaling more than $2 million, went to MIT researchers.

Everywhere on Earth, a few miles below the surface, the bedrock is hot, and the deeper you go the hotter it gets. In some places, water heated by this hot rock comes naturally to the surface or close to it, where it can be easily tapped to drive a turbine and generate electricity.

But where naturally heated water is not available at or near the surface, this process can be recreated by drilling one very deep well to inject water into the ground, and another well nearby to pump that water back to the surface after it has been heated by passing through cracks in the hot rock. Such systems are known as Engineered Geothermal Systems, or EGS.
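
To get a feel for the scale of such a system (illustrative numbers, not figures from the MIT work), the thermal power carried by the circulating water is its mass flow rate times its heat capacity times the temperature gain picked up in the rock:

\[
P_{\mathrm{thermal}} = \dot{m}\,c_p\,\Delta T \approx 50\ \mathrm{kg/s} \times 4.2\ \mathrm{kJ/(kg\,K)} \times 150\ \mathrm{K} \approx 31\ \mathrm{MW},
\]

of which a binary-cycle plant operating at these modest temperatures might convert perhaps 10 to 15 percent into electricity.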

A 2006 report by an 18-member team led by MIT Professor Jefferson Tester (now emeritus, and working at Cornell University) found that EGS could supply, using existing technology, more than 2,000 times the total annual energy use of the United States, and perhaps 10 times that much with improved technology.

Herbert Einstein, professor of civil and environmental engineering, was the recipient of one of the new DoE grants. Einstein studies fracturing in rocks, which is crucial in creating a new EGS site: After drilling the necessary wells, water must be pumped into one of them under very high pressure to create a network of fractures in the deep rock to allow the water to move from the injection well to the extraction well. But exactly how that process works at different depths and in different types of rock is not yet well understood.

Einstein is developing computer programs that can aid in the evaluation of geothermal sites, assessing both the potential power output and any potential risks, such as the triggering of seismic activity. Such triggering has already resulted in the premature closing two years ago of one test installation, in Basel, Switzerland, after some minor earthquakes (the largest being magnitude 3.4) were felt in the area.

The planned software is based on programs Einstein has developed to assess proposed tunnel sites and landslide risks. “What these decision tools do is allow you to consider the uncertainties, of which there are a lot,” he says.
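
The article does not describe the software itself, but the flavor of an uncertainty-aware evaluation can be suggested with a toy Monte Carlo sketch. The parameter ranges and the crude power model below (the same mass-flow-times-heat-capacity estimate sketched earlier) are invented for illustration and are not Einstein's actual tools.

import random

def simulate_site(n_trials=10_000):
    """Propagate uncertain site parameters into a distribution of electric output."""
    outputs_mw = []
    for _ in range(n_trials):
        flow_kg_s = random.uniform(20, 80)       # achievable circulation rate (uncertain)
        delta_t_k = random.uniform(100, 200)     # temperature gain in the rock (uncertain)
        efficiency = random.uniform(0.08, 0.15)  # thermal-to-electric conversion (uncertain)
        thermal_mw = flow_kg_s * 4.2e3 * delta_t_k / 1e6   # m_dot * c_p * delta_T, in MW
        outputs_mw.append(thermal_mw * efficiency)
    outputs_mw.sort()
    return {"p10_MW": round(outputs_mw[n_trials // 10], 1),
            "median_MW": round(outputs_mw[n_trials // 2], 1),
            "p90_MW": round(outputs_mw[9 * n_trials // 10], 1)}

print(simulate_site())

A real decision tool would also fold in fracture behavior, drilling costs and the probability of induced seismicity, which is where the tunnel and landslide experience comes in.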

As is the case with tunnel construction, a great deal of the uncertainty with EGS has to do with the kind of rock encountered in the drilling and how that rock will fracture under pressure. Einstein’s software will be adapted to address the higher pressures encountered in the very deep boreholes needed for geothermal fields.

Einstein suggests that the risks from seismic triggering are largely sociological, because the events seen so far, at least, are too small to produce any real damage.

“I think that’s a red herring,” agrees Professor of Geophysics M. Nafi Toksoz, another DoE geothermal grant recipient, referring to the issue of induced earthquakes. “We know that every time we drill into the Earth, we alter the state of the stress in the rock.” As a result, small earthquakes do occur regularly near oil and gas wells, deep mine shafts for coal or minerals, and even from the pressure of water when a reservoir fills up behind a new dam. “Wherever there are existing faults, they will induce mostly minor quakes.”

Toksoz’s grant (with research scientists Michael Fehler and Haijian Zhang of MIT’s Department of Earth, Atmospheric and Planetary Sciences as collaborators) will fund research at test EGS installations in Utah, Nevada and California to develop ways of detecting and analyzing the fractures that form in the deep rock and how water actually flows through them.

“Enhanced or Engineered Geothermal Systems (EGS) can be an enormous contributor to the world’s renewable energy portfolio,” says Curt Robinson, executive director of the Geothermal Resources Council in Davis, Calif., a nonprofit educational group. He says EGS could play a significant role in meeting energy needs if there are better ways of analyzing potential sites to improve the odds of success; government policies to create a favorable business climate for investors; and better technologies for identifying good sites and for lower-cost drilling under high-temperature conditions.

Einstein says geothermal electricity has the potential to take the place of essentially all stationary (that is, not transportation-related) power sources. “Basically, you could replace everything that’s around,” including the “baseload” power plants that can operate at any time, unlike sources such as solar or wind power. “So that’s certainly very promising,” he says. “It’s not completely infinite, but for all practical purposes it is.”

One of the remaining questions in practice is whether an EGS plant will lose efficiency over time, as minerals carried by the water begin to clog up the cracks in the deep rock. While test plants have been operated in the United States and elsewhere for limited amounts of time, there has not yet been such a plant that has operated over a span of years, so “we don’t know how long these things will work at their maximum output,” Einstein says, and if their performance begins to drop, “can you restimulate the well?” to get it back to original levels. These questions require further research.

Seismologist Fehler, recipient of another of the DoE grants, uses small earthquakes as a tool: Like ultrasound used to get images inside the body, the natural vibrations from small earthquakes can be used as a way to probe the subsurface to understand how water is moving deep below the ground. “It’s a remote-sensing tool,” he says.

This is similar to a method used in oil exploration, where the subsurface is analyzed by measuring the way vibrations from explosions or surface “thumpers” are distributed through the soil and rock below. An array of microphones or micro-seismometers picks up the vibrations at various points, and computers then reconstruct an image of the subsurface from the relative timing of the vibrations from the source to the receiver.
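
As a cartoon of that relative-timing idea (a deliberately simplified, hypothetical example with a uniform wave speed, a flat 2D geometry and only four receivers, whereas real microseismic processing uses 3D velocity models and many more sensors), an event can be located by searching for the point whose predicted arrival-time differences best match the recorded ones:

import math

VELOCITY = 3000.0  # assumed seismic wave speed in m/s (illustrative)
receivers = [(0.0, 0.0), (1200.0, 0.0), (0.0, 1200.0), (1200.0, 1200.0)]

def travel_time(src, rec):
    return math.dist(src, rec) / VELOCITY

# Synthetic "observed" arrivals generated from a hidden true source.
true_source = (700.0, 350.0)
observed = [travel_time(true_source, r) for r in receivers]

# Grid search over candidate locations, comparing arrival-time differences
# relative to the first receiver (the absolute origin time is unknown).
best, best_misfit = None, float("inf")
for x in range(0, 1201, 10):
    for y in range(0, 1201, 10):
        predicted = [travel_time((x, y), r) for r in receivers]
        misfit = sum(((p - predicted[0]) - (o - observed[0])) ** 2
                     for p, o in zip(predicted, observed))
        if misfit < best_misfit:
            best, best_misfit = (x, y), misfit

print("estimated source location:", best)  # recovers (700, 350)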

At most geothermal installations built so far, Fehler says, earthquakes have been so small that “you can record them, but you don’t feel anything at the surface.” But seismic triggering remains an issue, because the social, economic and political reaction to felt quakes has already affected companies’ ability to continue operations, he says. “We have to figure out how to try to control it,” he says, mostly by choosing sites carefully, away from population centers. The U.S. Department of Energy is holding a workshop on that question in February.

The basic principles have been demonstrated. “We know it can be done,” Toksoz says. “But quite a lot of technology still needs to be proven in terms of commercial feasibility.” The remaining questions are essentially economic and engineering ones related to the costs and difficulties of deep drilling, not basic technology, he says.

(Photo: Geothermal Education Office, Tiburon, California)

Massachusetts Institute of Technology

GEOSCIENTISTS DRILL DEEPEST HOLE IN OCEAN CRUST IN SCIENTIFIC OCEAN DRILLING HISTORY

For eight weeks beginning in November 2009, off the coast of New Zealand, an international team of 34 scientists and 92 support staff and crew on board the scientific drilling vessel JOIDES Resolution (JR) was at work investigating sea-level change in a region called the Canterbury Basin. It proved to be a record-breaking trip for the research team.

The JR is one of the primary research vessels of an international research program called the Integrated Ocean Drilling Program (IODP). This research took place during IODP Expedition 317.

IODP is supported by two lead agencies, the U.S. National Science Foundation (NSF) and Japan's Ministry of Education, Culture, Sports, Science, and Technology.

At present 10 percent of the world's population lives within 10 meters of sea level. Current climate models predict a 50-centimeter to more than one-meter rise in sea level over the next 100 years, posing a threat to inhabitants of low-lying coastal communities around the world.

To better understand what drives changes in sea level and how humans are affecting this change, scientists are "looking to our past for answers and digging back as far as 35 million years into the Earth's history to understand these dynamic processes," says Rodey Batiza of the NSF's division of ocean sciences.

From November 4, 2009 to January 4, 2010, the IODP research team drilled four sites in the seafloor. One site marked the deepest hole drilled by the JR on the continental shelf (1,030 meters), and another was the deepest hole drilled on a single expedition in the history of scientific ocean drilling (1,927 meters).

Another record was broken for the deepest sample taken by scientific ocean drilling for microbiological studies (1,925 meters).

A fourth record was achieved when the team recovered sediment from the shallowest water site (85 meters) ever drilled for science by the JR.

"This was one of only two JR expeditions that have attempted to drill on a continental shelf--this was not a routine operation for this ship," says co-chief scientist Craig Fulthorpe of the University of Texas at Austin, who led the expedition with co-chief scientist Koichi Hoyanagi of Shinshu University in Japan.

The unstable, sandy composition of the sediments and the relatively shallow water of the shelf environment present unique challenges for a floating drilling platform like the JR, which relies on thrusters to maintain position and requires special technology to accommodate wave motion.

"We never expected we would be able to drill this deep in such a difficult environment," says Fulthorpe.

Canterbury Basin is part of a worldwide array of IODP drilling investigations designed to examine global sea level changes during Earth's "Icehouse" period, when sea level was largely controlled by changes in glaciation at the poles.

Before Canterbury, IODP sea level change studies included sites near the New Jersey coast, the Bahamas, Tahiti, and on the Marion Plateau off northeastern Australia.

Canterbury Basin was selected as a premier site for further sea level history investigations because it expanded the geographic coverage needed to study a global process and because it displays sequence patterns similar to those seen in the New Jersey studies.

Data from both the Canterbury Basin and the New Jersey shelf IODP expeditions will be integrated to provide a better understanding of global trends in sea level over time.

Global sea level has changed in the Earth's past; these changes are influenced by the melting of polar ice caps, which increases the volume of water in the ocean.

Locally, relative sea level can also change as a result of tectonic activity, which causes vertical movement in the Earth's crust.

Together, glaciation and tectonic forces create a complex system that can be difficult to simulate with climate models. This necessitates field studies like the Canterbury Basin expedition, say geologists, to directly analyze samples.

The Canterbury Basin expedition set out to recover seafloor sediments that would capture a detailed record of changes in sea level that occurred during the last 10 to 12 million years, a time when global sea level change was largely controlled by glacial/interglacial ice volume changes.

The research team also recovered samples documenting changes in ocean circulation that began when movements in Earth's tectonic plates separated Antarctica from Australia, creating a new seaway between the two continents about 34 million years ago.

Canterbury Basin is one of the best sites in the world for this type of survey because it is located in a tectonically active region and therefore has a relatively high rate of sedimentary deposition; the accumulating sediments, like the pages of a book, record detailed events in Earth's climate history.

Beyond breaking records, the IODP Canterbury Basin expedition achieved its goal of recovering a 10-million-year record of sea level fluctuations, with one drill hole extending back 35 million years.

Cores revealed cyclic changes in sediment type and physical properties (such as magnetic susceptibility) that are believed to reflect switches between glacial and interglacial time periods.

Even longer cycles were originally identified using seismic images generated using sound waves.

Understanding the relationship between these seismic "sequences" and global sea-level change is an important objective for post-expedition research, say expedition geologists.

(Photo: William Crawford, IODP/TAMU)

National Science Foundation (NSF)

MICROBES PRODUCE FUELS DIRECTLY FROM BIOMASS

A collaboration led by researchers with the U.S. Department of Energy’s Joint BioEnergy Institute (JBEI) has developed a microbe that can produce an advanced biofuel directly from biomass. Deploying the tools of synthetic biology, the JBEI researchers engineered a strain of Escherichia coli (E. coli) bacteria to produce biodiesel fuel and other important chemicals derived from fatty acids.

“The fact that our microbes can produce a diesel fuel directly from biomass with no additional chemical modifications is exciting and important,” says Jay Keasling, the Chief Executive Officer for JBEI, and a leading scientific authority on synthetic biology. “Given that the costs of recovering biodiesel are nowhere near the costs required to distill ethanol, we believe our results can significantly contribute to the ultimate goal of producing scalable and cost effective advanced biofuels and renewable chemicals.”

Keasling led the collaboration, which was made up of a team from JBEI’s Fuels Synthesis Division that included Eric Steen, Yisheng Kang and Gregory Bokinsky, and a team from LS9, a privately-held industrial biotechnology firm based in South San Francisco. The LS9 team was headed by Stephen del Cardayre and included Zhihao Hu, Andreas Schirmer and Amy McClure. The collaboration has published the results of their research in the January 28, 2010 edition of the journal Nature. The paper is titled, “Microbial Production of Fatty Acid-Derived Fuels and Chemicals from Plant Biomass.”

A combination of ever-increasing energy costs and global warming concerns has created an international imperative for new transportation fuels that are renewable and can be produced in a sustainable fashion. Scientific studies have consistently shown that liquid fuels derived from plant biomass are one of the best alternatives if a cost-effective means of commercial production can be found. Major research efforts to this end are focused on fatty acids – the energy-rich molecules in living cells that have been dubbed nature’s petroleum.

Fuels and chemicals have been produced from the fatty acids in plant and animal oils for more than a century. These oils now serve as the raw materials not only for biodiesel fuel, but also for a wide range of important chemical products including surfactants, solvents and lubricants.

“The increased demand and limited supply of these oils has resulted in competition with food, higher prices, questionable land-use practices and environmental concerns associated with their production,” Keasling says. “A more scalable, controllable, and economic alternative route to these fuels and chemicals would be through the microbial conversion of renewable feedstocks, such as biomass-derived carbohydrates.”

E. coli is a well-studied microorganism whose natural ability to synthesize fatty acids and exceptional amenability to genetic manipulation make it an ideal target for biofuels research. The combination of E. coli with new biochemical reactions realized through synthetic biology enabled Keasling, Steen and their colleagues to produce structurally tailored fatty esters (biodiesel), alcohols and waxes directly from simple sugars.

“Biosynthesis of microbial fatty acids produces fatty acids bound to a carrier protein, the accumulation of which inhibits the making of additional fatty acids,” Steen says. “Normally E. coli doesn’t waste energy making excess fat, but by cleaving fatty acids from their carrier proteins, we’re able to unlock the natural regulation and make an abundance of fatty acids that can be converted into a number of valuable products. Further, we engineered our E. coli to no longer eat fatty acids or use them for energy.”

After successfully diverting fatty acid metabolism toward the production of fuels and other chemicals from glucose, the JBEI researchers engineered their new strain of E. coli to produce hemicellulases – enzymes that break down hemicellulose, the complex sugars that are a major constituent of cellulosic biomass and a prime repository for the energy locked within plant cell walls.

“Engineering E. coli to produce hemicellulases enables the microbes to produce fuels directly from the biomass of plants that are not used as food for humans or feed for animals,” Steen says. “Currently, biochemical processing of cellulosic biomass requires costly enzymes for sugar liberation. By giving the E. coli the capacity to ferment both cellulose and hemicellulose without the addition of expensive enzymes, we can improve the economics of cellulosic biofuels.”

The JBEI team is now working on maximizing the efficiency and the speed by which their engineered strain of E. coli can directly convert biomass into biodiesel. They are also looking into ways of maximizing the total amount of biodiesel that can be produced from a single fermentation.

“Productivity, titer and efficient conversion of feedstock into fuel are the three most important factors for engineering microbes that can produce biofuels on an industrial scale,” Steen says. “There is still much more research to do before this process becomes commercially feasible.”

(Photo: Jonathan Remis, JBEI)

Lawrence Berkeley National Laboratory

DRAMATIC TRANSFORMATION: RESEARCHERS DIRECTLY TURN MOUSE SKIN CELLS INTO NEURONS, SKIPPING IPS STAGE

Even Superman needed to retire to a phone booth for a quick change. But now scientists at the Stanford University School of Medicine have succeeded in the ultimate switch: transforming mouse skin cells in a laboratory dish directly into functional nerve cells with the application of just three genes. The cells make the change without first becoming a pluripotent type of stem cell — a step long thought to be required for cells to acquire new identities.

The finding could revolutionize the future of human stem cell therapy and recast our understanding of how cells choose and maintain their specialties in the body.

“We actively and directly induced one cell type to become a completely different cell type,” said Marius Wernig, MD, assistant professor of pathology and a member of Stanford’s Institute for Stem Cell Biology and Regenerative Medicine. “These are fully functional neurons. They can do all the principal things that neurons in the brain do.” That includes making connections with and signaling to other nerve cells — critical functions if the cells are eventually to be used as therapy for Parkinson’s disease or other disorders.

Wernig is the senior author of the research, published online Jan. 27 in Nature. Graduate student Thomas Vierbuchen is the lead author.

Although previous research has suggested that it’s possible to coax specialized cells to exhibit some properties of other cell types, this is the first time that skin cells have been converted into fully functional neurons in a laboratory dish. The change happened within a week and with an efficiency of up to nearly 20 percent. The researchers are now working to duplicate the feat with human cells.

“This study is a huge leap forward,” said Irving Weissman, MD, director of Stanford’s Institute for Stem Cell Biology and Regenerative Medicine. “The direct reprogramming of these adult skin cells into brain cells that can show complex, appropriate behaviors like generating electrical currents and forming synapses establishes a new method to study normal and disordered brain cell function. Finally we may be able to capture and study conditions like Parkinson’s or Alzheimer’s or heritable mental diseases in the laboratory dish for the first time.”

Until recently, it’s been thought that cellular specialization, or differentiation, was a one-way path: pluripotent embryonic stem cells give rise to all the cell types in the body, but as the daughter cells become more specialized, they also become more biologically isolated. Like a tree trunk splitting first into branches and then into individual leaves, the cells were believed to be consigned to one developmental fate by physical modifications — called epigenetic changes — added to their DNA along the way. A skin cell could no more become a nerve cell than a single leaf could flit from branch to branch or Superman could become Clark Kent in midair.

That view began to change when Dolly the sheep was cloned from an adult cell in 1997, showing that, under certain conditions, a specialized cell could shed these restrictions and act like an embryonic stem cell.

And in 2007, researchers announced the creation of induced pluripotent stem cells, or iPS cells, from human skin cells by infecting them with viruses carrying genes for four stem-cell-associated proteins called transcription factors. Once the cells had achieved a pluripotent state, the researchers coaxed them to develop into a new cell type. The process was often described in concept as moving the skin cells backward along the differentiation pathway (in the leaves analogy, reversing down the branch to the tree’s trunk) and then guiding them forward again along a different branch into a new lineage.

Finally, in 2008, Doug Melton, PhD, a co-director of Harvard’s Stem Cell Institute, showed it was possible in adult mice to reprogram one type of cell in the pancreas to become another pancreatic cell type by infecting them with a pool of viruses expressing just three transcription factors.

As a result, Wernig, who as a postdoctoral fellow in Rudolf Jaenisch’s laboratory at the Whitehead Institute in Massachusetts participated in the initial development of iPS cells, began to wonder whether the pluripotent pit stop was truly necessary. Thomas Südhof, the Avram Goldstein Professor in the Stanford School of Medicine, also collaborated on the research.

To test the theory, Wernig, Vierbuchen and graduate student Austin Ostermeier amassed a panel of 19 genes involved in either epigenetic reprogramming or neural development and function. They used a virus called a lentivirus to infect skin cells from embryonic mice with the genes, and then monitored the cells’ response. After 32 days they saw that some of the former skin cells now looked like neural cells and expressed neural proteins.

The researchers, who included postdoctoral scholar Zhiping Pang, PhD, used a mix-and-match approach to winnow the original pool of 19 genes down to just three. They also tested the procedure on skin cells from the tails of adult mice. They found that about 20 percent of the former skin cells transformed into neural cells in less than a week. That may not, at first, sound like a quick change, but it is a vast improvement over iPS cells, which can take weeks. What’s more, the iPS process is very inefficient: usually only about 1 to 2 percent of the original cells become pluripotent.

In Wernig’s experiments, the cells not only looked like neurons, they also expressed neural proteins and even formed functional synapses with other neurons in a laboratory dish.

“We were very surprised by both the timing and the efficiency,” said Wernig. “This is much more straightforward than going through iPS cells, and it’s likely to be a very viable alternative.” Quickly making neurons from a specific patient may allow researchers to study particular disease processes such as Parkinson’s in a laboratory dish, or one day to even manufacture cells for therapy.

The research suggests that the pluripotent stage, rather than being a required touchstone for identity-shifting cells, may simply be another possible cellular state. Wernig speculates that finding the right combination of cell-fate-specific genes may trigger a domino effect in the recipient cell, wiping away restrictive DNA modifications and imprinting a new developmental fate on the genomic landscape.

“It may be hard to prove,” said Wernig, “but I no longer think that the induction of iPS cells is a reversal of development. It’s probably more of a direct conversion like what we’re seeing here, from one cell type to another that just happens to be more embryonic-like. This tips our ideas about epigenetic regulation upside down.”

(Photo: Tommy Vierbuchen)

Stanford University

KU RADAR SYSTEM PROVIDES 3D IMAGE OF EARTH THROUGH MILES OF ICE

In the cover article of the latest issue of the Journal of Glaciology, engineers at the University of Kansas detail a special radar array they developed that is capable of depicting a 3D view of bedrock hidden beneath ice sheets three kilometers thick.

Working at the National Science Foundation Center for Remote Sensing of Ice Sheets at KU, the researchers, led by then-doctoral student John Paden, designed a Synthetic Aperture Radar system that provides a fine-resolution image of the bed over a wide geographic region, as well as the thickness of the ice. The topography of the earth beneath glaciers and ice sheets has long been sought after because it is essential for developing computer models that can better predict the role of ice sheets in global climate change and sea level rise.

The radar system is a game-changing development for researchers in global climate change. Previously, glaciologists could determine the thickness of the ice and bed conditions only along a single line from one radar pass, or at a single point where ice core samples had been drilled.

To get these innovative 3D landscapes, KU engineers constructed a special sled carrying several radar transmitters and receivers. The sled was hauled over Summit Camp, a year-round science station that sits on top of the ice sheet in Greenland, following a precise grid of parallel lines 500 meters apart connected by perpendicular lines. The radar used both left-looking and right-looking beams at frequencies similar to those used to broadcast television signals, and the researchers applied advanced signal-processing techniques to determine the directions from which the echoes returned. The team was able to collect data through ice as much as three kilometers thick and map the bed from multiple vantage points.

Because the same spot on the ice bed is imaged by radar from several tracks, the elevation is independently measured multiple times, Paden wrote in his paper. Paden received his doctorate in electrical engineering from KU in 2007. He’s now employed as a software development engineer with Microsoft’s Vexcel Corporation.

“While the 2D representations provide a consistent medium for comparison of point differences, a 3D representation provides better visualization and interpretation of surface features,” Paden wrote. To confirm accuracy, researchers compared their result with the length of the 3,027-meter-long Greenland Ice Core Project ice borehole and found their data to be within 10 meters at that site. The radar system is considered very accurate with an error of 0.3 percent in the index of refraction, well within acceptable limits for ice-penetrating radar.
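
The 10-meter agreement is roughly what the quoted index-of-refraction uncertainty implies (a back-of-envelope check, not a calculation from the paper). Depth is recovered from the two-way travel time t as

\[
d = \frac{c\,t}{2\,n_{\mathrm{ice}}}, \qquad n_{\mathrm{ice}} \approx 1.78,
\]

so a 0.3 percent error in the index of refraction maps into a roughly 0.3 percent error in depth, or about 0.003 × 3,027 m ≈ 9 m at the Greenland Ice Core Project site.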

The Intergovernmental Panel on Climate Change has estimated that sea level could rise by 18 to 59 centimeters over the next century. The success of this radar system will help researchers create more accurate ice-sheet models that predict sea level rise, Paden wrote. The new radar will also help glaciologists identify ideal locations for future ice core sampling.

University of Kansas
