Tuesday, August 25, 2009

NANOELECTRONIC TRANSISTOR COMBINED WITH BIOLOGICAL MACHINE COULD LEAD TO BETTER ELECTRONICS

If manmade devices could be combined with biological machines, laptops and other electronic devices could get a boost in operating efficiency.

Lawrence Livermore National Laboratory researchers have devised a versatile hybrid platform that uses lipid-coated nanowires to build prototype bionanoelectronic devices.

Mingling biological components in electronic circuits could enhance biosensing and diagnostic tools, advance neural prosthetics such as cochlear implants, and could even increase the efficiency of future computers.

While modern communication devices rely on electric fields and currents to carry the flow of information, biological systems are much more complex. They use an arsenal of membrane receptors, channels and pumps to control signal transduction that is unmatched by even the most powerful computers. For example, conversion of sound waves into nerve impulses is a very complicated process, yet the human ear has no trouble performing it.

“Electronic circuits that use these complex biological components could become much more efficient,” said Aleksandr Noy, the LLNL lead scientist on the project.

While earlier research has attempted to integrate biological systems with microelectronics, none has reached the point of seamless, material-level incorporation.

“But with the creation of even smaller nanomaterials that are comparable to the size of biological molecules, we can integrate the systems at an even more localized level,” Noy said.

To create the bionanoelectronic platform the LLNL team turned to lipid membranes, which are ubiquitous in biological cells. These membranes form a stable, self-healing, and virtually impenetrable barrier to ions and small molecules.

“That's not to mention that these lipid membranes also can house an unlimited number of protein machines that perform a large number of critical recognition, transport and signal transduction functions in the cell,” said Nipun Misra, a UC Berkeley graduate student and a co-author on the paper.

Julio Martinez, a UC Davis graduate student and another co-author added: “Besides some preliminary work, using lipid membranes in nanoelectronic devices remains virtually untapped.”

The researchers incorporated lipid bilayer membranes into silicon nanowire transistors by covering the nanowire with a continuous lipid bilayer shell that forms a barrier between the nanowire surface and solution species.

“This 'shielded wire' configuration allows us to use membrane pores as the only pathway for the ions to reach the nanowire,” Noy said. “This is how we can use the nanowire device to monitor specific transport and also to control the membrane protein.”

The team showed that by changing the gate voltage of the device, they can open and close the membrane pore electronically.

(Photo: Scott Dougherty, LLNL)

Lawrence Livermore National Laboratory

CLIMATE MODELS CONFIRM MORE MOISTURE IN ATMOSPHERE ATTRIBUTED TO HUMANS

When it comes to using climate models to assess the causes of the increased amount of moisture in the atmosphere, it doesn't much matter if one model is better than the other. They all come to the same conclusion: Humans are warming the planet, and this warming is increasing the amount of water vapor in the atmosphere.

In new research appearing in the Aug. 10 online issue of the Proceedings of the U.S. National Academy of Sciences, Lawrence Livermore National Laboratory scientists and a group of international researchers found that model quality does not affect the ability to identify human effects on atmospheric water vapor.

“Climate model quality didn't make much of a difference,” said Benjamin Santer, lead author from LLNL's Program for Climate Model Diagnosis and Intercomparison. “Even with the computer models that performed relatively poorly, we could still identify a human effect on climate. It was a bit surprising. The physics that drive changes in water vapor are very simple and are reasonably well portrayed in all climate models, bad or good.”

The atmosphere's water vapor content has increased by about 0.4 kilograms per square meter (kg/m2) per decade since 1988, and natural variability alone can't explain this moisture change, according to Santer. “The most plausible explanation is that it's due to human-caused increases in greenhouse gases,” he said.
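
As a rough back-of-the-envelope check (not from the paper itself), the quoted trend implies a modest but steady build-up of moisture. The roughly 25 kilograms per square meter used below as a typical global-mean column value is an assumed textbook figure, not a number from this study:

# Rough arithmetic based on the trend quoted above; the ~25 kg/m^2
# baseline for global-mean column water vapor is an assumed typical
# value, not a figure from the study.
trend = 0.4                      # kg/m^2 per decade (from the article)
decades = (2009 - 1988) / 10.0   # roughly two decades since 1988
increase = trend * decades
baseline = 25.0                  # kg/m^2, assumed typical column value
print(f"Cumulative increase since 1988: {increase:.1f} kg/m^2")
print(f"Relative change vs. a ~{baseline:.0f} kg/m^2 column: {100 * increase / baseline:.1f}%")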

More water vapor - which is itself a greenhouse gas - amplifies the warming effect of increased atmospheric levels of carbon dioxide.

Previous LLNL research had shown that human-induced warming of the planet has a pronounced effect on the atmosphere's total moisture content. In that study, the researchers had used 22 different computer models to identify a human “fingerprint” pattern in satellite measurements of water vapor changes. Each model contributed equally in the fingerprint analysis. “It was a true model democracy,” Santer said. “One model, one vote.”

But in the recent study, the scientists first took each model and tested it individually, calculating 70 different measures of model performance. These “metrics” provided insights into how well the models simulated today's average climate and its seasonal changes, as well as into the size and geographical patterns of climate variability.

This information was used to divide the original 22 models into various sets of “top ten” and “bottom ten” models. “When we tried to come up with a David Letterman type 'top ten' list of models,” Santer said, “we found that it's extremely difficult to do this in practice, because each model has its own individual strengths and weaknesses.”

Then the group repeated their fingerprint analysis, but now using only “top ten” or “bottom ten” models rather than the full 22 models. They did this more than 100 times, grading and ranking the models in many different ways. In every case, a water vapor fingerprint arising from human influences could be clearly identified in the satellite data.
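
A minimal sketch (in Python) of the ranking-and-splitting step described above, using made-up skill scores. The number of models and metrics match the article, but the scores, the normalization and the equal weighting are illustrative assumptions rather than the paper's actual procedure:

import numpy as np
rng = np.random.default_rng(0)
# Synthetic example: 22 models scored on 70 performance metrics, where a
# lower score means a closer match to observations. Real values would come
# from comparing each model with observational data.
n_models, n_metrics = 22, 70
scores = rng.random((n_models, n_metrics))
# Normalize each metric across models so no single metric dominates, then
# average to get one overall rank per model (equal weighting assumed).
normalized = (scores - scores.mean(axis=0)) / scores.std(axis=0)
overall = normalized.mean(axis=1)
order = np.argsort(overall)            # best (lowest) first
top_ten, bottom_ten = order[:10], order[-10:]
print("Top-ten model indices:   ", top_ten)
print("Bottom-ten model indices:", bottom_ten)
# The fingerprint detection step would then be repeated separately on the
# top-ten and bottom-ten subsets, and for many alternative rankings.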

“One criticism of our first study was that we were only able to find a human fingerprint because we included inferior models in our analysis,” said Karl Taylor, another LLNL co-author. “We've now shown that whether we use the best or the worst models, they don't have much impact on our ability to identify a human effect on water vapor.”

This new study links LLNL's “fingerprint” research with its long-standing work in assessing climate model quality. It tackles the general question of how to make best use of the information from a large collection of models, which often perform very differently in reproducing key aspects of present-day climate. This question is not only relevant for “fingerprint” studies of the causes of recent climate change. It is also important because different climate models show different levels of future warming. Scientists and policymakers are now asking whether we should use model quality information to weight these different model projections of future climate change.

“The issue of how we are going to deal with models of very different quality will probably become much more important in the next few years, when we look at the wide range of models that are going to be used in the Fifth Assessment Report of the Intergovernmental Panel on Climate Change,” Santer said.

(Photo: ECMWF)

Lawrence Livermore National Laboratory

EARLY HUMAN HUNTERS HAD FEWER MEAT-SHARING RITUALS

A University of Arizona anthropologist has discovered that humans living at a Paleolithic cave site in central Israel between 400,000 and 250,000 years ago were as successful at big-game hunting as were later stone-age hunters at the site, but that the earlier humans shared meat differently.

"The Lower Paleolithic (earlier) hunters were skilled hunters of large game animals, as were Upper Paleolithic (later) humans at this site," UA anthropology professor Mary C. Stiner said.

"This might not seem like a big deal to the uninitiated, but there's a lot of speculation as to whether people of the late Lower Paleolithic were able to hunt at all, or whether they were reduced to just scavenging," Stiner said. "Evidence from Qesem Cave says that just like later Paleolithic humans, the earlier Paleolithic humans focused on harvesting large game. They were really at the top of the food chain."

The Qesem Cave people hunted cooperatively, then carried the highest quality body parts of their prey to the cave, where they cut the meat with stone blade cutting tools and cooked it with fire.

"Qesem" means "surprise." The cave was discovered in hilly limestone terrain about seven miles east of Tel-Aviv not quite nine years ago, during road construction. Stiner was invited by Ran Barkai and Avi Gopher of Tel Aviv University's Institute of Archaeology to participate in the Qesem Cave Project.

Stiner analyzed the pattern of cut marks on bones of deer, aurochs, horse and other big game left at Qesem Cave by hunters of 400,000 to 200,000 years ago. Her novel approach was to analyze the cut marks to understand meat-sharing behaviors between the earlier and later cooperative hunting societies.

And the patterns revealed a striking difference in meat-sharing behaviors: The earlier hunters were less efficient, less organized and less specialized when it came to carving flesh from their prey.

"This is somewhat expected, since the tools they made took considerable skill and locomotor precision to produce," Stiner said.

The random placement and higher numbers of cut marks made by the earlier hunters show that they attached little social ritual and few formal rules to sharing meat, Stiner said. Many hands, including unskilled hands, cut meat off the bone during feeding.

By contrast, by later times, by the Middle and Upper Paleolithic, "It's quite clear that meat distribution flowed through the hands of certain butchers," Stiner said. "The tool marks made on bones by the more recent hunters are very regular, very efficient and show much less variation in the postures of the individuals cutting meat from any one bone. Only certain hunters or other fairly skilled individuals cut meat that was to be shared among the group."

Stiner stresses that her new findings need to be more broadly replicated before the implications of her research can be widely accepted.

Meat is one of the highest quality foods that humans may eat, and it is among the most difficult resources to harvest from the environment.

Archaeologists know that the roots of carnivory stretch deep into the past. But the details of carnivory and meat sharing have been sketchy. And they are important details, because they reflect the evolutionary development in human economic and social behaviors.

"It's interesting that these earlier people were skilled predators and very social, but that their social rules are more basic, less derived than those of the Middle Paleolithic.

"What might surprise most archaeologists is that I'm seeing a big difference between Lower and Middle Paleolithic social behaviors, not between Middle and Upper Paleolithic social behaviors.

"Neanderthals lived in the Middle Paleolithic, and they were a lot more like us in their more formal redistributions of meat than were the earlier hominids."

(Photo: Qesem Cave Project, Tel Aviv University)

University of Arizona

VARIABILITY OF TYPE 1A SUPERNOVAE HAS IMPLICATIONS FOR DARK ENERGY STUDIES

The stellar explosions known as type 1a supernovae have long been used as "standard candles", their uniform brightness giving astronomers a way to measure cosmic distances and the expansion of the universe. But a new study published in Nature reveals sources of variability in type 1a supernovae that will have to be taken into account if astronomers are to use them for more precise measurements in the future.

The discovery of dark energy, a mysterious force that is accelerating the expansion of the universe, was based on observations of type 1a supernovae. But in order to probe the nature of dark energy and determine if it is constant or variable over time, scientists will have to measure cosmic distances with much greater precision than they have in the past.

"As we begin the next generation of cosmology experiments, we will want to use type 1a supernovae as very sensitive measures of distance," said lead author Daniel Kasen, a Hubble postdoctoral fellow at the University of California, Santa Cruz. "We know they are not all the same brightness, and we have ways of correcting for that, but we need to know if there are systematic differences that would bias the distance measurements. So this study explored what causes those differences in brightness."

Kasen and his coauthors--Fritz Röpke of the Max Planck Institute for Astrophysics in Garching, Germany, and Stan Woosley, professor of astronomy and astrophysics at UC Santa Cruz--used supercomputers to run dozens of simulations of type 1a supernovae. The results indicate that much of the diversity observed in these supernovae is due to the chaotic nature of the processes involved and the resulting asymmetry of the explosions.

For the most part, this variability would not produce systematic errors in measurement studies as long as researchers use large numbers of observations and apply the standard corrections, Kasen said. The study did find a small but potentially worrisome effect that could result from systematic differences in the chemical compositions of stars at different times in the history of the universe. But researchers can use the computer models to further characterize this effect and develop corrections for it.

"Since we are beginning to understand how type 1a supernovae work from first principles, these models can be used to refine our distance estimates and make measurements of the expansion rate of the universe more precise," Woosley said.

A type 1a supernova occurs when a white dwarf star acquires additional mass by siphoning matter away from a companion star. When it reaches a critical mass--1.4 times the mass of the Sun, packed into an object the size of the Earth--the heat and pressure in the center of the star spark a runaway nuclear fusion reaction, and the white dwarf explodes. Since the initial conditions are about the same in all cases, these supernovae tend to have the same luminosity, and their "light curves" (how the luminosity changes over time) are predictable.
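
For a sense of the scale involved, packing 1.4 times the mass of the Sun into an object the size of the Earth implies a staggering density. A quick check using standard values for the solar mass and Earth's radius (neither number comes from the article itself):

import math
M_SUN = 1.989e30      # kg (standard value, not from the article)
R_EARTH = 6.371e6     # m  (standard value, not from the article)
mass = 1.4 * M_SUN                            # critical white dwarf mass
volume = 4.0 / 3.0 * math.pi * R_EARTH ** 3   # roughly Earth-sized sphere
density = mass / volume
print(f"Mean density ~ {density:.1e} kg/m^3")             # a few million times water
print(f"A teaspoon (~5 cm^3) would weigh ~ {density * 5e-6 / 1000:.0f} tonnes")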

Some are intrinsically brighter than others, but these flare and fade more slowly, and this correlation between the brightness and the width of the light curve allows astronomers to apply a correction to standardize their observations. So astronomers can measure the light curve of a type 1a supernova, calculate its intrinsic brightness, and then determine how far away it is, since the apparent brightness diminishes with distance (just as a candle appears dimmer at a distance than it does up close).
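
The last step in that chain, going from a corrected intrinsic brightness and a measured apparent brightness to a distance, is just the inverse-square law written in astronomers' magnitude units. A minimal sketch, with illustrative magnitudes rather than measurements from the study:

import math
def distance_parsecs(m_apparent, M_intrinsic):
    # Standard distance-modulus relation: m - M = 5*log10(d_pc) - 5,
    # i.e. the inverse-square law in magnitude units (cosmological
    # corrections ignored for simplicity).
    return 10 ** ((m_apparent - M_intrinsic + 5) / 5)
# Illustrative numbers only: a corrected peak magnitude of about -19.3 is
# typical for type 1a supernovae; the observed magnitude of 22 is made up.
M, m = -19.3, 22.0
d_pc = distance_parsecs(m, M)
print(f"Distance ~ {d_pc:.3g} parsecs (~{d_pc * 3.26e-9:.1f} billion light-years)")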

The computer models used to simulate these supernovae in the new study are based on current theoretical understanding of how and where the ignition process begins inside the white dwarf and where it makes the transition from slow-burning combustion to explosive detonation.

"Since ignition does not occur in the dead center, and since detonation occurs first at some point near the surface of the exploding white dwarf, the resulting explosions are not spherically symmetric," Woosley explained. "This could only be studied properly using multi-dimensional calculations."

Most previous studies have used one-dimensional models in which the simulated explosion is spherically symmetric. Multi-dimensional simulations require much more computing power, so Kasen's group ran most of their simulations on the powerful Jaguar supercomputer at Oak Ridge National Laboratory, and also used supercomputers at the National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory. The results of two-dimensional models are reported in the Nature paper, and three-dimensional studies are currently under way.

The simulations showed that the asymmetry of the explosions is a key factor determining the brightness of type 1a supernovae. "The reason these supernovae are not all the same brightness is closely tied to this breaking of spherical symmetry," Kasen said.

The dominant source of variability is the synthesis of new elements during the explosions, which is sensitive to differences in the geometry of the first sparks that ignite a thermonuclear runaway in the simmering core of the white dwarf. Nickel-56 is especially important, because the radioactive decay of this unstable isotope creates the afterglow that astronomers are able to observe for months or even years after the explosion.

"The decay of nickel-56 is what powers the light curve. The explosion is over in a matter of seconds, so what we see is the result of how the nickel heats the debris and how the debris radiates light," Kasen said.

Kasen developed the computer code to simulate this radiative transfer process, using output from the simulated explosions to produce visualizations that can be compared directly to astronomical observations of supernovae.

The good news is that the variability seen in the computer models agrees with observations of type 1a supernovae. "Most importantly, the width and peak luminosity of the light curve are correlated in a way that agrees with what observers have found. So the models are consistent with the observations on which the discovery of dark energy was based," Woosley said.

Another source of variability is that these asymmetric explosions look different when viewed at different angles. This can account for differences in brightness of as much as 20 percent, Kasen said, but the effect is random and creates scatter in the measurements that can be statistically reduced by observing large numbers of supernovae.
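
The statistical point is the familiar one: random scatter in individual measurements averages down roughly as one over the square root of the sample size. A tiny simulation using the 20 percent figure quoted above (everything else is made up):

import numpy as np
rng = np.random.default_rng(1)
scatter = 0.20                      # ~20% viewing-angle scatter, as quoted above
for n in (1, 25, 100, 400):
    # Average n supernovae at a time, many times over, and see how much
    # scatter remains in the averages.
    means = rng.normal(1.0, scatter, size=(10_000, n)).mean(axis=1)
    print(f"N = {n:4d} supernovae -> residual scatter ~ {means.std():.3f} "
          f"(1/sqrt(N) prediction: {scatter / np.sqrt(n):.3f})")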

The potential for systematic bias comes primarily from variation in the initial chemical composition of the white dwarf star. Heavier elements are synthesized during supernova explosions, and debris from those explosions is incorporated into new stars. As a result, stars formed recently are likely to contain more heavy elements (higher "metallicity," in astronomers' terminology) than stars formed in the distant past.

"That's the kind of thing we expect to evolve over time, so if you look at distant stars corresponding to much earlier times in the history of the universe, they would tend to have lower metallicity," Kasen said. "When we calculated the effect of this in our models, we found that the resulting errors in distance measurements would be on the order of 2 percent or less."

Further studies using computer simulations will enable researchers to characterize the effects of such variations in more detail and limit their impact on future dark-energy experiments, which might require a level of precision that would make errors of 2 percent unacceptable.

(Photo: F. Ropke)

University of California, Santa Cruz

HURRICANE SEASONS ARE MORE ACTIVE

For many Americans who live on the Atlantic coast, Andrew, Ivan and Katrina are more than just names--they are reminders of the devastating impact of cyclonic activity in the region during hurricane season. If it seems like hurricane seasons have been more active in recent years, you're on to something. According to a paper published in the August 13 issue of Nature, the frequency and strength of these powerful storms have grown in recent decades.

"We are at levels now that are about as high as anything we have seen in the past 1,000 years," said Michael Mann, director of the Earth System Science Center at Pennsylvania State University and the lead author of the paper. Mann and his collaborators, Jeffrey P. Donnelly of the Woods Hole Oceanographic Institution, Jonathan D. Woodruff of the University of Massachusetts and Zhihua Zhang of Pennsylvania State University examined sediment samples from across the North Atlantic coast and statistical models of historic hurricane activities.

Their analysis allowed them to measure the severity of hurricane seasons over the past 1,500 years. The sediment samples match up relatively well with the computer models, both of which show a period of high activity around 1,000 AD, followed by a lull in activity. This medieval peak rivals and possibly exceeds the level of activity seen in recent decades.

The study also adds validity to the theory that two factors fuel higher hurricane activity, namely the La Niña effect and high surface temperatures over the ocean. If climate change continues to warm ocean waters, Mann said, it could lead to more active hurricane seasons. This hurricane season, which has yet to see a named storm, is lighter than usual, Mann said, because of the El Niño effect, which is believed to have the opposite effect of La Niña patterns.

(Photo: Jon Woodruff)

National Science Foundation

CAMERA FLASH TURNS AN INSULATING MATERIAL INTO A CONDUCTOR

An insulator can now be transformed into an electrical conductor with an ordinary camera flash. A Northwestern University professor and his students have found a new way of turning graphite oxide -- a low-cost insulator made by oxidizing graphite powder -- into graphene, a hotly studied material that conducts electricity. Scientists believe graphene could be used to produce low-cost, carbon-based transparent and flexible electronics.

Previous processes to reduce graphite oxide relied on toxic chemicals or high-temperature treatment. The idea for a simple new process came in a burst of inspiration: Can a camera flash instantly heat up the graphite oxide and turn it into graphene?

The process, invented by Jiaxing Huang, assistant professor of materials science and engineering at Northwestern's McCormick School of Engineering and Applied Science, together with graduate student Laura J. Cote and postdoctoral fellow Rodolfo Cruz-Silva, is described in the Aug. 12 issue of the Journal of the American Chemical Society.

Materials scientists previously have used high-temperature heating or chemical reduction to produce graphene from graphite oxide. But these techniques could be problematic when graphite oxide is mixed with something else, such as a polymer, because the polymer component may not survive the high-temperature treatment or could block the reducing chemical from reacting with graphite oxide.

In Huang's flash reduction process, researchers simply hold a consumer camera flash over the graphite oxide and, a flash later, the material is now a piece of fluffy graphene.

"The light pulse offers very efficient heating through the photothermal process, which is rapid, energy efficient and chemical-free," he says.

When a light pulse is used, photothermal heating not only reduces the graphite oxide but also fuses the insulating polymer with the graphene sheets, resulting in a welded, conducting composite.

Using patterns printed on a simple overhead transparency film as a photo-mask, flash reduction creates patterned graphene films. This process creates electronically conducting patterns on the insulating graphite oxide film -- essentially a flexible circuit.

The research group hopes to next create smaller circuits on a single graphite-oxide sheet at the single-atom layer level. (The current process has been performed only on thicker films.)

"If we can make a nano circuit on a single piece of graphite oxide," Huang says, "it will hold great promise for patterning electronic devices."

Northwestern University

YTTERBIUM GAINS GROUND IN QUEST FOR NEXT-GENERATION ATOMIC CLOCKS

An experimental atomic clock based on ytterbium atoms is about four times more accurate than it was several years ago, giving it a precision comparable to that of the NIST-F1 cesium fountain clock, the nation's civilian time standard, scientists at the National Institute of Standards and Technology (NIST) report in Physical Review Letters.

NIST scientists evaluated the clock by measuring the natural frequency of ytterbium, carefully accounting for all possible deviations such as those caused by collisions between the atoms, and by using NIST-F1 as a "ruler" for comparison. The results were good enough to indicate that the ytterbium clock is competitive in some respects with NIST-F1, which has been improving steadily and now keeps time to within 1 second in about 100 million years. (Because the international definition of the second is based on the cesium atom, technically no clock can be more accurate than cesium standards such as NIST-F1.) More importantly, the improved ytterbium clock gives the time standards community more options in the ongoing development and comparisons of next-generation clocks, says NIST physicist Chris Oates, an author of the new paper.

The NIST ytterbium clock is based on about 30,000 heavy metal atoms that are cooled to 15 microkelvins (close to absolute zero) and trapped in a column of several hundred pancake-shaped wells—an "optical lattice"—made of laser light. A laser that "ticks" 518 trillion times per second induces a transition between two energy levels in the atoms. The clock's enhanced performance was made possible by improvements in the apparatus and a switch to a different form of ytterbium whose nucleus is slightly magnetic due to its "spin one-half" angular momentum. This atom is less susceptible to key errors than the "spin-zero" form of ytterbium used previously.

NIST scientists are developing five versions of next-generation atomic clocks, each using a different atom and offering different advantages. The experimental clocks all operate at optical (visible light) frequencies, which are higher than the microwave frequencies used in NIST-F1, and thus can divide time into smaller units, thereby yielding more stable clocks. Additionally, optical clocks could one day lead to time standards up to 100 times more accurate than today's microwave clocks.
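
Two of the numbers in this story can be checked with simple arithmetic: the fractional accuracy implied by "1 second in about 100 million years," and how much finer the ytterbium clock's optical tick is than cesium's microwave tick. The cesium frequency of 9,192,631,770 Hz is the SI definition of the second; the 518 trillion figure is from the article:

# Fractional accuracy implied by gaining or losing at most 1 second in
# about 100 million years.
seconds_per_year = 365.25 * 24 * 3600
print(f"NIST-F1 fractional accuracy ~ {1.0 / (100e6 * seconds_per_year):.1e}")
# How much finer the optical tick is than cesium's microwave tick, and the
# laser wavelength corresponding to 518 trillion ticks per second.
f_optical = 518e12                  # Hz, from the article
f_cesium = 9_192_631_770            # Hz, the SI definition of the second
c = 299_792_458                     # m/s
print(f"Optical/microwave frequency ratio ~ {f_optical / f_cesium:.0f}")
print(f"Laser wavelength ~ {c / f_optical * 1e9:.0f} nm")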

The best optical clocks are currently based on single ions (electrically charged atoms), such as the NIST "logic clock" using an aluminum ion (see "NIST 'Quantum Logic Clock' Rivals Mercury Ion as World's Most Accurate Clock"). But lattice clocks have the potential for higher stability because they simultaneously average signals from tens of thousands of atoms. Ongoing comparisons of the ytterbium clock with the strontium lattice clock located nearby at JILA, a joint institute of NIST and the University of Colorado at Boulder (see "Collaboration Helps Make JILA Strontium Atomic Clock 'Best in Class'"), should help enable worldwide tests of optical clock performance with extremely high precision. At this point it is far from clear which atom and clock design will be selected by research groups around the world as a future time and frequency standard.

Advances in atomic clock performance support development of technologies such as high data rate telecommunications and the Global Positioning System (GPS). Optical clocks are already providing record measurements of possible changes in the fundamental "constants" of nature, a line of inquiry that has huge implications for cosmology and tests of the laws of physics, such as Einstein's theories of special and general relativity. Next-generation clocks might lead to new types of gravity sensors for exploring underground natural resources and fundamental studies of the Earth. Other possible applications may include ultra-precise autonomous navigation, such as landing planes by GPS.

(Photo: Barber, NIST)

National Institute of Standards and Technology

PARASITE CAUSES ZOMBIE ANTS TO DIE IN AN IDEAL SPOT

A study in the September issue of The American Naturalist describes new details about a fungal parasite that coerces ants into dying in just the right spot—one that is ideal for the fungus to grow and reproduce. The study, led by David P. Hughes of Harvard University, shows just how precisely the fungus manipulates the behavior of its hapless hosts.

When a carpenter ant is infected by a fungus known as Ophiocordyceps unilateralis, the victim remains alive for a short time. The fungus, however, is firmly in the driver's seat. It compels the ant to climb from its nest high in the forest canopy down into small plants and saplings in the understory vegetation. The ant then climbs out onto the underside of a low-hanging leaf where it clamps down with its mandibles just before it dies. There it remains, stuck fast for weeks.

After the ant dies, the fungus continues to grow inside the body. After a few days, a stroma—the fungus's fruiting body—sprouts from the back of the ant's head. After a week or two, the stroma starts raining down spores to the forest floor below. Each spore has the potential to infect another unfortunate passerby.

Scientists have known for over one hundred years about this parasite's ghastly ability to turn unsuspecting ants into zombies. But Hughes and his colleagues chronicle the amazingly precise control the fungus has over its victim.

At a field site in a Thai forest, Hughes's team found that the infected carpenter ants are almost invariably found clamped onto the undersides of leaves that are 25 centimeters (about 10 inches) from the ground below. What's more, most of the dead ants were found on leaves sprouting from the northwest side of the plant. Interestingly, the researchers found that temperature, humidity and sunlight in these spots are apparently optimal for the fungus to grow and reproduce. When the researchers placed leaves with infected ants at higher locations, or on the forest floor, the parasite failed to develop properly.

"The fungus accurately manipulates the infected ants into dying where the parasite prefers to be, by making the ants travel a long way during the last hours of their lives," Hughes said.

But getting the ant to die in the right spot is only half the battle, as the researchers found when they dissected a few victims.

"The fungus has evolved a suite of novel strategies to retain possession of its precious resource," said Hughes.

As the fungus spreads within a dead ant's body, it converts the ant's innards into sugars which are used to help the fungus grow. But it leaves the muscles controlling the mandibles intact to make sure the ant keeps its death grip on the leaf. The fungus also preserves the ant's outer shell, growing into cracks and crevices to reinforce weak spots. In doing this, the fungus fashions a protective coating that keeps microbes and other fungi out. At that point, it can safely get down to the business of claiming new victims.

Carpenter ants apparently have few defenses against the fungus. The most important way they avoid infection seems to be staying as far away from victims as possible. That may be part of the reason why these ants make their nests in the forest canopy, high above fungal breeding zones. Carpenter ants also seem to avoid blazing their foraging trails under infected areas. This too might be an adaptive strategy to avoid infection, but more study is needed to confirm it, Hughes says.

The mechanisms and cues the fungus uses to control an ant's behavior remain unknown. "That is another research area we are actively pursuing right now," Hughes says. Whatever the mechanisms, this much is clear: O. unilateralis has evolved highly specialized abilities to get unsuspecting ants to do its bidding.

Chicago Journals

NEW LASER TECHNIQUE MAY HELP FIND SUPERNOVA

Hafnium is a common metallic element used in nuclear reactors. However, one of its isotopes is hard to find since it is only made when a supernova explodes. This means that if the isotope, called 182Hf, were discovered on Earth, it would prove that a supernova once exploded near our solar system. This has caused physicists around the world to work hard to find the isotope.

Unfortunately, this particular isotope is difficult to distinguish from other atoms - only one in many billions of hafnium atoms is believed to be of the sought-after kind. Researcher Pontus Andersson from the Department of Physics at the University of Gothenburg and colleagues from the USA, Germany and Austria have developed a laser technique that can be used to reject irrelevant atoms and thereby isolate the unique 182Hf.

In technical terms, the new technique works with negative ions: atoms or molecules that carry one extra electron. The strength of the bond between that extra electron and the rest of the atom or molecule varies from substance to substance, and a laser can be used to detach the extra electron while registering the energy needed to do so.

This means that by choosing a certain wavelength of the laser light, they can detach the extra electron from some elements while ions of other elements remain intact. Consequently, if 182Hf exists on Earth, Andersson and his colleagues should be able to find it, simply by using laser light to remove a sufficient number of the other, more common, interfering atoms to allow detection of 182Hf by conventional methods.
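
A minimal sketch of that wavelength-selection logic, comparing the photon energy at a chosen laser wavelength with how tightly each negative ion binds its extra electron. The wavelength and binding energies below are hypothetical placeholders, not values measured for the actual interfering species:

# Photon energy E = h*c/wavelength, compared against how tightly each
# negative ion binds its extra electron. Ions bound more weakly than the
# photon energy can be neutralized and removed; more tightly bound ions
# survive. The wavelength and binding energies are hypothetical.
H, C, EV = 6.626e-34, 2.998e8, 1.602e-19    # Planck, speed of light, J per eV
def photon_energy_ev(wavelength_nm):
    return H * C / (wavelength_nm * 1e-9) / EV
binding_energies = {                        # eV, placeholder values only
    "interfering species X-": 0.3,
    "interfering species Y-": 0.6,
    "target species (kept)": 1.0,
}
wavelength_nm = 1600.0                      # illustrative laser wavelength
e_photon = photon_energy_ev(wavelength_nm)
print(f"Photon energy at {wavelength_nm:.0f} nm: {e_photon:.2f} eV")
for ion, ea in binding_energies.items():
    print(f"  {ion:24s} bound by {ea:.2f} eV -> "
          f"{'detached (removed)' if e_photon > ea else 'survives'}")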

The new technique is a product of advanced atomic physics experiments conducted together with Stockholm University, the VERA institute in Vienna, Austria, and Oak Ridge National Laboratory in the USA.

'Our goal is to develop a method that can be of aid when searching for very unusual isotopes. In many cases the standard methods used are hampered by other, interfering atoms. The technique is still in its infancy, but we have shown that our laser beam can remove 99.99 % of the interfering ions in a beam without destroying the ions we are looking for', says Andersson.

University of Gothenburg
