Thursday, October 14, 2010

QUANTUM SIGNALS CONVERTED TO TELECOMMUNICATIONS WAVELENGTHS


Using optically dense, ultra-cold clouds of rubidium atoms, researchers have made advances in three key elements needed for quantum information systems -- including a technique for converting photons carrying quantum data to wavelengths that can be transmitted long distances on optical fiber telecom networks.

The developments move quantum information networks -- which securely encode information by entangling photons and atoms -- closer to a possible prototype system.

Researchers at the Georgia Institute of Technology reported the findings Sept. 26 in the journal Nature Physics, and in a manuscript submitted for publication in the journal Physical Review Letters. The research was sponsored by the Air Force Office of Scientific Research, the Office of Naval Research and the National Science Foundation.

The advances include:

• Development of an efficient, low-noise system for converting photons carrying quantum information at infrared wavelengths to longer wavelengths suitable for transmission on conventional telecommunications systems. The researchers have demonstrated that the system, believed to be the first of its kind, maintains the entangled information during conversion to telecom wavelengths -- and back down to the original infrared wavelengths.

• A significant improvement in the length of time that a quantum repeater -- which would be necessary to transmit the information -- can maintain the information in memory. The Georgia Tech team reported memory lasting as long as 0.1 second, 30 times longer than previously reported for systems based on cold neutral atoms and approaching the quantum memory goal of at least one second -- long enough to transmit the information to the next node in the network. (A rough sense of this timescale is sketched just after this list.)

• An efficient, low-noise system able to convert photons of telecom wavelengths back to infrared wavelengths. Such a system would be necessary for detecting entangled photons transmitted by a quantum information system.
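
To get a feel for the 0.1-second memory time, here is a minimal back-of-the-envelope sketch; the speed of light in silica fiber, roughly 2 x 10^8 m/s, is a textbook value and an assumption here, not a figure from the papers:

    # How far can a photon travel through optical fiber while the
    # quantum memory holds its entangled partner?
    speed_in_fiber_m_per_s = 2.0e8   # ~ c / 1.5 for silica fiber (assumed)
    memory_time_s = 0.1              # lifetime reported by the Georgia Tech team

    reach_km = speed_in_fiber_m_per_s * memory_time_s / 1000.0
    print(f"Distance covered during storage: {reach_km:,.0f} km")  # ~20,000 km

The raw distance is generous; the reason the stated goal is a full second is that repeater protocols typically must hold the memory through many rounds of signaling between nodes before entanglement is successfully distributed.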

"This is the first system in which such a long memory time has been integrated with the ability to transmit at telecom wavelengths," said Brian Kennedy, a co-author of the Nature Physics paper and a professor in the Georgia Tech School of Physics. "We now have the crucial aspects needed for a quantum repeater."

The conversion technique addresses a long-standing issue facing quantum networks: the wavelengths most useful for creating quantum memory aren't the best for transmitting that information across optical telecommunications networks. Wavelengths of approximately 1.3 microns can be transmitted in optical fiber with the lowest absorption, but the ideal wavelength for storage is 795 nanometers.

The wavelength conversion takes place in a sophisticated system that uses a cloud of rubidium atoms packed closely together in gaseous form to maximize the likelihood of interaction with photons entering the sample. Two separate laser beams excite the rubidium atoms, which are held in a cigar-shaped magneto-optical trap about six millimeters long. The setup creates a four-wave mixing process that changes the wavelength of photons entering it.

"One photon of infrared light going in becomes one photon of telecom light going out," said Alex Kuzmich, an associate professor in the Georgia Tech School of Physics and another of the Nature Physics paper's co-authors. "To preserve the quantum entanglement, our conversion is done at very high efficiency and with low noise."

By changing the shape, size and density of the rubidium cloud, the researchers have been able to boost efficiency as high as 65 percent. "We learned that the efficiency of the system scales up rather quickly with the size of the trap and the number of atoms," Kuzmich said. "We spent a lot of time to make a really dense optical sample. That dramatically improved the efficiency and was a big factor in making this work."

The four-wave mixing process does not add noise to the signal, which allows the system to maintain the information encoded onto photons by the quantum memory. "There are multiple parameters that affect this process, and we had to work hard to find the optimal set," noted Alexander Radnaev, another co-author of the Nature Physics paper.

Once the photons are converted to telecom wavelengths, they move through optical fiber -- and loop back into the magneto-optical trap. They are then converted back to infrared wavelengths for testing to verify that the entanglement has been maintained. That second conversion turns the rubidium cloud into a photon detector that is both efficient and low in noise, Kuzmich said.

Quantum memory is created when laser light is directed into a cloud of rubidium atoms confined in an optical lattice. The energy excites the atoms, and the photons scattered from the atoms carry information about that excitation. In the new Georgia Tech system, these photons carrying quantum information are then fed into the wavelength conversion system.

The research team took two different approaches to extending the quantum memory lifetime, both of which sought to mix the two levels of atoms involved in encoding the quantum information. One approach, described in the Nature Physics paper, used an optical lattice and a two-photon process. The second approach, described in the Physical Review Letters submission, used a magnetic field approach pioneered by researchers at the National Institute of Standards and Technology.

The general purpose of quantum networking is to distribute entangled qubits -- pairs of quantum bits whose "0" and "1" states remain correlated even when far apart -- over long distances. The qubits would travel as photons across the optical fiber networks that make up the existing global telecommunications system.

Because of loss in the optical fiber that makes up these networks, repeaters must be installed at regular intervals to boost the signals. For carrying qubits, these repeaters will need quantum memory to receive the photonic signal, store it briefly, and then produce another signal that will carry the data to the next node, and on to its final destination.
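
A rough sketch of why both wavelength conversion and repeaters are unavoidable, using textbook attenuation values for silica fiber (roughly 3-4 dB/km near 800 nm versus about 0.35 dB/km near 1.3 microns; both figures are assumptions, not numbers from the papers):

    # Fraction of photons that survive a fiber span of given length.
    def survival_fraction(attenuation_db_per_km: float, length_km: float) -> float:
        return 10.0 ** (-attenuation_db_per_km * length_km / 10.0)

    for label, att_db in [("795 nm (memory wavelength)", 3.5),
                          ("1.3 um (telecom window)   ", 0.35)]:
        print(f"{label}: {survival_fraction(att_db, 100.0):.1e} survives 100 km")
    # At 795 nm essentially no photons survive 100 km (~1e-35), while at
    # 1.3 um roughly one in three thousand does -- still lossy enough that
    # repeaters are needed at regular intervals.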

"This is another significant step toward improving quantum information systems based on neutral atoms," Kuzmich said. "For quantum repeaters, most of the basic steps have now been made, but achieving the final benchmarks required for an operating system will require intensive optical engineering efforts."

(Photo: GIT)

Georgia Institute of Technology

POTENTIALLY HABITABLE PLANET DISCOVERED


Astronomers have found a new, potentially habitable Earth-sized planet. It is one of two new planets discovered around the star Gliese 581, some 20 light years away. The planet, Gliese 581g, is located in a “habitable zone”—a distance from the star where the planet receives just the right amount of stellar energy to maintain liquid water at or near the planet’s surface. The 11-year study, published in the Astrophysical Journal and posted online at arXiv.org, suggests that the fraction of stars in the Milky Way harboring potentially habitable planets could be greater than previously thought—as much as a few tens of percent.

The new study brings the total number of planets around Gliese 581 to six and, like our own solar system, they orbit their star in nearly circular orbits. The scientists, members of the Lick-Carnegie Exoplanet Survey, collected 11 years of radial velocity data on the star. The radial velocity method looks at a star’s tiny movements in response to the gravitational tug from orbiting bodies. The team tracked the motion of the planets to a precision of about 1.6 meters per second.
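
To see why such precision matters, one can estimate the reflex velocity a planet like Gliese 581g induces on its star. The sketch below uses the standard circular-orbit formula and assumes a stellar mass of about 0.31 solar masses for the red dwarf Gliese 581 (a literature value, not stated in this article):

    import math

    G = 6.674e-11                          # gravitational constant (SI)
    M_SUN, M_EARTH = 1.989e30, 5.972e24

    def reflex_velocity(m_planet_kg, m_star_kg, period_s):
        """Semi-amplitude of the star's wobble for a circular, edge-on
        orbit, valid when the planet is much lighter than the star."""
        return ((2.0 * math.pi * G / period_s) ** (1.0 / 3.0)
                * m_planet_kg / m_star_kg ** (2.0 / 3.0))

    k = reflex_velocity(3.1 * M_EARTH, 0.31 * M_SUN, 36.6 * 86400.0)
    print(f"Stellar reflex velocity: {k:.2f} m/s")  # ~1.3 m/s

The answer, about 1.3 m/s, is below the 1.6 m/s single-measurement precision; the planet's signature emerges only statistically, from many measurements accumulated over the 11-year campaign.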

The amplitude and phasing of the star’s subtle gravitational reactions allow researchers to determine a planet’s mass and orbital period. The planet’s radius is estimated by making assumptions about its composition, and its surface gravity is calculated from its mass and radius. Astronomers can also determine the planet’s equilibrium and surface temperatures, which help to assess the potential for habitability. Equilibrium temperature reflects the balance between the energy emitted from the planet and the thermal energy received from the star. The surface temperature is estimated from the planet’s distance from the star and a range of assumptions about the composition of its atmosphere. To be habitable, the planet must be neither so hot that water vaporizes nor so cold that it stays permanently frozen.
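
As a worked illustration of the equilibrium-temperature calculation, the sketch below balances absorbed starlight against thermal re-emission. The luminosity of Gliese 581 (about 1.3 percent of the Sun's) and the albedo values are assumptions chosen for illustration:

    import math

    SIGMA = 5.670e-8     # Stefan-Boltzmann constant (W m^-2 K^-4)
    L_SUN = 3.828e26     # solar luminosity (W)
    AU = 1.496e11        # astronomical unit (m)

    def equilibrium_temperature(l_star_w, orbit_m, albedo):
        """Temperature at which a planet re-radiates exactly the
        stellar energy it absorbs (no atmosphere assumed)."""
        absorbed_flux = l_star_w * (1.0 - albedo) / (16.0 * math.pi * orbit_m ** 2)
        return (absorbed_flux / SIGMA) ** 0.25

    for albedo in (0.0, 0.3, 0.5):
        t = equilibrium_temperature(0.013 * L_SUN, 0.146 * AU, albedo)
        print(f"albedo {albedo:.1f}: {t:4.0f} K ({t * 1.8 - 459.67:6.1f} F)")
    # Roughly 210-250 K across plausible albedos -- sub-freezing, in line
    # with the estimates quoted below; a real atmosphere would warm the
    # surface above this through greenhouse trapping.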

“Our calculations indicate that the planet is between 3.1 and 4.3 Earth masses, has a circular 36.6-day orbit, and a radius estimated between 1.2 and 1.5 Earth radii,” remarked co-author Paul Butler of Carnegie’s Department of Terrestrial Magnetism.

Its semi-major axis—half the longest diameter of its elliptical orbital path—is 0.146 astronomical units (one AU is the distance between the Earth and the Sun), and its surface gravity is similar to Earth’s at 1.1 to 1.7 g.
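
These numbers hang together through Kepler's third law and Newtonian surface gravity, as a quick check shows (the stellar mass of ~0.31 solar masses is again an assumed literature value, and the mass-radius pairing is one illustrative choice):

    # Kepler's third law in convenient units:
    # a[AU]^3 = M_star[M_sun] * P[yr]^2
    m_star_msun = 0.31
    period_yr = 36.6 / 365.25
    a_au = (m_star_msun * period_yr ** 2) ** (1.0 / 3.0)
    print(f"Semi-major axis: {a_au:.3f} AU")          # ~0.146 AU, as quoted

    # Surface gravity relative to Earth: g = (M / M_Earth) / (R / R_Earth)^2
    mass_me, radius_re = 3.1, 1.5                     # one illustrative pairing
    print(f"Surface gravity: {mass_me / radius_re ** 2:.1f} g")  # ~1.4 g

A 3.1-Earth-mass planet with a 1.5-Earth-radius body lands at about 1.4 g, inside the quoted 1.1 to 1.7 g range; other pairings within the stated mass and radius intervals shift the value up or down.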

Habitability depends on many factors, not just the temperature. The gravity has to be strong enough to hold an atmosphere, for instance, and the temperature must be lower than about 26° F somewhere on the planet. The researchers estimate that the surface temperature of the newly discovered planet is between -24° F and 10° F. The surface would be blazing hot on the side facing the star and freezing cold on the dark side. The planet might be tidally locked to the star—with one side always facing the star, and the other side always dark and cold. This serves to stabilize the planet’s surface climates, according to Steven Vogt, co-author and professor of astronomy and astrophysics at UC Santa Cruz. The most habitable zone on the planet’s surface would be along the line between shadow and light, with surface temperatures decreasing toward the dark side and increasing toward the light side.

Temperatures on Earth vary tremendously, and life can thrive in very extreme environments, ranging from Antarctica, where the temperature can get to -94° F, to extremely hot hydrothermal vents, which roil at 235° F.

The fact that the researchers were able to detect this planet so quickly and so nearby (in astronomical terms) suggests that habitable planets could be quite common.

(Photo: Lynette Cook)

Carnegie Institution

A LINK BETWEEN DEMENTIA, HIGH BLOOD PRESSURE AND BLOOD FLOW IN THE BRAIN?


Blood flow through the brain is essential for the delivery of nutrients such as glucose and oxygen that nerve cells need to function. During the early stages of Alzheimer's disease (AD), patients can suffer from high blood pressure, and blood flow through the brain is reduced: the greater the reduction, the worse the patients' dementia becomes.

A new study will look at the relationship between dementia and high blood pressure, and how blood flow is regulated in the brain. The findings may help researchers identify if some drugs already used for other human conditions may be useful for the treatment of diseases such as stroke and Alzheimer's disease (AD).

Academics at Bristol University's Dementia Research Group, based at Frenchay Hospital, have been awarded a grant of over £266,000 from the British Heart Foundation (BHF) to assess whether drugs that block a small naturally produced molecule called endothelin-1 can improve blood flow through the brain.

In animal models of AD a reduction in blood flow occurs well before the onset of Alzheimer-like damaging changes to brain tissue. The most potent cause of the narrowing of blood vessels is a small molecule called endothelin-1 (ET-1). This molecule is produced by the action of endothelin-converting enzymes (ECEs).

The Bristol-based academics recently found that levels of ECE-2 are abnormally high in the brains of AD patients. One of the hallmarks of AD is the large amount of amyloid β, a toxic molecule that accumulates in the brain long before people start to display the memory problems recognised in AD. The researchers have shown that ECE-2 production increases when nerve cells are exposed to amyloid β. They therefore suggested that an important cause of reduced blood flow through the brain in AD (and of the high blood pressure that has been linked to AD) is likely to be an increase in ET-1, resulting from the stimulatory effect of amyloid β on ECE-2 production.

Seth Love, Professor of Neuropathology, said: "We hope our study will shed light on the role of amyloid β. We know it to be involved in AD but it is produced throughout life and what it does in the normal brain has long been a mystery.

"In addition, our research could have important implications for blood pressure control in people with hypertension as well as for treatment of diseases such as stroke and dementia where effective treatments remain limited. Drugs that block ET-1 are already licensed for the treatment of other human diseases and could be used to treat people who have elevated levels of amyloid β and increased ECE-2 activity, whether in the context of established AD or stroke, or at an earlier stage prior to the development of irreversible brain damage."

Professor Jeremy Pearson, Associate Medical Director at the BHF, added: "Thanks to the generous donations of our supporters in Bristol we're able to fund vital research to fight diseases of the heart and circulation. This latest grant joins our portfolio of world-leading research to improve prevention, diagnosis, treatment and care of heart diseases."

(Photo: Bristol U.)

Bristol University

ONE-DIMENSIONAL WINDOW ON SUPERCONDUCTIVITY, MAGNETISM

A Rice University-led team of physicists is reporting the first success in a three-year effort to build a precision simulator for superconductors using a grid of intersecting laser beams and ultracold atomic gas.

The research appears in the journal Nature. Using lithium atoms cooled to within a few billionths of a degree of absolute zero and loaded into optical tubes, the researchers created a precise analog of a one-dimensional superconducting wire.

Because the atoms in the experiment are so cold, they behave according to the same quantum mechanical rules that dictate how electrons behave. That means the lithium atoms can serve as stand-ins for electrons, and by trapping and holding the lithium atoms in beams of light, researchers can observe how electrons would behave in particular types of superconductors and other materials.

"We can tune the spacing and interactions among these ultracold atoms with great precision, so much so that using the atoms to emulate exotic materials like superconductors can teach us some things we couldn't learn by studying the superconductors themselves," said study co-author Randy Hulet, a Rice physicist who's leading a team of physicists at Rice and six other universities under the Defense Advanced Research Projects Agency's (DARPA) Optical Lattice Emulator (OLE) program.

In the Nature study, Hulet, Cornell University physicist Erich Mueller, Rice graduate students and postdoctoral researchers Yean-an Liao, Sophie Rittner, Tobias Paprotta, Wenhui Li and Guthrie Partridge and Cornell graduate student Stefan Baur created an emulator that allowed them to simultaneously examine superconductivity and magnetism -- phenomena that do not generally coexist.

Superconductivity occurs when electrons flow in a material without the friction that causes electrical resistance. Superconductivity usually happens at very low temperatures when pairs of electrons join together in a dance that lets them avoid the subatomic bumps that cause friction.

Magnetism derives from one of the basic properties of all electrons -- each behaves as though it rotates around its own axis. This property, which is called "spin," is inherent; like the color of someone's eyes, it never changes. Electron spin also comes in only two orientations, up or down, and magnetic materials are those where the number of electrons with up spins differs from the number with down spins, leaving a "net magnetic moment."

"Generally, magnetism destroys superconductivity because changing the relative number of up and down spins disrupts the basic mechanism of superconductivity," Hulet said. "But in 1964, a group of physicists predicted that a magnetic superconductor could be formed under an exotic set of circumstances where a net magnetic moment arose out of a periodic pattern of excess spins and pairs."

Dubbed the "FFLO" state in honor of the theorists who proposed it -- Fulde, Ferrell, Larkin and Ovchinnikov -- this state of matter has defied conclusive experimental observation for 46 years. Hulet said the new study paves the way for direct observation of the FFLO state.

"The evidence that we've gathered meets the criteria of the FFLO state, but we can't say for certain that we have observed it. To do that, we need to precisely measure the distribution of velocities of the pairs to confirm that they follow the FFLO relationship. We're working on that now."

Rice University

STRESS HORMONE BLOCKS TESTOSTERONE'S EFFECTS, STUDY SHOWS

High levels of the stress hormone cortisol play a critical role in blocking testosterone's influence on competition and domination, according to new psychology research at The University of Texas at Austin.

The study, led by Robert Josephs, professor of psychology at The University of Texas at Austin, and Pranjal Mehta, assistant professor of psychology at the University of Oregon, is the first to show that two hormones—testosterone and cortisol—jointly regulate dominance.

The findings, available online in Hormones and Behavior, show that when cortisol—a hormone released in the body in response to threat—increases, the body is mobilized to escape danger, rather than respond to any influence that testosterone is having on behavior.

The study provides new evidence that hormonal axes (complex feedback networks between hormones and particular brain areas that regulate testosterone and cortisol levels) work against each other to regulate dominant and competitive behaviors.

"It makes good adaptive sense that testosterone's behavioral influence during an emergency situation gets blocked because engaging in behaviors that are encouraged by testosterone, such as mating, competition and aggression, during an imminent survival situation could be fatal," Josephs said. "On the other hand, fight or flight behaviors encouraged by cortisol become more likely during an emergency situation when cortisol levels are high. Thus, it makes sense that the hormonal axes that regulate testosterone levels and cortisol levels are antagonistic."

As part of the study, the researchers measured hormone levels in saliva samples provided by 57 subjects. The participants competed one-on-one and were given the opportunity to compete again after winning or losing. Among those who lost, 100 percent of the subjects with high testosterone and low cortisol requested a rematch to recapture their lost status. However, 100 percent of participants with high testosterone and high cortisol declined to compete again. All subjects who declined a rematch experienced a significant drop in testosterone after defeat, which may help to explain their unwillingness to compete again, Josephs said.
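
A pattern in which testosterone predicts behavior at low cortisol but not at high cortisol is, statistically, an interaction effect. The sketch below shows how such a joint-regulation hypothesis might be tested on simulated data; it illustrates the style of analysis only, and does not reproduce the study's actual models or data:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 57
    testosterone = rng.normal(size=n)   # standardized hormone levels
    cortisol = rng.normal(size=n)       # (simulated, for illustration only)

    # Simulate the reported pattern: testosterone raises the odds of a
    # rematch only when cortisol is low.
    logit = 1.5 * testosterone - 1.0 * cortisol - 2.0 * testosterone * cortisol
    rematch = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

    X = sm.add_constant(np.column_stack([testosterone, cortisol,
                                         testosterone * cortisol]))
    result = sm.Logit(rematch, X).fit(disp=False)
    print(result.params)   # a negative coefficient on the T x C product
                           # term is the statistical signature of
                           # "cortisol blocks testosterone's effect"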

The researchers suggest these findings reveal new insights into the physiological effects of stress and how they may play a role in fertility problems. Previous research has shown that chronically elevated cortisol levels can produce impotence and loss of libido by inhibiting testosterone production in men. In women, chronically high levels of cortisol can produce severe fertility problems and result in abnormal menstrual cycles.

"When cortisol levels remain elevated, as is the case with so many people who are under constant stress, the ability to reproduce can suffer greatly," Josephs said. "However, these effects of cortisol in both men and women are reversed when stress levels go down."

The University of Texas at Austin

"CORESHINE" SHEDS LIGHT ON THE BIRTH OF STARS


Science is literally in the dark when it comes to the birth of stars, which occurs deep inside clouds of gas and dust. These clouds are completely opaque to ordinary light. Now, a group of astronomers has discovered a new astronomical phenomenon that appears to be common in such clouds, and promises a new window onto the earliest phases of star formation. The phenomenon - infrared light that is scattered by unexpectedly large grains of dust, which the astronomers have termed "coreshine" - probes the dense cores where stars are born.

Stars are formed as the dense core regions of cosmic clouds of gas and dust ("molecular clouds") collapse under their own gravity. As a result, matter in these regions becomes ever denser and hotter until finally nuclear fusion is ignited: a star is born. This is how our own star, the Sun, came into being; the fusion processes are responsible for the Sun’s light, on which life on Earth depends. The dust grains contained in the collapsing clouds are the raw material for an interesting by-product of star formation: planetary systems and Earth-like planets.

What happens during the earliest phases of this collapse is largely unknown. Enter an international team of astronomers led by Laurent Pagani (LERMA, Observatoire de Paris) and Jürgen Steinacker (Max Planck Institute for Astronomy, Heidelberg, Germany), who have discovered a new phenomenon which promises information about the crucial earliest phase of the formation of stars and planets: "coreshine", the scattering of mid-infrared light (which is ubiquitous in our galaxy) by dust grains inside such dense clouds. The scattered light carries information about the size and density of the dust particles, about the age of the core region, the spatial distribution of the gas, the prehistory of the material that will end up in planets, and about chemical processes in the interior of the cloud.

The discovery is based on observations with NASA’s Spitzer Space Telescope. As published this February, Steinacker, Pagani and colleagues from Grenoble and Pasadena detected unexpected mid-infrared radiation from the molecular cloud L 183 in the constellation Serpens Cauda ("the serpent's tail"), at a distance of 360 light-years. The radiation appeared to originate in the cloud’s dense core. Comparing their measurements with detailed simulations, the astronomers were able to show that they were dealing with light scattered by dust particles with diameters of around 1 micrometre (one millionth of a metre). The follow-up research that is now being published in Science clinched the case: The researchers examined 110 molecular clouds at distances between 300 and 1300 light-years, which had been observed with Spitzer in the course of several survey programs. The analysis showed that the L 183 radiation was more than a fluke. Instead, it revealed that coreshine is a widespread astronomical phenomenon: Roughly half of the cloud cores exhibited coreshine, mid-infrared radiation associated with scattering from dust grains in their densest regions.
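
The inferred grain size is what makes the scattering work. A grain scatters light efficiently only when its circumference is comparable to the wavelength, a comparison captured by the Mie size parameter; the sketch below evaluates it at Spitzer's 3.6-micron band for two representative (assumed) grain radii:

    import math

    wavelength_um = 3.6   # one of Spitzer's mid-infrared bands
    for label, radius_um in [("typical diffuse-cloud grain", 0.1),
                             ("coreshine-sized grain      ", 0.5)]:
        x = 2.0 * math.pi * radius_um / wavelength_um   # Mie size parameter
        print(f"{label}: x = {x:.2f}")
    # Small grains (x ~ 0.2) sit deep in the Rayleigh regime, where
    # scattering efficiency falls off as x^4 and is negligible; grains
    # around a micrometre across (x approaching 1) scatter mid-infrared
    # light efficiently enough to produce coreshine.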

The discovery of coreshine suggests a host of follow-on projects - for the Spitzer Space Telescope as well as for the James Webb Space Telescope, which is due to be launched in 2014. The first coreshine observations have yielded promising results: The unexpected presence of larger grains of dust (diameters of around a millionth of a metre) shows that these grains begin their growth even before cloud collapse commences. An observation of particular interest concerns clouds in the southern constellation Vela, in which no coreshine is present. It is known that this region was disturbed by several stellar (supernova) explosions. Steinacker and his colleagues hypothesize that these explosions have destroyed whatever larger dust grains had been present in this region.

(Photo: MPIA)

Max Planck Institute
