Thursday, July 23, 2009

LIVING FOSSILS HOLD RECORD OF SUPERMASSIVE KICK

The tight cluster of stars surrounding a supermassive black hole after it has been violently kicked out of a galaxy represents a new kind of astronomical object and a fossil record of the kick.

A paper published in the July 10 issue of The Astrophysical Journal discusses the theoretical properties of “hypercompact stellar systems” and suggests that hundreds of these faint star clusters might be detected at optical wavelengths in our immediate cosmic environment. Some of these objects may already have been picked up in astronomical surveys, report David Merritt, from Rochester Institute of Technology, Jeremy Schnittman, from Johns Hopkins University, and Stefanie Komossa, from the Max Planck Institute for Extraterrestrial Physics in Germany.

Hypercompact stellar systems result when a supermassive black hole is violently ejected from a galaxy, following a merger with another supermassive black hole. The evicted black hole rips stars from the galaxy as it is thrown out. The stars closest to the black hole move in tandem with the massive object and become a permanent record of the velocity at which the kick occurred.

“You can measure how big the kick was by measuring how fast the stars are moving around the black hole,” says Merritt, professor of physics at RIT. “Only stars orbiting faster than the kick velocity remain attached to the black hole after the kick. These stars carry with them a kind of fossil record of the kick, even after the black hole has slowed down. In principle, you can reconstruct the properties of the kick, which is nice because there would be no other way to do it.”
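
Merritt’s criterion lends itself to a quick back-of-the-envelope calculation. The sketch below is a minimal illustration, not a calculation from the paper: it uses the Newtonian circular-orbit speed to find the radius inside which stars move faster than the kick, and the black-hole mass and kick velocity are assumed, illustrative values.

```python
# Minimal sketch of the bound-star criterion quoted above (illustrative
# numbers, not values from the study). A star on a circular orbit of radius
# r around a black hole of mass M moves at v = sqrt(G*M/r), so only stars
# inside r_k = G*M / v_kick**2 orbit faster than the kick and stay attached.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
PC = 3.086e16        # parsec, m

def bound_radius(m_bh_kg, v_kick_ms):
    """Radius (m) inside which stars orbit faster than the kick velocity."""
    return G * m_bh_kg / v_kick_ms**2

m_bh = 1e7 * M_SUN   # assumed 10-million-solar-mass black hole
v_kick = 500e3       # assumed 500 km/s kick

r_k = bound_radius(m_bh, v_kick)
print(f"bound cluster radius ~ {r_k / PC:.2f} pc")  # ~0.17 pc

# Inverting the relation is the "fossil record": measuring how fast the
# captive stars move recovers the kick velocity long after the kick.
```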

“Finding these objects would be like discovering DNA from a long-extinct species,” adds Komossa.

The best place to find hypercompact stellar systems, the authors argue, is in clusters of galaxies like the nearby Coma and Virgo clusters. These dense regions of space contain thousands of galaxies that have been merging for a long time. Merging galaxies result in merging black holes, which is a prerequisite for the kicks.

“Even if the black hole gets kicked out of one galaxy, it’s still going to be gravitationally bound to the whole cluster of galaxies,” Merritt says. “The total gravity of all the galaxies is acting on that black hole. If it was ever produced, it’s still going to be there somewhere in that cluster.”

Merritt and his co-authors think that scientists may have already seen hypercompact stellar systems and not realized it. These objects would be easy to mistake for common star systems like globular clusters. The key signature making hypercompact stellar systems unique is a high internal velocity. This is detectable only by measuring the velocities of stars moving around the black hole, a difficult measurement that would require a long time exposure on a large telescope.

From time to time, a hypercompact stellar system will make its presence known in a much more dramatic way, when one of its stars is tidally disrupted by the supermassive black hole. In such an event, the black hole’s tidal gravity stretches the star and tears it apart as it is drawn in, producing a beacon-like flare that signals the black hole’s presence. The possibility of detecting one of these “recoil flares” was first discussed in an August 2008 paper by co-authors Merritt and Komossa.
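
The disruption condition itself can be illustrated with the standard order-of-magnitude tidal radius from textbook physics (this is not a result specific to the paper, and the masses below are assumed values).

```python
# Illustration of the tidal-disruption condition: a star of radius R and
# mass m wandering inside r_t ~ R * (M_bh / m)**(1/3) of a black hole of
# mass M_bh is torn apart by tides. Standard textbook estimate; the
# numbers chosen here are illustrative.

M_SUN = 1.989e30   # kg
R_SUN = 6.957e8    # m

def tidal_radius(m_bh_kg, m_star_kg=M_SUN, r_star_m=R_SUN):
    """Approximate radius (m) inside which tides shred the star."""
    return r_star_m * (m_bh_kg / m_star_kg) ** (1.0 / 3.0)

r_t = tidal_radius(1e7 * M_SUN)          # sun-like star, 1e7 M_sun hole
print(f"tidal radius ~ {r_t:.2e} m")     # ~1.5e11 m, roughly 1 AU
```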

“The only contact of these floating black holes with the rest of the universe is through their armada of stars,” Merritt says, “with an occasional display of stellar fireworks to signal ‘here we are.’”

(Photo: Space Telescope Science Institute)

Rochester Institute of Technology

ROBOT LEARNS TO SMILE AND FROWN

A hyper-realistic Einstein robot at the University of California, San Diego has learned to smile and make facial expressions through a process of self-guided learning. The UC San Diego researchers used machine learning to “empower” their robot to learn to make realistic facial expressions.

“As far as we know, no other research group has used machine learning to teach a robot to make realistic facial expressions,” said Tingfan Wu, the computer science Ph.D. student from the UC San Diego Jacobs School of Engineering who presented this advance on June 6 at the IEEE International Conference on Development and Learning.

The faces of robots are increasingly realistic and the number of artificial muscles that control them is rising. In light of this trend, UC San Diego researchers from the Machine Perception Laboratory are studying the face and head of their robotic Einstein in order to find ways to automate the process of teaching robots to make lifelike facial expressions.

This Einstein robot head has about 30 facial muscles, each moved by a tiny servo motor connected to the muscle by a string. Today, a highly trained person must manually set up these kinds of realistic robots so that the servos pull in the right combinations to make specific facial expressions. In order to begin to automate this process, the UCSD researchers looked to both developmental psychology and machine learning.

Developmental psychologists speculate that infants learn to control their bodies through systematic exploratory movements, including babbling to learn to speak. Initially, these movements appear to be executed in a random manner as infants learn to control their bodies and reach for objects.

“We applied this same idea to the problem of a robot learning to make realistic facial expressions,” said Javier Movellan, the senior author on the paper presented at ICDL 2009 and the director of UCSD’s Machine Perception Laboratory, housed in Calit2, the California Institute for Telecommunications and Information Technology.

Although their preliminary results are promising, the researchers note that some of the learned facial expressions are still awkward. One potential explanation is that their model may be too simple to describe the coupled interactions between facial muscles and skin.

To begin the learning process, the UC San Diego researchers directed the Einstein robot head (Hanson Robotics’ Einstein Head) to twist and turn its face in all directions, a process called “body babbling.” During this period the robot could see itself in a mirror and analyze its own expression using facial expression detection software created at UC San Diego called CERT (Computer Expression Recognition Toolbox). This provided the data necessary for machine learning algorithms to learn a mapping between facial expressions and the movements of the muscle motors.
Once the robot had learned the relationship between facial expressions and the muscle movements required to make them, it could produce facial expressions it had never encountered before.
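
The loop the researchers describe (babble, observe the resulting face, learn the inverse mapping) can be pictured with a toy sketch. Everything below is hypothetical: a fixed random linear map stands in for the physical head plus the CERT detector, and ridge regression stands in for whatever learning algorithm the team actually used.

```python
# Toy sketch of the babble-observe-invert loop described above. The robot
# hardware and CERT are replaced by a made-up linear "world"; the learner
# never sees that map directly, only (servo command, expression) pairs.

import numpy as np
from sklearn.linear_model import Ridge

N_SERVOS = 30      # the Einstein head has about 30 servo-driven muscles
N_FEATURES = 6     # hypothetical expression scores (joy, anger, ...)
rng = np.random.default_rng(0)

# Stand-in for robot + camera + CERT: servo positions -> expression scores.
TRUE_MAP = rng.standard_normal((N_SERVOS, N_FEATURES))

def detect_expression(servos):
    return servos @ TRUE_MAP + 0.05 * rng.standard_normal(N_FEATURES)

# 1) Body babbling: issue random servo commands, record the resulting face.
babble_servos = rng.uniform(-1.0, 1.0, size=(500, N_SERVOS))
babble_faces = np.vstack([detect_expression(s) for s in babble_servos])

# 2) Learn the inverse mapping: desired expression -> servo activations.
model = Ridge(alpha=1.0).fit(babble_faces, babble_servos)

# 3) Request an expression never produced during babbling.
target = np.zeros(N_FEATURES)
target[0] = 1.0                                # e.g., a pure "smile"
servo_commands = model.predict(target[None, :])[0]
print(servo_commands.round(2))
```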

For example, the robot learned eyebrow narrowing, which requires the inner eyebrows to move together and the upper eyelids to close a bit to narrow the eye aperture.

“During the experiment, one of the servos burned out due to misconfiguration. We therefore ran the experiment without that servo. We discovered that the model learned to automatically compensate for the missing servo by activating a combination of nearby servos,” the authors wrote in the paper presented at the 2009 IEEE International Conference on Development and Learning.

“Currently, we are working on a more accurate facial expression generation model, as well as a systematic way to explore the model space efficiently,” said Wu, the computer science Ph.D. student. Wu also noted that the “body babbling” approach he and his colleagues described in their paper may not be the most efficient way to explore the model of the face.

While the primary goal of this work was to solve the engineering problem of how to approximate the appearance of human facial muscle movements with motors, the researchers say this kind of work could also lead to insights into how humans learn and develop facial expressions.

(Photo: UCSD Jacobs School of Engineering)

University of California, San Diego, Jacobs School of Engineering

2,000-YEAR-OLD STATUE OF AN ATHLETE SHEDS LIGHT ON CORROSION AND OTHER MODERN CHALLENGES

The restoration of a 2,000-year-old bronze sculpture of the famed ancient Greek athlete Apoxyomenos may help modern scientists understand how to prevent metal corrosion, discover the safest ways to permanently store nuclear waste, and understand other perplexing problems. That's the conclusion of a new study on the so-called "biomineralization" of Apoxyomenos appearing in the current issue of ACS' Crystal Growth & Design, a bi-monthly journal. Best known as "The Scraper," the statue depicts an athlete scraping sweat and dust from his body with a small curved instrument.

In the report, Davorin Medakovic and colleagues point out that Apoxyomenos was discovered in 1998 on the floor of the Adriatic Sea. While the discovery was a bonanza for archaeologists and art historians, it also proved to be an unexpected boon to scientists trying to understand biomineralization. That's the process in which animals and plants use minerals from their surroundings to form shells and bone. Apoxyomenos was encrusted with such deposits.

"As studies of long-term biofouled manmade structures are limited, the finding of an ancient sculpture immersed for two millennia in the sea provided a unique opportunity to probe the long-term impact of a specific artificial substrate on biomineralizng organisms and the effects of biocorrosion," the report said. By evaluating the mineral layers and fossilized organisms on the statue, the researchers were able to evaluate how underwater fouling organisms and communities interacted with the statue as well as how certain mineral deposits on the bronze sculpture slowed its deterioration.

(Photo: The American Chemical Society)

The American Chemical Society

PLAYING IT SAFE

Kinarm Ko and Hans Schöler’s team at the Max Planck Institute for Molecular Biomedicine in Münster have succeeded for the first time in culturing a clearly defined cell type from the testis of adult mice and converting these cells into pluripotent stem cells without introduced genes, viruses or reprogramming proteins. These stem cells have the capacity to generate all types of body tissue. The culture conditions alone were the crucial factor behind the success of the reprogramming process.

The testis is a sensitive organ and an astonishing one at that. Even at the age of 70, 80 or 85, men have cells that constantly produce new sperm. Therefore, they can father children at almost any age - assuming they can find a sufficiently young female partner. Based on this, researchers have long assumed that cells from the testis have a potential similar to that of embryonic stem cells: a pluripotency that enables them to form over 200 of the body’s cell types.

In fact, a number of researchers have recently stumbled on these multiple talents in the male gonads of humans and mice. It all began with the work of Takashi Shinohara’s team in 2004. The Japanese scientists discovered that, like embryonic stem cells, certain cells in the testis of newborn mice are able to develop into different kinds of tissue. In 2006, scientists working with Gerd Hasenfuß and Wolfgang Engel in Göttingen reported that such adaptable cells can also be found in adult male mice. Additionally, Thomas Skutella and his colleagues at the University of Tübingen recently made headlines when they cultured comparable cells from human testis tissue.

"At first glance, it would appear that it has long been established that pluripotent cells exist in the testis of adult humans and mice," says Schöler. "However, it is often unclear as to exactly which cells are being referred to in the literature and what these cells can actually do." (See *Background Information)

This is not only due to the fact that the testis contains a multitude of different cells. Scientists who dissociate tissue in the laboratory must carefully separate and analyse the cells to establish which cell type they have under the microscope. The question of potency is also a controversial one among stem cell researchers, as binding benchmarks have yet to be defined. What some scientists would define as "pluripotent" is deemed merely "multipotent", that is, as having a limited capacity for differentiation, by others. Greater certainty can be provided by carrying out the relevant tests. These include, among other things, a test to establish whether, after injection into early embryos, the cells are able to contribute to the development of the new organism and to gamete formation, and to pass on their genes to further generations. However, not every team carries out all of these tests, and important questions are left unanswered, even in articles published in renowned journals.

With their work, Ko and his colleagues wanted to establish clarity from the outset. To this end, they started by culturing a precisely defined type of cell, so-called germline stem cells (GSCs), from the testis of adult mice. In their natural environment, these cells can only do one thing: constantly generate new sperm. Moreover, their own reproduction is an extremely rare occurrence: only two or three of them are found among 10,000 cells of the testis tissue of a mouse. However, they can be isolated individually and reproduced as cell lines with stable characteristics. Under the usual cell culturing conditions, they retain their unipotency for weeks, even years. Consequently, all they can do is reproduce or form sperm.

What nobody had guessed until now, however, is that a simple trick is enough to incite these cells to reprogramme. If the cells are distributed on new petri dishes, some of them revert to an embryonic state once they are given sufficient space and time. "Each time we plated around 8,000 cells into the individual wells of the cell culture plates, some of the cells reprogrammed themselves after two weeks," reports Ko. And once the switch in these germline-derived pluripotent stem cells (gPS) has been reversed, they start to reproduce rapidly.

With the aid of numerous tests, the researchers have proven that this "reignition" of the cells has actually taken place. Not only can the reprogrammed cells be used to generate heart, nerve or endothelial cells, as is the case with embryonic stem cells; the scientists can also use the new gPS cells to produce mice with mixed genotypes, known as chimeras, and thus demonstrate that cells obtained from the testis can pass their genes on to the next generation.

Whether this process can also be applied to humans remains an open question. There is much to suggest, however, that gPS cells exceed all previously artificially reprogrammed cells in terms of the simplicity of their production and their safety.

(Photo: MPI Münster / Kinarm Ko)

Max Planck Society

NEW NCAR SYSTEM MAY GUIDE TRANSOCEANIC FLIGHTS AROUND STORMS AND TURBULENCE

The National Center for Atmospheric Research is developing a prototype system to provide aircraft with updates about severe storms and turbulence as they fly across remote ocean regions. The system is designed to help guide pilots away from intense weather, such as the thunderstorms that Air France Flight 447 apparently encountered before crashing into the Atlantic Ocean on June 1.

The NCAR system, being developed with funding from NASA, combines satellite data and computer weather models with cutting-edge artificial intelligence techniques to identify and predict rapidly evolving storms and other potential areas of turbulence. The system is based on products that NCAR has developed to alert pilots and air traffic controllers about storms and turbulence over the continental United States.

"Pilots currently have little weather information as they fly over remote stretches of the ocean, which is where some of the worst turbulence encounters occur," says NCAR scientist John Williams, one of the project leads. "Providing pilots with at least an approximate picture of developing storms could help guide them safely around areas of potentially severe turbulence."

The component of the system that identifies major storms over the ocean is already available for aircraft use on an experimental basis.

The entire prototype system, which will identify areas of turbulence in clear air as well as within storms, is on track for testing next year. Pilots on selected transoceanic routes will receive real-time turbulence updates and then provide feedback on the system to NCAR. The researchers will adjust the system as needed.

When the system is finalized in about two years, it will provide pilots and ground-based controllers with text-based maps and graphical displays showing likely regions of turbulence and of storms.

In addition to NCAR, other organizations taking part in the research include the Massachusetts Institute of Technology's Lincoln Laboratory, the Naval Research Laboratory, and the University of Wisconsin-Madison.

Pinpointing turbulence over the oceans is far more challenging than over land because of sparse observations. Weather satellites are often the only source of information over these remote regions. But the satellites provide images of these regions less frequently than they do over land, which can make it difficult to capture fast-changing conditions, and they do not directly measure turbulence.

Pilots of transoceanic flights currently get preflight briefings and, in certain cases involving especially intense storms, in-flight weather updates every four hours. They also have onboard radar.

All this information, however, is of limited value. Thunderstorms may develop quickly and move rapidly, rendering the briefings and weather updates obsolete. Onboard radars are designed to detect clouds and precipitation, but turbulence is often located far from the most intense precipitation. As a result, pilots often must choose between detouring hundreds of miles around potentially stormy areas or taking a chance and flying directly through a region that may or may not contain intense weather.

In contrast, NCAR provides real-time maps of turbulence at various altitudes over the continental United States. Such a system, had it encompassed remote ocean regions, could have alerted the pilots of the doomed Air France flight to the stormy conditions along their flight path. The cause of that disaster has not been determined, and it is impossible to know whether the system could have prevented it.

"It seems likely that the information provided by a real-time uplink of weather conditions ahead would have, at a minimum, improved the pilots' situational awareness," Williams says.

Williams and his colleagues have recently completed two critical steps in identifying turbulence over the oceans:

• The team has created global maps of clear air turbulence based on global computer weather models that include winds and other instabilities in the atmosphere. Clear air turbulence consists of erratic movements of air masses that occur in the absence of clouds and that sometimes buffet aircraft.

• Drawing on satellite images of storms, the scientists have created global views of the tops of storm clouds. Higher cloud tops are often correlated with intense storms, although not necessarily with turbulence.

The next step is to identify areas of possible turbulence within and around intense storms. To do so, the team will study correlations between storms and turbulence over the continental United States where weather is more closely observed. The scientists will then infer the likelihood of turbulence associated with storms over the oceans, keying in on satellite indicators such as rapidly expanding clouds or places where the tops of storm clouds are cooling quickly.

They will also develop mathematical equations to account for differences in cloud systems over the United States compared to over the oceans, including the tropics.

In addition to providing aircraft and ground controllers with up-to-the-minute maps of turbulence, the NCAR team is turning to an artificial intelligence technique, known as "random forests," to provide short-term forecasts of turbulence. The random forests, which have proven useful for forecasting thunderstorms over land, consist of many decision trees that each cast a yes-or-no "vote" on crucial elements of a storm at future points in time and space. This enables scientists to forecast the movement and strength of the storm over the next few hours.
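
As a concrete picture of the voting idea, the toy sketch below trains a random forest on synthetic data. The predictors echo the satellite indicators mentioned earlier (cloud-top cooling, cloud expansion, wind shear), but the features, labels, and thresholds are invented for illustration and bear no relation to NCAR's actual system.

```python
# Toy random-forest nowcast: each of 100 trees casts a yes/no vote on
# turbulence at a future point; the vote fraction becomes a probability.
# All data below are synthetic and purely illustrative.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
n = 2000

# Hypothetical predictors per grid point:
X = np.column_stack([
    rng.normal(0, 3, n),       # cloud-top temperature trend (K/hr; negative = cooling)
    rng.normal(0, 10, n),      # cloud expansion rate (%/hr)
    rng.normal(0, 0.01, n),    # vertical wind shear (1/s)
])

# Made-up "truth" standing in for pilot turbulence reports: turbulence is
# likelier where cloud tops cool and expand quickly and shear is strong.
score = -0.5 * X[:, 0] + 0.1 * X[:, 1] + 200.0 * X[:, 2]
y = (score + rng.normal(0, 1, n) > 2).astype(int)

forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# A cell ahead of the aircraft: tops cooling 6 K/hr, expanding, sheared.
new_point = [[-6.0, 25.0, 0.02]]
print(f"turbulence probability: {forest.predict_proba(new_point)[0][1]:.2f}")
```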

"Our goal is to give pilots a regularly updated picture of the likely storms ahead as they fly over the ocean so they can take action to minimize turbulence and keep their aircraft out of danger," explains NCAR scientist Cathy Kessinger, a project team member. "Even over the middle of the ocean, where we don't have land-based radars or other tools to observe storms in detail, we can still inform pilots about the potential for violent downdrafts, turbulence, and possibly lightning."

(Photo: UCAR/Carlye Calvin)

University Corporation for Atmospheric Research

CU-BOULDER, NASA TEST NEW 'SPACE INTERNET' PROTOCOLS ON INTERNATIONAL SPACE STATION

The University of Colorado at Boulder is working with NASA to develop a new communications technology, now being tested on the International Space Station, that is intended to extend Earth's Internet into outer space and across the solar system.

Called Disruption Tolerant Networking, or DTN, the new technology will enable NASA and other space agencies around the world to better communicate with international fleets of spacecraft that will be used to explore the moon and Mars in the future. The technology is expected to lead to a working "Interplanetary Internet," said Kevin Gifford, a senior research associate at CU-Boulder's BioServe Space Technologies and a faculty member in the aerospace engineering sciences department.

"Communication between spacecraft and ground stations has traditionally been over a single point-to-point link, much like a walkie-talkie," said Gifford. "Currently, space operations teams must manually schedule each link and generate appropriate commands to specify where the data is to be sent, the time it will be sent and its destination. As the number of spacecraft and links increase and the need to communicate between many space vehicles emerges, these manual operations become increasingly cumbersome and costly," he said.

"Highly automated future communications capabilities will be required for lunar habitation and surface exploration that include passing information between orbiting relay satellites, lunar and planetary habitats and astronauts on the surface," said Gifford. "But existing Internet protocols, where Internet hosts and computers are always connected, do not work well for many space-based environments, where intermittently connected operations are common."

The new data communications protocols were installed on a BioServe payload known as the Commercial Generic Bioprocessing Apparatus, or CGBA, on the International Space Station in May to send DTN messages known as "bundles," said Gifford. As part of NASA's communication operations test that will begin June 15, bundles will be sent from the space station to its operations and control facility at Marshall Space Flight Center in Huntsville, Ala., then on to a mission control center at CU-Boulder's BioServe.

The new DTN "Bundle Protocol" was developed by the Internet Research Task Force based on initial work started over 10 years ago in a partnership between NASA and Vint Cerf, who holds the title of vice president and chief Internet evangelist of Google Inc. of Mountain View, Calif. Cerf often is referred to as one of the "fathers" of the Internet.

Cerf said that "while conventional Internet protocols may work well in short-delay, richly connected terrestrial environments, they quickly degrade in long-delay and highly stressed wireless data communications scenarios that are already beginning to be encountered at the edges of the Internet, which is where space tends to begin."

Cerf's counterpart in the Space Communications and Navigation office at NASA Headquarters in Washington, D.C., is Adrian Hooke. Hooke, a veteran of the Apollo 11 mission launch team, is the manager of NASA's new Space DTN project and is a pioneer in the development of international space networking standards.

"With the new system, delays caused by spacecraft moving behind planets or solar storms disrupting communications are not a problem because the data packets are not discarded when outages occur, but instead are stored as long as necessary until an opportunity arises that allows them to be transmitted," Hooke said. "This 'store-and forward' method is similar to a basketball player passing the ball down the court to other players nearer to the basket, who have a clear shot at the goal."

"By improving data timeliness associated with robotic and human-tended missions, NASA is reducing risk, reducing cost, increasing crew safety, improving operational awareness and improving science return," said Gifford. "There also are intriguing applications of the DTN technology on Earth. They include the tracking of livestock and wildlife, enhancing Internet 'hot spot' connectivity in remote rural areas in Third World countries, and tactical operations support for the U.S. military."

Multiple NASA centers are involved in the research, including the Marshall Space Flight Center, the Johnson Space Center in Houston, the Glenn Research Center in Cleveland, the Goddard Space Flight Center in Greenbelt, Md., the Jet Propulsion Laboratory in Pasadena, Calif., and the Applied Physics Laboratory at Johns Hopkins University in Laurel, Md. NASA and CU-Boulder also are exploring ways to extend the experiments on the International Space Station to involve the European Space Agency and the Japanese Aerospace Exploration Agency.

In November 2008, JPL first tested the DTN protocols by transmitting dozens of space images through the EPOXI spacecraft -- located about 20 million miles from Earth -- back to NASA's Deep Space Network. "The new series of DTN testing on the International Space Station adds yet another space-based router to the gradually evolving Interplanetary Internet," said Hooke.

BioServe has designed, built and flown more than 50 different payloads on more than 35 spaceflight missions, including the NASA space shuttle, the International Space Station, Russia's Mir space station and the Russian Soyuz and Progress spacecraft.

(Photo: Glenn Asakawa/University of Colorado)

University of Colorado
