Wednesday, September 30, 2009

OUT OF DARKNESS, SIGHT

Cases of restored vision after a lifetime of blindness, though exceedingly rare, provide a unique opportunity to address several fundamental questions regarding brain function. After being deprived of visual input, the brain needs to learn to make sense of the new flood of visual information. Very little is known about how this learning takes place, but a new study by MIT neuroscientists suggests that dynamic information — that is, input from moving objects — is critical.

In the United States, as in most developed nations, infants with curable blindness are treated within a few weeks of birth. However, in developing nations such as India, there are relatively more instances of children born with curable forms of blindness that are left untreated for want of medical or financial resources. Such children face greatly elevated odds of early mortality, illiteracy and unemployment. Doctors have been hesitant to treat older patients because the conventional dogma holds that the brain is incapable of learning to see after age 5 or 6.

MIT brain and cognitive sciences professor Pawan Sinha, through his humanitarian foundation, Project Prakash (Sanskrit for "light"), has treated and studied several such patients over the past five years. The Prakash effort serves the dual purpose of providing sight to blind children and, in the process, tackling several foundational issues in neuroscience.

The new findings from Sinha's team, reported in the November issue of the journal Psychological Science, provide clues about how the brain learns to put together the visual world. They not only support the idea of treating blindness in older children and adults, but also offer insight into modeling the human visual system, diagnosing visual disorders, creating rehabilitation procedures and developing computers that can see.

This work builds on a 2007 study in which Sinha and graduate student Yuri Ostrovsky showed that a woman who had had her sight restored at age 12 had nearly normal visual processing abilities. These findings were significant since they challenged the widely held notion of a "critical age" for acquiring vision.

However, because they came across the woman 20 years after her sight was restored, the researchers had no chance to study how her brain first learned to process visual input. The new work focuses on three adolescent and young adult patients in India, and follows them from the time of treatment to several months afterward. The work, Sinha says, not only shows that recovery is possible but also "provides insights into the mechanism by which such recovery comes about."

Testing the patients within weeks of sight restoration, Sinha and his colleagues found that subjects had very limited ability to distinguish an object from its background, identify overlapping objects, or even piece together the different parts of an object. Eventually, however, they improved in this "visual integration" task, discovering whole objects and segregating them from their backgrounds.

"Somehow our brain is able to solve the problem, and we want to know how it does it or how it learns to do it," says Ostrovsky, lead author of the new paper.

One of their subjects, known as S.K., suffered from a rare condition called secondary congenital aphakia (a lack of lenses in the eye) and was treated with corrective optics in 2004, at the age of 29. After treatment, S.K. participated in a series of tests asking him to identify simple shapes and objects.

S.K. could identify some shapes (triangles, squares, etc.) when they were side-by-side, but not when they overlapped. His brain was unable to distinguish the outlines of a whole shape; instead, he believed that each fragment of a shape was its own whole. For S.K. and other patients like him, "it seems like the world has been broken into many different pieces," says Sinha.

However, if a square or triangle was put into motion, S.K. (and the other two patients) could much more easily identify it. (With motion, their success rates improved from close to zero to around 75 percent.) Furthermore, the motion of objects greatly influenced the patients' ability to recognize them in images.

During follow-up tests that continued for 18 months after treatment, the patients' performance with stationary objects gradually improved to almost normal.

These results suggest that movement patterns in the world provide some of the most salient clues about its constituent objects. The brain is programmed to use similarity of dynamics to infer which regions constitute objects, says Sinha. The significance of motion may go even further, the team believes. It may serve to "bootstrap" the learning of rules and heuristics by which the brain comes to be able to parse static images. The idea is simple but far-reaching. Starting from an initial capability of grouping via motion, the brain begins to notice that similar dynamics are correlated with similarity in other region attributes such as orientation and color. These attributes can then be used even in the absence of motion.
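
The bootstrapping idea can be turned into a toy computation. The following sketch is not the researchers' model; it is a minimal Python illustration with invented pixel data, assuming the steps described above: pixels are first grouped by the similarity of their motion vectors, the average color of each motion-defined group is learned, and those learned colors are then used to label a new, static frame.

# Toy illustration of motion-bootstrapped grouping (invented data; not the study's model).
import numpy as np

# Hypothetical 4x4 scene: per-pixel motion (dx, dy) and a grayscale color value.
motion = np.array([
    [[2, 0], [2, 0], [0, 0], [0, 0]],
    [[2, 0], [2, 0], [0, 0], [0, 0]],
    [[0, 0], [0, 0], [0, 1], [0, 1]],
    [[0, 0], [0, 0], [0, 1], [0, 1]],
], dtype=float)
color = np.array([
    [0.9, 0.9, 0.2, 0.2],
    [0.9, 0.9, 0.2, 0.2],
    [0.2, 0.2, 0.5, 0.5],
    [0.2, 0.2, 0.5, 0.5],
])

# Step 1: group pixels that share the same motion vector.
labels, seg = {}, np.zeros(color.shape, dtype=int)
for i in range(4):
    for j in range(4):
        seg[i, j] = labels.setdefault(tuple(motion[i, j]), len(labels))

# Step 2: learn the typical (mean) color of each motion-defined group.
group_color = {g: color[seg == g].mean() for g in range(len(labels))}

# Step 3: segment a new static frame using only the learned colors.
static_frame = np.array([[0.88, 0.21], [0.19, 0.52]])
nearest = [[min(group_color, key=lambda g: abs(group_color[g] - v)) for v in row]
           for row in static_frame]
print(seg)      # grouping discovered from motion alone
print(nearest)  # grouping of a static frame using colors learned from motion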

In addition to understanding how the human visual system works, the findings could help researchers build robots with visual systems capable of autonomously discovering objects in their environment.

"If we could understand how the brain learns to see, we can better understand how to train a computer to do it," says Ostrovsky.

(Photo: Pawan Sinha)

MIT

EVIDENCE POINTS TO CONSCIOUS 'METACOGNITION' IN SOME NONHUMAN ANIMALS

J. David Smith, Ph.D., a comparative psychologist at the University at Buffalo who has conducted extensive studies in animal cognition, says there is growing evidence that animals share functional parallels with human conscious metacognition -- that is, they may share humans' ability to reflect upon, monitor or regulate their states of mind.

Smith draws this conclusion in an article published in the September issue of the journal Trends in Cognitive Sciences (Volume 13, Issue 9). He reviews this new and rapidly developing area of comparative inquiry, describing its milestones and its prospects for continued progress.

He says "comparative psychologists have studied the question of whether or not non-human animals have knowledge of their own cognitive states by testing a dolphin, pigeons, rats, monkeys and apes using perception, memory and food-concealment paradigms.

"The field offers growing evidence that some animals have functional parallels to humans' consciousness and to humans' cognitive self-awareness," he says. Among these species are dolphins and macaque monkeys (an Old World monkey species).

Smith recounts the original animal-metacognition experiment with Natua the dolphin. "When uncertain, the dolphin clearly hesitated and wavered between his two possible responses," he says, "but when certain, he swam toward his chosen response so fast that his bow wave would soak the researchers' electronic switches.

"In sharp contrast," he says, "pigeons in several studies have so far not expressed any capacity for metacognition. In addition, several converging studies now show that capuchin monkeys barely express a capacity for metacognition.

"This last result," Smith says, "raises important questions about the emergence of reflective or extended mind in the primate order.

"This research area opens a new window on reflective mind in animals, illuminating its phylogenetic emergence and allowing researchers to trace the antecedents of human consciousness."

Smith, a professor in the UB Department of Psychology and Center for Cognitive Sciences, is recognized for his research and publications in the field of animal cognition.

He and his colleagues pioneered the study of metacognition in nonhuman animals, and they have contributed some of the principal results in this area, including many results that involve the participation of Old World and New World monkeys who have been trained to use joysticks to participate in computer tasks.

Their research is supported by the National Institute of Child Health and Human Development and the National Science Foundation.

Smith explains that metacognition is a sophisticated human capacity linked to hierarchical structure in the mind (because the metacognitive executive control processes oversee lower-level cognition), to self-awareness (because uncertainty and doubt feel so personal and subjective) and to declarative consciousness (because humans are conscious of their states of knowing and can declare them to others).

Therefore, Smith says, "it is a crucial goal of comparative psychology to establish firmly whether animals share humans' metacognitive capacity. If they do, it could bear on their consciousness and self-awareness, too."

In fact, he concludes, "Metacognition rivals language and tool use in its potential to establish important continuities or discontinuities between human and animal minds."

(Photo: U. Buffalo)

University at Buffalo

DEPRESSION INCREASES CANCER MORTALITY RATE

A number of studies have shown that individuals’ mental attitudes can impact their physical health. To determine the effects of depression on cancer patients’ disease progression and survival, UBC Dept. of Psychology graduate student Jillian Satin and colleagues analyzed all of the studies on the topic that they could identify to date.

The researchers found 26 studies with a total of 9,417 patients that examined the effects of depression on patients’ cancer progression and survival. Their analysis is published online today by the American Cancer Society journal Cancer.

“We found an increased risk of death in patients who report more depressive symptoms than others and also in patients who have been diagnosed with a depressive disorder compared to patients who have not,” said Satin. In the combined studies, the death rates were as much as 25 per cent higher in patients experiencing depressive symptoms and 39 per cent higher in patients diagnosed with major or minor depression.

The increased risks remained even after considering other clinical characteristics that might affect survival, indicating that depression may actually play a part in shortening survival. However, the authors say additional research must be conducted before any conclusions can be reached. The authors add that their analysis combined results across different tumor types, so future studies should look at the effects of depression on specific kinds of cancer.

The investigators note that the actual risk of death associated with depression in cancer patients is still small, so patients should not feel that they must maintain a positive attitude to beat their disease. Nevertheless, the study indicates that it is important for physicians to regularly screen cancer patients for depression and to provide appropriate treatments.

The researchers did not find a clear association between depression and cancer progression, although only three studies were available for analysis.

University of British Columbia

Tuesday, September 29, 2009

SANDIA HOPPING ROBOTS TO BOLSTER TROOP CAPABILITIES

Boston Dynamics, developer of advanced dynamic robots such as BigDog and PETMAN, has been awarded a contract by Sandia to develop the next generation of the Precision Urban Hopper, meaning Sandia’s hopping robots may soon be in combat.

When fully operational, the four-wheeled hopper robots will navigate autonomously by wheel and jump – with one mighty leg – onto or over obstacles more than 25 feet high, said Jon Salton, Sandia program manager.

“The Precision Urban Hopper is part of a broad effort to bolster the capabilities of troops and special forces engaged in urban combat, giving them new ways to operate unfettered in the urban canyon,” Salton said.

The development program, funded by the Defense Advanced Research Projects Agency (DARPA), the Department of Defense’s advanced technology organization, has a 12-month design phase followed by a six-month build phase, with testing and delivery planned for late 2010.

As part of the ongoing DARPA project, Sandia developed the shoebox-sized, GPS-guided, unmanned ground robots.

The demonstrated hopping capability of the robots allows the small unmanned ground vehicles to overcome as many as 30 obstacles that are 40-60 times their own size. Hopping mobility has been shown to be five times more efficient than hovering when traversing obstacles at heights under 10 meters, which allows longer station-keeping time for the same amount of fuel.

The wheeled robotic platform adapts to the urban environment in real time and provides precision payload deployment to any point of the urban jungle while remaining lightweight and small. Researchers addressed several technical challenges, including managing shock forces during landing, controlling hop height over varying terrain such as concrete, asphalt, sand and vegetation, and controlling landings to limit tumbling.

An overall goal of the robots is to decrease the number of casualties in combat. To that end, the hopping robots will provide enhanced situational awareness for shaping the outcome of the immediate local combat situation, Salton said. Their compact, lightweight design makes them portable, and their semiautonomous capability greatly reduces the workload burden of the operator.

In addition to providing military assistance, the hopping capabilities of the robots could be used in law enforcement, homeland security, search and rescue applications in challenging terrain and in planetary exploration, Salton said.

“We are delighted to win this project and get a chance to work with Sandia on such a novel and potentially useful robot,” said Marc Raibert, president and founder of Boston Dynamics. “The program gives us a chance to apply our special brand of advanced controls and stabilization to a system that can help our warfighters in the near future.”

(Photo: Randy Montoya)

Sandia National Laboratories

STUDY OF ISOLATED SNAKES COULD HELP SHED LIGHT ON VENOM COMPOSITION

While studying a way to more safely and effectively collect snake venom, University of Florida researchers have noticed that the venom delivered by an isolated population of Florida cottonmouth snakes may be changing in response to their diet.

Scientists used a portable nerve stimulator to extract venom from anesthetized cottonmouths, producing more consistent extraction results and greater amounts of venom, according to findings published in August in the journal Toxicon.

The study of venoms is important for many reasons, scientists say.

“The human and animal health benefits include understanding the components of venom that cause injury and developing better antivenin,” said Darryl Heard, an associate professor in the UF College of Veterinary Medicine’s department of small animal clinical sciences. “In addition, the venom components have the potential to be used for diagnostic tests and the development of new medical compounds.”

But in addition to showing the extraction method is safer, more effective and less stressful to both snake and handler than the traditional “milking” technique, Heard and Ryan McCleary, a Ph.D. candidate in biology in UF’s College of Liberal Arts and Sciences, discovered the venom from these particular snakes differs from that of mainland snakes, likely because of their unique diet of dead fish dropped by seabirds.

Heard and McCleary collaborated to develop a safe, reliable and humane technique for collecting venom from cottonmouths as part of a larger study on a specific population of snakes that reside on Seahorse Key, an isolated island near Cedar Key on Florida’s Gulf Coast.

The venom collection study included data from 49 snakes on Seahorse Key.

“Snakes on this island are noted for their large size,” said Heard, a zoological medicine veterinarian with additional expertise in anesthesia. He added that Harvey Lillywhite, a professor of biology at UF and McCleary’s predoctoral adviser, has confirmed that cottonmouths on Seahorse Key eat primarily dead fish dropped by birds in a large seabird rookery.

Lillywhite also directs UF’s Seahorse Key Marine Laboratory, located in the Cedar Keys National Wildlife Refuge. McCleary hopes to build on earlier studies about the snakes’ ecology and to explore whether evolutionary changes may have affected the composition of the snakes’ venom.

“My interest is in the evolutionary aspect,” McCleary said. “If these snakes already have an abundant source of dead prey, why do they need venom?”

Preliminary findings show some differences in venom components, he added.

Traditionally, venom has been collected from venomous snakes by manually restraining the animal behind the head and having it bite a rubber membrane connected to a collecting chamber.

“This requires the capture of an awake snake, which increases the risk of human envenomation and is also stressful to the snake,” Heard said, adding that manual collection of venom also does not guarantee that all of the venom is collected.

(Photo: Sarah Kiewel/University of Florida)

CLIMATE CHANGE MEANS NORTHERN HIGH LATITUDES WILL RECEIVE LESS ULTRAVIOLET RADIATION

Physicists at the University of Toronto have discovered that changes in the Earth's ozone layer due to climate change will reduce the amount of ultraviolet (UV) radiation in northern high latitude regions such as Siberia, Scandinavia and northern Canada. Other regions of the Earth, such as the tropics and Antarctica, will instead face increasing levels of UV radiation.

"Climate change is an established fact, but scientists are only just beginning to understand its regional manifestations," says Michaela Hegglin, a postdoctoral fellow in the Department of Physics and the lead author of the study published this month in Nature Geoscience.

Using a sophisticated computer model, Hegglin and U of T physicist Theodore Shepherd determined that 21st-century climate change will alter atmospheric circulation, increasing the flux of ozone from the upper to the lower atmosphere and shifting the distribution of ozone within the upper atmosphere. The result will be a change in the amount of UV radiation reaching the Earth's surface which varies dramatically between regions: e.g., up to a 20 per cent increase in UV radiation over southern high latitudes during spring and summer and a nine per cent decrease in UV radiation over northern high latitudes by the end of the century.

While the effects of increased UV have been widely studied because of the problem of ozone depletion, decreased UV could have adverse effects too, e.g., on vitamin D production for people in regions with limited sunlight such as the northern high latitudes.

"Both human and ecosystem health are affected by air quality and by UV radiation," said Shepherd. "While there has been much research on the impact of climate change on air quality, our work shows that this research needs to include the effect of changes in stratospheric ozone. And while there has been much research on the impact of ozone depletion on UV radiation and its impacts on human and ecosystem health, the notion that climate change could also affect UV radiation has not previously been considered. This adds to the list of potential impacts of climate change, and is especially important for Canada as northern high latitudes are particularly affected."

University of Toronto

ROME WAS BUILT IN A DAY, WITH HUNDREDS OF THOUSANDS OF DIGITAL PHOTOS

The ancient city of Rome was not built in a day. It took nearly a decade to build the Colosseum, and almost a century to construct St. Peter's Basilica. But now the city, including these landmarks, can be digitized in just a matter of hours.

A new computer algorithm developed at the University of Washington uses hundreds of thousands of tourist photos to automatically reconstruct an entire city in about a day.

The tool is the most recent in a series developed at the UW to harness the increasingly large digital photo collections available on photo-sharing Web sites. The digital Rome was built from 150,000 tourist photos tagged with the word "Rome" or "Roma" that were downloaded from the popular photo-sharing Web site, Flickr.

Computers analyzed each image and in 21 hours combined them to create a 3-D digital model. With this model a viewer can fly around Rome's landmarks, from the Trevi Fountain to the Pantheon to the inside of the Sistine Chapel.

"How to match these massive collections of images to each other was a challenge," said Sameer Agarwal, a UW acting assistant professor of computer science and engineering and lead author of a paper being presented in October at the International Conference on Computer Vision in Kyoto, Japan. Until now, he said, "even if we had all the hardware we could get our hands on and then some, a reconstruction using this many photos would take forever."

Earlier versions of the UW photo-stitching technology are known as Photo Tourism. That technology was licensed in 2006 to Microsoft, which now offers it as a free tool called Photosynth.

"With Photosynth and Photo Tourism, we basically reconstruct individual landmarks. Here we're trying to reconstruct entire cities," said co-author Noah Snavely, who developed Photo Tourism as his UW doctoral work and is now an assistant professor at Cornell University.

Other co-authors of the new paper are Rick Szeliski of Microsoft Research, UW computer science professor Steve Seitz and UW graduate student Ian Simon.

In addition to Rome, the team recreated the Croatian coastal city of Dubrovnik, processing 60,000 images in less than 23 hours using a cluster of 350 computers, and Venice, Italy, processing 250,000 images in 65 hours using a cluster of 500 computers. Many historians see Venice as a candidate for digital preservation before water does more damage to the city, the researchers said.

Transitioning from landmarks to cities -- going from hundreds of photos to hundreds of thousands of photos -- is not trivial. Previous versions of the Photo Tourism software matched each photo to every other photo in the set. But as the number of photos increases, the number of matches explodes, increasing with the square of the number of photos. A set of 250,000 images would take at least a year for 500 computers to process, Agarwal said. A million photos would take more than a decade.

The newly developed code works more than a hundred times faster than the previous version. It first identifies which photos are likely to match and then concentrates the detailed matching on those pairs. The code also uses parallel processing techniques, allowing it to run simultaneously on many computers, or even on remote servers connected through the Internet.
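
The difference between testing every pair and pruning to likely candidates can be sketched in a few lines of Python. The snippet below is illustrative only and does not reflect the UW implementation: photos are represented by hypothetical coarse "visual words", an inverted index proposes candidate pairs, and only those candidates receive the (stand-in) expensive comparison whose count would otherwise grow with the square of the collection size.

# Illustrative all-pairs vs. candidate-pruned matching (hypothetical data; not the UW code).
from itertools import combinations

# Hypothetical photos, each tagged with coarse "visual words" it contains.
photos = {
    "img_001": {"dome", "column"},
    "img_002": {"dome", "fountain"},
    "img_003": {"fountain", "statue"},
    "img_004": {"column", "dome"},
}

def expensive_match(a, b):
    # Stand-in for costly pairwise feature matching between two photos.
    return len(photos[a] & photos[b]) >= 2

# Naive approach: compare every pair, which grows with the square of the collection.
all_pairs = list(combinations(photos, 2))

# Pruned approach: an inverted index proposes only pairs sharing a visual word.
index = {}
for name, words in photos.items():
    for w in words:
        index.setdefault(w, []).append(name)
candidates = {tuple(sorted(p)) for names in index.values()
              for p in combinations(names, 2)}

matches = [p for p in candidates if expensive_match(*p)]
print(len(all_pairs), "naive comparisons vs.", len(candidates), "pruned candidates")
print("matched pairs:", matches)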

The new, faster code makes it possible to tackle more ambitious projects.

"If a city reconstruction took several months, it would be just about building Rome," Seitz said. "But on a timeline of one day you can methodically start going through all the cities and start building models of them."

This technique could create online maps that offer viewers a virtual-reality experience. The software could build cities for video games automatically, instead of doing so by hand. It also might be used in architecture for digital preservation of cities, or integrated with online maps, Seitz said.

In the near term, the "Rome in a Day" code could be used with Photo Tourism, Photosynth or other software designed to view the model output.

(Photo: University of Washington)

University of Washington

TRANSITION FROM EGG-LAYING TO LIVE BIRTH

A new analysis of extinct sea creatures suggests that the transition from egg-laying to live-born young opened up evolutionary pathways that allowed these ancient species to adapt to and thrive in open oceans.

The evolutionary sleuthing is described in the journal Nature by scientists at Harvard University and the University of Reading who also report that the evolution of live-born young depended crucially on the advent of genes — rather than incubation temperature — as the primary determinant of offspring sex.

Having drawn this link in three lineages of extinct marine reptiles — mosasaurs, sauropterygians, and ichthyosaurs — the scientists say that genetic, or chromosomal, sex determination may have played a surprisingly strong role in adaptive radiations and the colonization of the world’s oceans by a diverse array of species.

“Determining sex with genetic mechanisms allowed marine reptiles to give live birth, in the water, as opposed to laying eggs on a nesting beach,” says Chris Organ, a research fellow in Harvard’s Department of Organismic and Evolutionary Biology. “This freed these species from the need to move and nest on land. As a consequence, extreme physical adaptations evolved in each group, such as the fluked tails, dorsal fins, and the winglike limbs of ichthyosaurs.”

Mosasaurs, sauropterygians, and ichthyosaurs invaded the Mesozoic seas between 251 million and 100 million years ago. All three groups of extinct marine reptiles breathed air, but evolved other adaptations to life in the open ocean, such as fin-shaped limbs, streamlined bodies, and changes in bone structure. Some evolved into enormous predators, such as porpoiselike ichthyosaurs that grew to more than 20 meters in length. Ichthyosaurs, and possibly mosasaurs, even evolved tail-first birth, an adaptation that helps modern whales and porpoises avoid drowning during birth.

“Losing the requirement of dry land during the life cycle of ichthyosaurs and other marine reptiles freed them to lead a completely aquatic existence, a shift that seems advantageous in light of the diversification that followed,” says Daniel E. Janes, a research associate in Harvard’s Department of Organismic and Evolutionary Biology.

Even though populations of most animals have males and females, the way sex is determined in offspring varies. Some animals rely primarily on sex chromosomes, as in humans where two X chromosomes make a female and an X and a Y chromosome make a male. Among living marine species, whales, porpoises, manatees, and sea snakes have chromosomal sex determination.

In sea turtles and saltwater crocodiles, on the other hand, the sex of offspring is generally determined by the temperature at which eggs incubate. These species are also bound to a semiterrestrial existence because their gas-exchanging hard-shelled eggs must be deposited on land.

“No one has clearly understood how sex determination has co-evolved with live birth and egg laying,” Organ says.

Organ, Janes, and colleagues show that the evolution of live birth in a species depends on the prior evolution of genetic sex determination. Because the fossilized remains of pregnant mosasaurs, sauropterygians, and ichthyosaurs show that these species gave birth to live young, the researchers conclude that they must also have employed genetic sex determination, a trait on which the fossil record itself is silent.

(Photo: Mark Sloan/HMNH)

Harvard University

DAILY BATHROOM SHOWERS MAY DELIVER FACE FULL OF PATHOGENS

While daily bathroom showers provide invigorating relief and a good cleansing for millions of Americans, they also can deliver a face full of potentially pathogenic bacteria, according to a surprising new University of Colorado at Boulder study.

The researchers used high-tech instruments and lab methods to analyze roughly 50 showerheads from nine cities in seven states that included New York City, Chicago and Denver. They concluded about 30 percent of the devices harbored significant levels of Mycobacterium avium, a pathogen linked to pulmonary disease that most often infects people with compromised immune systems but which can occasionally infect healthy people, said CU-Boulder Distinguished Professor Norman Pace, lead study author.

It's not surprising to find pathogens in municipal waters, said Pace. But the CU-Boulder researchers found that some M. avium and related pathogens were clumped together in slimy "biofilms" that clung to the inside of showerheads at more than 100 times the "background" levels of municipal water. "If you are getting a face full of water when you first turn your shower on, that means you are probably getting a particularly high load of Mycobacterium avium, which may not be too healthy," he said.

The study appeared in the Sept. 14 online edition of the Proceedings of the National Academy of Sciences. Co-authors of the study included CU-Boulder researchers Leah Feazel, Laura Baumgartner, Kristen Peterson and Daniel Frank and University of Colorado Denver pediatrics department Associate Professor Kirk Harris. The study is part of a larger effort by Pace and his colleagues to assess the microbiology of indoor environments and was supported by the Alfred P. Sloan Foundation.

Research at National Jewish Hospital in Denver indicates that increases in pulmonary infections in the United States in recent decades from so-called "non-tuberculosis" mycobacteria species like M. avium may be linked to people taking more showers and fewer baths, said Pace. Water spurting from showerheads can distribute pathogen-filled droplets that suspend themselves in the air and can easily be inhaled into the deepest parts of the lungs, he said.

Symptoms of pulmonary disease caused by M. avium can include tiredness, a persistent, dry cough, shortness of breath, weakness and "generally feeling bad," said Pace. Immune-compromised people like pregnant women, the elderly and those who are fighting off other diseases are more prone to experience such symptoms, said Pace, a professor in the molecular, cellular and developmental biology department.

The CU-Boulder researchers sampled showerheads in homes, apartment buildings and public places in New York, Illinois, Colorado, Tennessee and North Dakota.

Although scientists have tried cell culturing to test for showerhead pathogens, the technique is unable to detect 99.9 percent of bacteria species present in any given environment, said Pace. A molecular genetics technique developed by Pace in the 1990s allowed researchers to swab samples directly from the showerheads, isolate DNA, amplify it using the polymerase chain reaction, or PCR, and determine the sequences of genes present in order to pinpoint particular pathogen types.
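
The last step of that workflow, matching amplified gene sequences against known signatures, can be sketched crudely in Python. The snippet below is a toy stand-in with made-up sequences, not the actual analysis: it scores short hypothetical fragments against a small reference table by simple per-position identity, whereas real studies use full 16S rRNA gene sequences and dedicated alignment and classification software.

# Toy sequence-identification step (made-up sequences; not the actual pipeline).
reference = {
    "Mycobacterium avium":    "ACGGTTACCGTA",
    "Mycobacterium gordonae": "ACGGTAACCGGA",
    "Escherichia coli":       "TTGGCATCCGTA",
}

def identity(a, b):
    # Fraction of positions that agree between two equal-length sequences.
    return sum(x == y for x, y in zip(a, b)) / len(a)

def classify(read):
    # Return the reference organism with the highest identity to the read.
    return max(reference, key=lambda name: identity(read, reference[name]))

sample_reads = ["ACGGTTACCGTA", "ACGGTAACCGGT", "TTGGCATCCGTT"]
for read in sample_reads:
    print(read, "->", classify(read))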

"There have been some precedents for concern regarding pathogens and showerheads," said Pace. "But until this study we did not know just how much concern."

During the early stages of the study, the CU team tested showerheads from smaller towns and cities, many of which were using well water rather than municipal water. "We were starting to conclude that pathogen levels we detected in the showerheads were pretty boring," said Feazel, first author on the study. "Then we worked up the New York data and saw a lot of M. avium. It completely reinvigorated the study."

In addition to the showerhead swabbing technique, Feazel took several individual showerheads, broke them into tiny pieces, coated them with gold, used a fluorescent dye to stain the surfaces and used a scanning electron microscope to look at the surfaces in detail. "Once we started analyzing the big metropolitan data, it suddenly became a huge story to us," said Feazel, who began working in Pace's lab as an undergraduate.

In Denver, one showerhead in the study with high loads of the pathogen Mycobacterium gordonae was cleaned with a bleach solution in an attempt to eradicate it, said Pace. Tests on the showerhead several months later showed the bleach treatment ironically caused a three-fold increase in M. gordonae, indicating a general resistance of mycobacteria species to chlorine.

Previous studies by Pace and his group found massive enrichments of M. avium in "soap scum" commonly found on vinyl shower curtains and floating above the water surface of warm therapy pools. A 2006 therapy pool study led by Pace and CU-Boulder Professor Mark Hernandez showed high levels of M. avium in the indoor pool environment were linked to a pneumonia-like pulmonary condition in pool attendants known as "lifeguard lung," leading the CU team into the showerhead study, said Pace.

Additional studies under way by Pace's team include analyses of air in New York subways, hospital waiting rooms, office buildings and homeless shelters. Indoor air typically has about 1 million bacteria per cubic meter and municipal tap water has roughly 10 million bacteria per cubic meter, said Pace.

So is it dangerous to take showers? "Probably not, if your immune system is not compromised in some way," said Pace. "But it's like anything else -- there is a risk associated with it." Pace said since plastic showerheads appear to "load up" with more pathogen-enriched biofilms, metal showerheads may be a good alternative.

"There are lessons to be learned here in terms of how we handle and monitor water," said Pace. "Water monitoring in this country is frankly archaic. The tools now exist to monitor it far more accurately and far less expensively that what is routinely being done today."

In 2001 the National Academy of Sciences awarded Pace the Selman A. Waksman Award -- considered the nation's highest award in microbiology -- for pioneering the molecular genetic techniques he now uses to rapidly detect, identify and classify microbe species using nucleic acid technology without the need for lab cultivation. That same year he was awarded a MacArthur Foundation "genius grant" for his work.

University of Colorado

Monday, September 28, 2009

PROMISING LINK BETWEEN WARMTH, BETTER MOODS PROBED BY CU-BOULDER SCIENTIST

The University of Colorado at Boulder scientist who discovered that playing in the dirt might ease depression is probing the link between higher temperatures and elevated mood.

Christopher Lowry sees relationships between both lines of inquiry -- researching the link between the immune system and the neurotransmitter serotonin and probing the link between temperature and serotonin.

The upshot is potentially significant. Understanding these mechanisms might help scientists craft better treatments for depression and other mood disorders, he says.

Lowry, an assistant professor of integrative physiology, believes the area of research is promising. So does the National Science Foundation, which recently granted Lowry a $500,000 Faculty Early Career Development Award, a prestigious honor also called the CAREER Award, to continue his study of the role of temperature in mood.

"Whether lying on the beach in the midday sun on a Caribbean island, grabbing a few minutes in the sauna or spa after work or sitting in a hot bath or Jacuzzi in the evening, we often associate feeling warm with a sense of relaxation and well-being," Lowry writes in a recent edition of the Journal of Psychopharmacology.

"Intuitively, we all understand that temperature affects our mood," Lowry said. But a link has not been clearly defined. "So that's what we're going after."

Virtually all antidepressant drugs activate the serotonin system. Lowry's research group noted studies from the 1970s showing that warming a small piece of skin in rats caused increased activity in an area of the brain with serotonin-producing neurons. "So then we had a potential pathway," he said.

Lowry's lab has been a world leader in demonstrating that there are different subpopulations of serotonin-producing neurons, some associated with anxiety, others with panic, immune activation and antidepressant-like effects.

And while scientists know that serotonin is related to mood, appetite and aggression, they don't know exactly how the substance is involved. The same is true of antidepressants such as Prozac and other selective serotonin reuptake inhibitors.

"It's a complete black box how these drugs work, which I think many people might find surprising," Lowry says. "We think that if we understood what makes these serotonin neurons different from other neurons that we would then be in a position to develop rational new therapies for treatment of depression."

Several clues suggest a connection between temperature and mood, he says. People who are depressed often experience altered temperature cycles. Virtually all antidepressants can cause sweating, a thermoregulatory cooling mechanism typically triggered when a person gets warm.

This system may be activated by exercise. When you exercise, body temperatures rise, and you sweat. "That very likely involves some of the mechanisms that we're studying," Lowry says.

Several studies have shown that regular exercise has an antidepressant effect. "So they have studied exercise, but they haven't studied temperature change, which is a component of exercise."

Serotonin neurons can be activated by warm temperature externally, via the skin, and internally. The calming effect of body warmth seems to occur only up until the temperature becomes hazardous, around 104 degrees Fahrenheit. "So we think there's a link between the system that cools the body and a sense of relaxation."

University of Colorado at Boulder

ELECTRONIC NOSE SNIFFS OUT TOXINS

Imagine a polka-dotted postage stamp-sized sensor that can sniff out some known poisonous gases and toxins and show the results simply by changing colors.

Support for the development and application of this electronic nose comes from the National Institute of Environmental Health Sciences, part of the National Institutes of Health. The new technology is discussed in this month's issue of Nature Chemistry and exemplifies the types of sensors that are being developed as part of the NIH Genes, Environment and Health Initiative (GEI) (http://www.gei.nih.gov/index.asp).

Once fully developed, the sensor could be useful in detecting high exposures to toxic industrial chemicals that pose serious health risks in the workplace or through accidental exposure. While physicists have radiation badges to protect them in the workplace, chemists and workers who handle chemicals do not have equivalent devices to monitor their exposure to potentially toxic chemicals. The investigators hope to be able to market the wearable sensor within a few years.

"The project fits into the overall goal of a component of the GEI Exposure Biology Program that the NIEHS has the lead on, which is to develop technologies to monitor and better understand how environmental exposures affect disease risk," said NIEHS Director Linda Birnbaum, Ph.D. "This paper brings us one step closer to having a small wearable sensor that can detect multiple airborne toxins."

The paper's senior author is Kenneth S. Suslick, Ph.D., the M.T. Schmidt Professor of Chemistry at the University of Illinois at Urbana-Champaign. Suslick and his colleagues have created what they refer to as an optoelectronic nose, an artificial nose for the detection of toxic industrial chemicals (TICs) that is simple, fast, inexpensive, and works by visualizing colors.

"We have a disposable 36-dye sensor array that changes colors when exposed to different chemicals. The pattern of the color change is a unique molecular fingerprint for any toxic gas and also tells us its concentration," said Suslick. "By comparing that pattern to a library of color fingerprints, we can identify and quantify the TICs in a matter of seconds."

The researchers say older methods relied on sensors whose response originates from weak and highly non-specific chemical interactions, whereas this new technology is more responsive to a diverse set of chemicals. The power of this sensor to identify so many volatile toxins stems from the increased range of interactions that are used to discriminate the response of the array.
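
Comparing a measured color-change pattern against a library of fingerprints, as Suslick describes, amounts to a nearest-neighbor lookup. The Python sketch below uses invented numbers and only four dye spots rather than the real 36, and is not the actual sensor-analysis software; it simply identifies an unknown reading by the smallest Euclidean distance to a stored fingerprint.

# Illustrative fingerprint lookup (assumed data; not the actual sensor software).
import numpy as np

# Hypothetical library: each chemical's color change across four dye spots
# (the real array uses 36 spots and full before/after color differences).
library = {
    "ammonia":        np.array([0.8, 0.1, 0.0, 0.3]),
    "chlorine":       np.array([0.1, 0.9, 0.4, 0.0]),
    "sulfur dioxide": np.array([0.0, 0.2, 0.7, 0.6]),
}

def identify(measured):
    # Return the library chemical whose fingerprint is closest to the measurement.
    return min(library, key=lambda name: np.linalg.norm(measured - library[name]))

reading = np.array([0.12, 0.85, 0.38, 0.05])  # a noisy, chlorine-like measurement
print(identify(reading))  # -> chlorine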

To test the application of their color sensor array, the researchers chose 19 representative toxic industrial chemicals, including ammonia, chlorine, nitric acid and sulfur dioxide, at concentrations known to be immediately dangerous to life or health. The arrays were exposed to the chemicals for two minutes. Most of the chemicals were identified from the array color change within seconds, and almost 90 percent of them were detected within two minutes.

The laboratory studies used inexpensive flatbed scanners for imaging. The researchers have developed a fully functional prototype handheld device that uses inexpensive white LED illumination and an ordinary camera, which will make the device smaller and the whole scanning process more sensitive, faster and even less expensive. It will be similar to a card scanning device.

"One of the nice things about this technology is that it uses components that are readily available and relatively inexpensive," said David Balshaw, Ph.D., a program administrator at the NIEHS. "Given the broad range of chemicals that can be detected and the high sensitivity of the array to those compounds, it appears that this device will be particularly useful in occupational settings."

(Photo: NIH)

The National Institutes of Health

SPACE-RELATED RADIATION RESEARCH COULD HELP REDUCE FRACTURES IN CANCER SURVIVORS

A research project looking for ways to reduce bone loss in astronauts may yield methods of improving the bone health of cancer patients undergoing radiation treatment.

It is well documented that living in the microgravity environment of space causes bone loss in astronauts, but until recently, little was known about the effects of space radiation on bones. Dr. Ted Bateman leads a project funded by the National Space Biomedical Research Institute (NSBRI) to understand radiation-induced bone loss and to determine which treatments can be used to reduce that loss and lower the risk of fractures.

“Our studies indicate significant bone loss at the radiation levels astronauts will experience during long missions to the moon or Mars,” said Bateman, a member of NSBRI’s Musculoskeletal Alterations Team.

Bateman, an associate professor of bioengineering at Clemson University, and colleagues at Clemson and Loma Linda University have discovered in experiments with mice that bone loss begins within days of radiation exposure through activation of bone-reducing cells called osteoclasts. Under normal conditions, these cells work with bone-building cells, called osteoblasts, to maintain bone health.

“Our research challenges some conventional thought by saying radiation turns on the bone-eating osteoclasts,” Bateman said. “If that is indeed the case, existing treatments, such as bisphosphonates, may be able to prevent this early loss of bone.”

Bisphosphonates are used to prevent loss of bone mass in patients who have osteoporosis or other bone disorders.

Even though the research is being performed to protect the health of NASA astronauts, cancer patients, especially those who receive radiation therapy in the pelvic region, could benefit from the research.

“We know that older women receiving radiotherapy to treat pelvic tumors are particularly vulnerable to fracture, with hip fracture rates increasing 65 percent to 200 percent in these cancer patients,” said Bateman. “Hip fractures are very serious; nearly one in four patients who fracture a hip will not survive a year. A large number of surviving patients will require long-term care. More than 80 percent of the patients will not be able to walk unaided or will not be back to pre-fracture activity levels after a year.”

Once a person loses bone, their long-term fracture risk depends on their ability to recover lost bone mass. For older cancer patients, early introduction of bisphosphonates and other forms of treatment could help greatly since the process of regaining bone mass can be more difficult due to lower activity levels.

Clemson’s Dr. Jeff Willey is a collaborator with Bateman and the lead investigator of an NSBRI-funded project looking at the cellular mechanisms involved in radiation-induced bone loss. He said the bone loss in the spaceflight-related experiments has occurred quickly and cell physiology has changed.

“If we expose mice to a relatively low dose of radiation, the cells that break down bone are turned on several days after exposure,” he said. “After radiation exposure, osteoclasts appear to have a different shape. They get flatter, and there are certainly more of them.”

The mice used in the research have received the amount of radiation exposure that is expected to occur during a lengthy mission to the moon or Mars. The amount is much less than what cancer patients receive during treatment. For example, patients receiving radiation treatment in the pelvic region can receive doses up to 80 gray over a six- to eight-week period, with the hip receiving up to 25 gray. Astronauts are likely to receive about 0.5 to 1 gray during a long-duration lunar or martian mission.

Astronauts are at risk of radiation exposure from two sources. The first is proton radiation from the sun. The second, and less understood, type is galactic cosmic radiation, which comes from sources outside the solar system. Galactic cosmic rays and protons would be the source of radiation damage for astronauts during a mission to Mars.

Marcelo Vazquez, NSBRI’s senior scientist for space radiation research, said Bateman’s project and other NSBRI radiation projects will influence spacecraft design and mission planning. “The research will help to define the radiation risks for astronauts during long-term missions,” Vazquez said. “This will lead to strategies for shielding and medical countermeasures to protect against exposure.”

Bateman’s NSBRI work is leading to other studies. “We have been able to initiate a couple of clinical trials with cancer patients to determine if what we are seeing in mice corresponds with bone loss in humans. Preliminary results in these trials show rapid declines in bone mass and strength,” Bateman said.

(Photo: Patrick Wright/Clemson University)

National Space Biomedical Research Institute

NEW NASA TEMPERATURE MAPS PROVIDE 'WHOLE NEW WAY OF SEEING THE MOON'

NASA's Lunar Reconnaissance Orbiter (LRO), an unmanned mission to comprehensively map the entire moon, has returned its first data. One of the seven instruments aboard, the Diviner Lunar Radiometer Experiment, is making the first global survey of the temperature of the lunar surface while the spacecraft orbits some 31 miles above the moon.

Diviner has obtained enough data already to characterize many aspects of the moon's current thermal environment. The instrument has revealed richly detailed thermal behavior, throughout both the north and south polar regions, that extends to the limit of Diviner's spatial resolution of just a few hundred yards.

"Most notable are the measurements of extremely cold temperatures within the permanently shadowed regions of large polar impact craters in the south polar region," said David Paige, Diviner's principal investigator and a UCLA professor of planetary science. "Diviner has recorded minimum daytime brightness temperatures in portions of these craters of less than -397 degrees Fahrenheit. These super-cold brightness temperatures are, to our knowledge, among the lowest that have been measured anywhere in the solar system, including the surface of Pluto."

"After decades of speculation, Diviner has given us the first confirmation that these strange, permanently dark and extremely cold places actually exist on our moon," said science team member Ashwin Vasavada of NASA's Jet Propulsion Laboratory in Pasadena, Calif. "Their presence greatly increases the likelihood that water or other compounds are frozen there. Diviner has lived up to its name."

These observations, made during Diviner's "commissioning phase," provide a snapshot in time of current polar temperatures that will evolve with the lunar seasons.

"It is safe to conclude that the temperatures in these super-cold regions are definitely low enough to cold-trap water ice, as well as other more volatile compounds, for extended periods," Paige said. "The existence of such cold traps has been predicted theoretically for almost 50 years. Diviner is now providing detailed information regarding their spatial distribution and temperatures."

Diviner's thermal observations represent one component of the LRO's strategy for determining the nature and distribution of cold-trapped water ice in the lunar polar regions. Future comparisons between Diviner data, physical models and other polar data sets may provide important scientific conclusions regarding the nature and history of the moon's polar cold traps and any cold-trapped volatile materials they contain, Paige said.

The moon's surface temperatures are among the most extreme of any planetary body in the solar system. Noontime surface temperatures near the lunar equator are hotter than boiling water, while nighttime surface temperatures on the moon are almost as cold as liquid oxygen. It has been estimated that near the lunar poles, in areas that never receive direct sunlight, temperatures can dip to within a few tens of degrees of absolute zero.

Data accumulated by Diviner during August and the first half of September indicate that equatorial and mid-latitude daytime temperatures are 224 degrees Fahrenheit, and then decrease sharply poleward of 70 degrees north latitude. Equatorial and mid-latitude nighttime temperatures are -298 degrees Fahrenheit, and then decrease poleward of 80 degrees north latitude. At low and mid-latitudes, there are isolated warmer regions with nighttime temperatures of -208 degrees Fahrenheit.

"These correspond to the locations of larger, fresh impact craters that have excavated rocky material that remains significantly warmer than the surrounding lunar soil throughout the long lunar night," Paige said.

The thermal behavior at high latitudes contrasts sharply with that of the equatorial and mid-latitudes. Close to the poles, both daytime and nighttime temperatures are strongly influenced by local topography, and the thermal outlines of many partially illuminated impact craters are apparent.

"Getting a look at the first global thermal maps of the lunar surface is a whole new way of seeing the moon," Paige said.

NASA's LRO launched June 18. Diviner has been mapping the moon continuously during the LRO commissioning phase. Since the instrument was first activated on July 5, it has acquired more than 8 billion calibrated radiometric measurements and has mapped almost 50 percent of the surface area of the moon.

"The performance of the instrument has been excellent, and closely matches our predictions," said instrument engineer Marc Foote of JPL.

"We have already accumulated an enormous amount of high-quality data," Paige said.

There are large gaps between Diviner's individual ground tracks at the equator, but in the polar regions, the ground tracks overlap to create continuous high-resolution maps. During the commissioning phase, the plane of the LRO orbit moved from 5:40 to 1:10 a.m. and p.m., on the night side and day side, respectively. It will take about six months for LRO’s orbit to sample the full range of lunar local times.

In addition to mapping the moon, Diviner executed a series of specialized calibration sequences during the commissioning phase. These included scans of the "limb," or visible edge, of the moon to better define the instrument's fields of view, an infrared panorama of a portion of the LRO spacecraft, and infrared scans of the Earth from lunar orbit, which are presently being analyzed.

"Diviner has been put through her paces and has executed our commands brilliantly," said JPL scientist and lead observational sequence designer Benjamin Greenhagen. "Diviner's operations have run very smoothly."

During the course of LRO's mapping mission, Diviner will map the entire surface of the moon at high resolution to create the first global picture of the current thermal state of the moon and its daily and seasonal variability.

The moon's extreme temperature environment is of interest to future human and robotic explorers, especially if they plan to visit the moon for extended periods. Detailed thermal maps of the moon can also yield information regarding the locations of rocky areas that may be hazardous to landing vehicles and details for mapping compositional variations in lunar rocks and soils. In the moon's polar regions, temperature maps also point to the locations of cold traps where water ice and other volatile materials may have accumulated. Mapping the locations of these lunar cold traps and searching for the presence of frozen water are among the main goals of the LRO mission.

Diviner is operated by the California Institute of Technology Jet Propulsion Laboratory (JPL), which designed and built the instrument.

Diviner determines the temperature of the moon by measuring the intensity of infrared radiation emitted by the lunar surface. The hotter the surface, the greater the intensity of emitted infrared radiation. Diviner measures infrared radiation in seven infrared channels that cover an enormous wavelength range, from 7.6 to 400 microns. Diviner is the first instrument designed to measure the full range of lunar surface temperatures, from the hottest to the coldest. Diviner also includes two solar channels (channels 1 and 2) that measure the intensity of reflected solar radiation.
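
The relationship Diviner exploits, hotter surfaces emitting more intense infrared radiation, follows the Planck function, which can be inverted to convert a measured spectral radiance into a brightness temperature. The Python sketch below is a generic single-wavelength calculation with assumed values, not Diviner's calibration pipeline, which works with band-integrated radiances in each of its channels.

# Generic brightness-temperature inversion of the Planck function
# (assumed wavelength and temperatures; not Diviner's calibration pipeline).
import math

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_radiance(temperature, wavelength):
    # Spectral radiance (W m^-2 sr^-1 m^-1) of a blackbody at the given temperature (K).
    return (2.0 * H * C**2 / wavelength**5 /
            (math.exp(H * C / (wavelength * KB * temperature)) - 1.0))

def brightness_temperature(radiance, wavelength):
    # Temperature (K) of a blackbody that would emit the measured spectral radiance.
    return (H * C / (wavelength * KB)) / math.log(
        1.0 + 2.0 * H * C**2 / (wavelength**5 * radiance))

wavelength = 25e-6  # 25 microns, within Diviner's 7.6-to-400-micron range
for t_true in (40.0, 100.0, 390.0):  # roughly cold-trap to lunar-noon temperatures
    radiance = planck_radiance(t_true, wavelength)
    print(t_true, "K ->", round(brightness_temperature(radiance, wavelength), 1), "K")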

(Photo: NASA/GSFC/UCLA)

UCLA

ARCTIC SEA ICE REACHES MINIMUM EXTENT FOR 2009, THIRD LOWEST EVER RECORDED

The Arctic sea ice cover appears to have reached its minimum extent for the year, the third-lowest recorded since satellites began measuring sea ice extent in 1979, according to the University of Colorado at Boulder's National Snow and Ice Data Center.

While this year's September minimum extent was greater than each of the past two record-setting and near-record-setting low years, it is still significantly below the long-term average and well outside the range of natural climate variability, said NSIDC Research Scientist Walt Meier. Most scientists believe the shrinking Arctic sea ice is tied to warming temperatures caused by an increase in human-produced greenhouse gases being pumped into Earth's atmosphere.

Atmospheric circulation patterns helped the Arctic sea ice spread out in August to prevent another record-setting minimum, said Meier. But most of the 2009 September Arctic sea ice is thin first- or second-year ice, rather than thicker, multi-year ice that used to dominate the region, said Meier.

The minimum 2009 sea-ice extent is still about 620,000 square miles below the average minimum extent measured between 1979 and 2000 -- an area nearly equal to the size of Alaska, said Meier. "We are still seeing a downward trend that appears to be heading toward ice-free Arctic summers," Meier said.

CU-Boulder's NSIDC will provide more detailed information in early October with a full analysis of the 2009 Arctic ice conditions, including aspects of the melt season and conditions heading into the winter ice-growth season. The report will include graphics comparing 2009 to the long-term Arctic sea-ice record.

University of Colorado

IMPACT OF RENEWABLE ENERGY ON OUR OCEANS MUST BE INVESTIGATED

Scientists from the Universities of Exeter and Plymouth are calling for urgent research to understand the impact of renewable energy developments on marine life. The study, now published in the Journal of Applied Ecology, highlights potential environmental benefits and threats resulting from marine renewable energy, such as off-shore wind farms and wave and tidal energy conversion devices.

The research highlights the capacity for marine renewable energy devices to boost local biodiversity and benefit the wider marine environment. Man-made structures on the sea bed attract many marine organisms and sometimes become 'artificial reefs', for example, supporting a wide variety of fish. The study also points out that such devices could have negative environmental impacts, resulting from habitat loss, collision risks, noise and electromagnetic fields.

The study highlights the gaps in our understanding of the effects of marine renewable energy devices on the health of our oceans. The team calls for more research to improve our understanding of these threats and opportunities. The researchers also stress the importance of considering the impact on marine life when selecting locations for the installation of marine energy devices.

Corresponding author Dr Brendan Godley of the University of Exeter said: "Marine renewable energy is hugely exciting and it is vital that we explore the potential for it to provide a clean and sustainable energy source. However, to date research into the impact of marine renewable energy on sea life has been very limited. Our study highlights the urgent need for more research into the impacts of marine renewable energy on marine life. This will involve biologists, engineers and policy-makers working together to ensure we really understand the risks and opportunities for marine life."

Professor Martin Attrill, Director of the University of Plymouth Marine Institute said: "Our paper highlights the need to take a fresh look at the effect marine renewable energy generation has on the environment if we are to deliver a higher proportion of energy from renewable sources and start to combat climate change. We need to have the industry working directly with conservation bodies to plan the next phase of development. We suggest further research could demonstrate the potential of security zones around, for example, wave farms to act as Marine Protected Areas. Therefore, if all stakeholders can work together in a coordinated way we can possibly address two key issues - combating climate change and creating a network of MPAs. We need the research on environmental impact to help move the whole field forward."

(Photo: Dr Matthew Witt, University of Exeter)

Universities of Exeter and Plymouth

Friday, September 25, 2009

SECRETS OF INSECT FLIGHT REVEALED

0 comentarios

Researchers are one step closer to creating a micro-aircraft that flies with the manoeuvrability and energy efficiency of an insect after decoding the aerodynamic secrets of insect flight.

Dr John Young, from the University of New South Wales (UNSW) in Australia, and a team of animal flight researchers from Oxford University's Department of Zoology, used high-speed digital video cameras to film locusts in action in a wind tunnel, capturing how the shape of a locust's wing changes in flight. They used that information to create a computer model which recreates the airflow and thrust generated by the complex flapping movement.

The breakthrough result, published in the journal Science, means engineers understand for the first time the aerodynamic secrets of one of Nature's most efficient flyers – information vital to the creation of miniature robot flyers for use in situations such as search and rescue, military applications and inspecting hazardous environments.

"The so-called `bumblebee paradox' claiming that insects defy the laws of aerodynamics, is dead. Modern aerodynamics really can accurately model insect flight," said Dr Young, a lecturer in the School of Aerospace, Civil and Mechanical Engineering at the Australian Defence Force Academy (UNSW@ADFA).

"Biological systems have been optimised through evolutionary pressures over millions of years, and offer many examples of performance that far outstrips what we can achieve artificially.

"An insect's delicately structured wings, with their twists and curves, and ridged and wrinkled surfaces, are about as far away as you can get from the streamlined wing of an aircraft," Dr Young said.

"Until very recently it hasn't been possible to measure the actual shape of an insect's wings in flight – partly because their wings flap so fast, and partly because their shape is so complicated.

"Locusts are an interesting insect for engineers to study because of their ability to fly extremely long distances on very limited energy reserves."

Once the computer model of the locust wing movement was perfected, the researchers ran modified simulations to find out why the wing structure was so complex.

In one test they removed the wrinkles and curves but left the twist, while in the second test they replaced the wings with rigid flat plates. The results showed that the simplified models produced lift but were much less efficient, requiring much more power for flight.

"The message for engineers working to build insect-like micro-air vehicles is that the high lift of insect wings may be relatively easy to achieve, but that if the aim is to achieve efficiency of the sort that enables inter-continental flight in locusts, then the details of deforming wing design are critical," Dr Young said.

(Photo: Animal Flight Group, Oxford University/John Young, UNSW@ADFA)

University of New South Wales

RESEARCHERS MAKE RARE METEORITE FIND USING NEW CAMERA NETWORK IN AUSTRALIAN DESERT

0 comentarios

Researchers have discovered an unusual kind of meteorite in the Western Australian desert and have uncovered where in the Solar System it came from, in a very rare finding published in the journal Science.

Meteorites are the only surviving physical record of the formation of our Solar System and by analysing them researchers can glean valuable information about the conditions that existed when the early Solar System was being formed. However, information about where individual meteorites originated, and how they were moving around the Solar System prior to falling to Earth, is available for only a dozen of around 1100 documented meteorite falls over the past two hundred years.

Dr Phil Bland, the lead author of today's study from the Department of Earth Science and Engineering at Imperial College London, said: "We are incredibly excited about our new finding. Meteorites are the most analysed rocks on Earth but it's really rare for us to be able to tell where they came from. Trying to interpret what happened in the early Solar System without knowing where meteorites are from is like trying to interpret the geology of Britain from random rocks dumped in your back yard."

The new meteorite, which is about the size of a cricket ball, is the first to be retrieved since researchers from Imperial College London, Ondrejov Observatory in the Czech Republic, and the Western Australian Museum set up a trial network of cameras in the Nullarbor Desert in Western Australia in 2006.

The researchers aim to use these cameras to find new meteorites, and work out where in the Solar System they came from, by tracking the fireballs that they form in the sky. The new meteorite was found on the first day of searching using the new network, by the first search expedition, within 100m of the predicted site of the fall. This is the first time a meteorite fall has been predicted using only the data from dedicated instruments.

The meteorite appears to have been following an unusual orbit, or path around the Sun, prior to falling to Earth in July 2007, according to the researchers' calculations. The team believes that it started out as part of an asteroid in the innermost main asteroid belt between Mars and Jupiter. It then gradually evolved into an orbit around the Sun that was very similar to Earth's. The other meteorites that researchers have data for follow orbits that take them back, deep into the main asteroid belt.

The new meteorite is also unusual because it is composed of a rare type of basaltic igneous rock. The researchers say that its composition, together with the data about where the meteorite comes from, fits with a recent theory about how the building blocks for the terrestrial planets were formed. This theory suggests that the igneous parent asteroids for meteorites like today's formed deep in the inner Solar System, before being scattered out into the main asteroid belt. Asteroids are widely believed to be the building blocks for planets like the Earth so today's finding provides another clue about the origins of the Solar System.

The researchers are hopeful that their new desert network could yield many more findings, following the success of their first meteorite search.

Dr Bland added: "We're not the first team to set up a network of cameras to track fireballs, but other teams have encountered problems because meteorites are small rocks and they're hard to find in vegetated areas. Our solution was quite simple - build a fireball network in a place where it's easy to find them. The Nullarbor Desert is ideal because there's very little vegetation and dark rocks show up really easily on the light desert plain.

"It was amazing to find a meteorite that we could track back to its origin in the asteroid belt on our first expedition using our small trial network. We're cautiously optimistic that this find could be the first of many and if that happens, each find may give us more clues about how the Solar System began," said Dr Bland.

The researchers' network of cameras takes a single time-lapse picture every night to record any fireballs in the sky. When a meteorite falls, researchers can then use complex calculations to uncover what orbit the meteorite was following and where the meteorite is likely to have landed, so that they can retrieve it.
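
A heavily simplified, two-dimensional Python sketch of that locating step, intersecting the lines of sight from two camera stations, appears below. The station positions, bearings and helper function are invented for illustration; the real analysis reconstructs full three-dimensional trajectories with far more detailed physics.

    import math

    # Toy 2D version of locating a fall position from two camera stations.
    # Each station reports a compass bearing (degrees clockwise from north)
    # toward the point on the ground beneath the end of the fireball.
    # All coordinates and bearings here are hypothetical.
    def intersect(p1, bearing1_deg, p2, bearing2_deg):
        """Intersect two bearing lines; returns (x, y) in km on a flat plane."""
        d1 = (math.sin(math.radians(bearing1_deg)), math.cos(math.radians(bearing1_deg)))
        d2 = (math.sin(math.radians(bearing2_deg)), math.cos(math.radians(bearing2_deg)))
        det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])   # Cramer's rule for p1 + t*d1 = p2 + s*d2
        if abs(det) < 1e-9:
            raise ValueError("bearings are parallel; no unique intersection")
        rx, ry = p2[0] - p1[0], p2[1] - p1[1]
        t = (rx * (-d2[1]) - ry * (-d2[0])) / det
        return (p1[0] + t * d1[0], p1[1] + t * d1[1])

    # Two hypothetical stations 100 km apart, each sighting the same fall point.
    station_a, station_b = (0.0, 0.0), (100.0, 0.0)
    fall_x, fall_y = intersect(station_a, 45.0, station_b, 315.0)
    print(f"Estimated fall position: ({fall_x:.1f} km, {fall_y:.1f} km)")   # (50.0 km, 50.0 km)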

(Photo: ICL)

Imperial College London

A TINY TYRANNOSAUR

0 comentarios

When you think of Tyrannosaurus rex, a small set of striking physical traits comes to mind: an oversized skull with powerful jaws, tiny forearms, and the muscular hind legs of a runner. But researchers have just unearthed a much smaller tyrannosauroid in China, no more than three meters long, that displays all the same features – and it predates T. rex by tens of millions of years.

This finding, published online by the journal Science at the Science Express website on September 17, means that such specialized physical features did not evolve as the prehistoric predators grew in size. Instead, they were present for feeding efficiency at all sizes of the dinosaurs during their reign in the Cretaceous Period.

Paul Sereno from the University of Chicago and National Geographic explorer-in-residence, along with colleagues, studied the new, small-bodied fossil, naming it Raptorex kriegsteini, and estimated that it was a young adult when it died. They examined the skull, teeth, nose, spine, shoulders, forearms, pelvis, and hind legs of the new fossil, comparing the features to larger evolutionary versions of tyrannosauroid dinosaurs.

"First, we used the best mechanical preparation of the specimen possible, which entails the finest needles and air abrasives under a microscope," Sereno said in an email interview. "Then we made molds and casts of the cranial bones, assembled a cast skull, and sent that skull through a CT scanner at the University of Chicago hospital to get the snout cross-section… We used silicone on the skull roof to cast the forebrain of R. kriegsteini… Finally, I made a thin-section from one femur, or thigh bone, for microscopic examination, and determined that the individual had lived to be five or six years old."

The researchers conclude that the "predatory skeletal design" of R. kriegsteini was simply scaled up with little modification in its carnivorous descendants, whose body masses eventually grew 90 times greater.

Sereno and his colleagues also use this new fossil to propose and describe three major morphological stages in the evolutionary history of tyrannosauroid dinosaurs.

(Photo: Todd Marshall)

FOR CARNIVOROUS PLANTS, SLOW BUT STEADY WINS THE RACE

0 comentarios

Like the man-eating plant in Little Shop of Horrors, carnivorous plants rely on animal prey for sustenance. Fortunately for humans, carnivorous plants found in nature are not dependent on a diet of human blood but rather are satisfied with the occasional fly or other insect. The existence of carnivorous plants has fascinated botanists and non-botanists alike for centuries and raises the question, "Why are some plants carnivorous?"

A recent article by Drs. Jim Karagatzides and Aaron Ellison in the September issue of the American Journal of Botany (www.amjbot.org/cgi/content/full/96/9/1612) addresses this question. As Ellison stated, "The general answer to this is that in environments that have few nutrients (such as bogs, where we study carnivorous plants), carnivory allows these plants to capture nutrients 'on the wing'. But if it's so good to be a carnivorous plant in these kinds of environments, why aren't there more carnivorous plants? Knowing how much it 'costs' a carnivorous plant to make a trap is a key piece of information needed to understand why there aren't more carnivorous plants."

Ellison and Karagatzides simultaneously measured both costs and benefits for traps, leaves, roots, and rhizomes of 15 different carnivorous plant species, including pitcher plants and the Venus fly trap. By measuring the construction cost of carbon needed to create these plant structures and comparing it to the payback time—the amount of time the structure takes to photosynthesize to recoup the carbon used in its construction—Ellison and Karagatzides were able to determine how beneficial a trap might be to a plant.
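
The payback calculation itself is a simple ratio: the carbon invested in building a structure divided by the net rate at which that structure fixes new carbon. The short Python sketch below illustrates the idea with invented numbers; the measured costs and photosynthetic rates for the 15 species are reported in the paper.

    # Illustration of the construction-cost / payback-time idea.
    # All numbers below are hypothetical, not values from the study.
    def payback_days(construction_cost_g_c, net_photosynthesis_g_c_per_day):
        """Days of net photosynthesis needed to recoup the carbon used to build a structure."""
        return construction_cost_g_c / net_photosynthesis_g_c_per_day

    # A hypothetical trap is cheaper to build than a leaf but photosynthesises far more
    # slowly, so it still takes longer to pay for itself.
    leaf = payback_days(construction_cost_g_c=1.5, net_photosynthesis_g_c_per_day=0.05)
    trap = payback_days(construction_cost_g_c=1.0, net_photosynthesis_g_c_per_day=0.01)

    print(f"Hypothetical leaf pays for itself in about {leaf:.0f} days")   # ~30 days
    print(f"Hypothetical trap pays for itself in about {trap:.0f} days")   # ~100 days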

Contrary to expectations, the average cost to create a trap was actually significantly lower than the cost to create a leaf. Ellison said, "The most interesting result is that carnivorous traps are 'cheap' to make (at least compared with leaves). Models of the evolution of carnivory in plants have suggested that traps should be 'expensive' structures—they take a lot of carbon and nutrients to make, and so only when they can't recover these costs in any other way should carnivory be adaptive (or evolutionarily favored). But because carnivorous plants have very low rates of photosynthesis, it still takes a very long time for the plants to 'pay' for them (by accumulating new carbon through photosynthesis)."

Understanding how carbon and mineral nutrients are allocated among different plant organs, different species, and vegetation of different biomes is one of the major goals of the field of plant ecology. Carnivorous plants are a model organism to study these carbon and mineral nutrient tradeoffs because the plants inhabit open environments where light and water are not limiting but nutrients are in extremely short supply, and therefore it is relatively easy to separate out experimentally the effects of nutrient limitation from effects of limitation of light or water.

Ellison and Karagatzides's findings advance our understanding of how complete food webs function. Ellison stated, "Nicholas Gotelli [from the University of Vermont] and I, along with our students and colleagues, have spent more than 10 years developing this micro-ecosystem as a model for how complete food webs—including the plant as both producer and habitat, and the aquatic food web that lives in the pitchers as both detritivore and mutualist—and aquatic ecosystems actually work. These studies have provided new insights into population dynamics and extinction, the importance of food webs for persistence of top predators, and now how organisms allocate nutrients to better control and modulate energy and nutrient fluxes across ecosystem boundaries."

American Journal of Botany

FAKE VIDEO DRAMATICALLY ALTERS EYEWITNESS ACCOUNTS

0 comentarios

Dr Kimberley Wade, Associate Professor in the Department of Psychology at the University of Warwick, led an experiment to see whether exposure to fabricated footage of an event could induce individuals to accuse another person of doing something they never did.

In the study, published in Applied Cognitive Psychology, Dr Wade found that almost 50% of people shown fake footage of an event they witnessed first hand were prepared to believe the video version rather than what they actually saw.

Dr Wade’s research team filmed 60 subjects as they took part in a computerised gambling task. The subjects were unknowingly seated next to a member of the research team as they both separately answered a series of multiple-choice general knowledge questions.

All subjects were given a pile of fake money to gamble with and they shared a pile of money that represented the bank. Their task was to earn as much money as possible by typing in an amount of money to gamble on the chances of them answering each question correctly. They were told the person who made the highest profit would win a prize.

When they answered each question, subjects saw either a green tick on their computer monitor to show their answer was correct, or a red cross to show it was incorrect. If the answer was wrong, they would be told to return the money to the bank.

After the session, the video footage was doctored to make it look as if the member of the research team seated next to the subject was cheating by not putting money back into the bank.

One third of the subjects were told that the person seated next to them was suspected of cheating. Another third were told the person had been caught on camera cheating, and the remaining group were actually shown the fake video footage. All subjects were then asked to sign a statement only if they had seen the cheating take place.

Nearly 40% of the participants who had seen the doctored video complied. Another 10% of the group signed when asked a second time by the researchers. Only 10% of those who were told the incident had been caught on film but were not shown the video agreed to sign, and about 5% of the control group who were just told about the cheating signed the statement.
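
With 60 participants split evenly across the three conditions, those percentages correspond to only a handful of people per group. The small Python calculation below uses hypothetical head counts chosen to be consistent with the reported figures, not counts taken from the paper:

    # Hypothetical head counts (20 per condition) consistent with the reported percentages.
    conditions = {
        "shown the doctored video":      {"signed": 8, "n": 20},   # ~40%
        "told footage existed":          {"signed": 2, "n": 20},   # ~10%
        "only told about the cheating":  {"signed": 1, "n": 20},   # ~5%
    }
    for name, group in conditions.items():
        print(f"{name}: {group['signed'] / group['n']:.0%} signed the statement")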

Dr Wade said: “Over the previous decade we have seen rapid advances in digital-manipulation technology. As a result, almost anyone can create convincing, yet fake, images or video footage. Our research shows that if fake footage is extremely compelling, it can induce people to testify about something they never witnessed.”

University of Warwick

BIOLOGISTS DISCOVER 'DEATH STENCH' IS A UNIVERSAL ANCIENT WARNING SIGNAL

0 comentarios

Biologists have discovered that the smell of recent death or injury, which repels the living relatives of dead insects, is a truly ancient signal that helps animals avoid disease and predators.

David Rollo, professor of biology at McMaster University, found that corpses of animals, from insects to crustaceans, all emit the same death stench produced by a blend of specific fatty acids.

The findings have been published in the journal Evolutionary Biology.

Rollo and his team made the discovery while they were studying the social behavior of cockroaches. When a cockroach finds a good place to live it marks the site with pheromone odours that attract others. In trying to identify the precise chemicals involved, Rollo extracted body juices from dead cockroaches.

"It was amazing to find that the cockroaches avoided places treated with these extracts like the plague," says Rollo. "Naturally, we wanted to identify what chemical was making them all go away."

The team eventually identified the specific chemicals that signaled death. Furthermore, they found that the same fatty acids not only signaled death in ants, caterpillars, and cockroaches, but were equally effective in terrestrial woodlice and pill bugs, which are not insects but crustaceans related to crayfish and lobsters.

Because insects and crustaceans diverged more than 400 million years ago, it is likely that most subsequent species recognize their dead in a similar way, that the origin of such signals was even older, and that such behaviour initially arose in aquatic environments (few crustaceans are terrestrial).

"Recognizing and avoiding the dead could reduce the chances of catching the disease, or allow you to get away with just enough exposure to activate your immunity," says Rollo. Likewise, he adds, release of fatty acids from dismembered body parts could provide a strong warning that a nasty predator was nearby.

"As explained in our study, fatty acids—oleic or linoleic acids—are reliably and quickly released from the cells following death. Evolution appears to have favoured such clues because they were reliably associated with demise, and avoiding contagion and predation are rather critical to survival."

The generality and strength of the phenomenon, coupled with the fact that the fatty acids are essential nutrients rather than pesticides, holds real promise for applications such as plant and stored product protection or exclusion of household pests.

McMaster University

Thursday, September 24, 2009

GRAPHITIC MEMORY TECHNIQUES ADVANCE AT RICE

0 comentarios

Advances by the Rice University lab of James Tour have brought graphite’s potential as a mass data storage medium a step closer to reality and opened the way for reprogrammable gate arrays that could bring about a revolution in integrated circuit logic design.

In a paper published in the online journal ACS Nano, Tour and postdoctoral associate Alexander Sinitskii show how they've used industry-standard lithographic techniques to deposit 10-nanometer stripes of amorphous graphite, the carbon-based, semiconducting material commonly found in pencils, onto silicon. This facilitates the creation of potentially very dense, very stable nonvolatile memory for all kinds of digital devices.

With backing from a major manufacturer of memory chips, Tour and his team have pushed the technology forward in several ways since a paper that appeared last November first described two-terminal graphitic memory. While noting advances in other molecular computing techniques that involve nanotubes or quantum dots, he said none of those have yet proved practical in terms of fabrication.

Not so with this simple-to-deposit graphite. "We're using chemical vapor deposition and lithography -- techniques the industry understands," said Tour, Rice's Chao Professor of Chemistry and a professor of mechanical engineering and materials science and of computer science. "That makes this a good alternative to our previous carbon-coated nanocable devices, which perform well but are very difficult to manufacture."

Graphite makes a good, reliable memory "bit" for reasons that aren't yet fully understood. The lab found that running a current through a 10-atom-thick layer of graphite creates a complete break in the circuit -- literally, a gap in the strip a couple of nanometers wide. Another jolt repairs the break. The process appears to be indefinitely repeatable, which provides addressable ones and zeroes, just like today's flash memory devices but at a much denser scale.

Graphite's other advantages were detailed in Tour's earlier work: the ability to operate with as little as three volts, an astoundingly high on/off ratio (the difference in current flow between a circuit's on and off states) and the need for only two terminals instead of three, which eliminates a lot of circuitry. It also withstands a wide temperature range and radiation, making it suitable for deployment in space and for military uses where exposure to temperature extremes and radiation is a concern.

Tour's graphite-forming technique is well-suited for other applications in the semiconductor industry. One result of the previous paper is a partnership between the Tour group and NuPGA (for "new programmable gate arrays"), a California company formed around the research to create a new breed of reprogrammable gate arrays that could make the design of all kinds of computer chips easier and cheaper.

The Tour lab and NuPGA, led by industry veteran Zvi Or-Bach (founder of eASIC and Chip Express), have applied for a patent based on vertical arrays of graphite embedded in "vias," the holes in integrated circuits connecting the different layers of circuitry. When current is applied to a graphite-filled via, the graphite alternately splits and repairs itself (a process also described in the latest paper), just like it does in strip form. Essentially, it becomes an "antifuse," the basic element of one type of field programmable gate array (FPGA), best described as a blank computer chip that uses software to rewire the hardware.

Currently, antifuse FPGAs can be programmed once. But this graphite approach could allow for the creation of FPGAs that can be reprogrammed at will. Or-Bach said graphite-based FPGAs would start out as blanks, with the graphite elements split. Programmers could "heal" the antifuses at will by applying a voltage, and split them with an even higher voltage.
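
A toy Python model of that program-and-heal cycle is sketched below, purely to illustrate the two-threshold idea; the class, the voltage values and the behaviour are invented for illustration and are not device specifications from the paper.

    # Toy model of a reprogrammable graphitic antifuse: a lower "heal" voltage
    # closes the link, a higher "split" voltage reopens it. Thresholds are invented.
    class GraphiticAntifuse:
        HEAL_VOLTAGE = 3.0    # hypothetical volts needed to repair the nanoscale gap
        SPLIT_VOLTAGE = 5.0   # hypothetical, higher voltage needed to break it again

        def __init__(self):
            self.connected = False   # chips would ship as blanks, with every antifuse split

        def apply(self, volts):
            """Apply a programming pulse and return the resulting connection state."""
            if volts >= self.SPLIT_VOLTAGE:
                self.connected = False
            elif volts >= self.HEAL_VOLTAGE:
                self.connected = True
            return self.connected

    fuse = GraphiticAntifuse()
    print(fuse.apply(3.5))   # True: healed, the via now conducts
    print(fuse.apply(5.5))   # False: split again, ready to be reprogrammed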

Such a device would be mighty handy to computer-chip designers, who now spend many millions to create the photolithography mask sets used in chip fabrication. If the design fails, it's back to square one.

"As a result of that, people are only hesitantly investing in new chip designs," said Tour. "They stick with the old chip designs and make modifications. FPGAs are chips that have no specific ability, but you use software to program them by interconnecting the circuitry in different ways." That way, he said, fabricators don’t need expensive mask sets to try new designs.

"The No. 1 problem in the industry, and one that gives an opportunity for a company like ours, is that the cost of masks keeps moving up as people push semiconductors into future generators," said Or-Bach. "Over the last 10 years, the cost of a mask set has multiplied almost 10 times.

"If we can really make something that will be an order of magnitude better, the markets will be happy to make use of it. That's our challenge, and I believe the technology makes it possible for us to do that."

Rice University

SCIENTISTS DISCOVER HUNGER'S TIMEKEEPER

0 comentarios

Researchers at Columbia and Rockefeller Universities have identified cells in the stomach that regulate the release of a hormone associated with appetite. The group is the first to show that these cells, which release a hormone called ghrelin, are controlled by a circadian clock that is set by mealtime patterns. The finding, published in Proceedings of the National Academy of Sciences, has implications for the treatment of obesity and is a landmark in the decades-long search for the timekeepers of hunger.

The scientists, led by Rae Silver, head of Columbia’s Laboratory of Neurobiology and Behavior and Helene L. and Mark N. Kaplan Professor at Barnard College, showed that ghrelin’s release whets the appetite of mice, spurring them to actively search for and consume food, even when they are not hungry. In addition to Silver, the researchers involved in the study include Barnard College senior research scientist Joseph LeSauter and collaborator Donald Pfaff at The Rockefeller University.

“Circadian clocks allow animals to anticipate daily events rather than just react to them,” said LeSauter, who ran and supervised the study’s experiments. “The cells that produce ghrelin have circadian clocks that presumably synchronize the anticipation of food with metabolic cycles.”

According to previous studies, people given ghrelin injections feel voraciously hungry and eat more at a buffet than they otherwise would. The new research suggests that the stomach tells the brain when to eat and that establishing a regular schedule of meals will regulate the stomach’s release of ghrelin. “If you eat all the time, ghrelin secretion will not be well controlled,” said Silver, the paper’s lead author and the principal investigator of the study. “It’s a good thing to eat meals at a regularly scheduled time of day.”

The scientists show that stomach cells in mice release ghrelin into the general circulation before meal time. The hormone triggers a flurry of food seeking behavior associated with hunger and stimulates eating.

LeSauter studied genetically engineered mice lacking the ghrelin-recognizing receptor and compared them with normal mice on identical feeding schedules. He found that the mice lacking the ghrelin receptor began to forage for food much later and to a lesser extent than their normal counterparts.

Pfaff believes ghrelin, which travels from the stomach through the bloodstream to the brain, influences a decision-making process in brain cells. These brain cells are constantly deciding whether or not to eat, and as mealtime draws near, the presence of ghrelin increases the proportion of “yes” decisions.

The research underscores that ghrelin, the only known natural appetite stimulant made outside the brain, is a promising target for drug developers. Unlike drugs that focus on satiety, those that target ghrelin could help curb appetite before dieters take their first bite.

(Photo: Rae Silver, Joseph LeSauter and Donald Pfaff)

Columbia University

TORNADO THREAT INCREASES AS GULF HURRICANES GET LARGER

0 comentarios

Tornadoes spawned by hurricanes moving inland from the Gulf Coast are increasing in frequency, according to researchers at the Georgia Institute of Technology. This increase seems to reflect the increase in size and frequency of large hurricanes that make landfall from the Gulf of Mexico. The findings can be found in Geophysical Research Letters online and in print in the September 3, 2009 issue.

“As the size of landfalling hurricanes from the Gulf of Mexico increases, we’re seeing more tornadoes than we did in the past that can occur up to two days and several hundred miles inland from the landfall location,” said James Belanger, doctoral student in the School of Earth and Atmospheric Sciences at Georgia Tech and lead author of the paper.

Currently, it’s well known that when hurricanes hit land, there’s a risk that tornadoes may form in the area. Until now, no one has quantified that risk because observations of tornadoes were too sporadic prior to the installation of the NEXRAD Doppler Radar Network in 1995. Belanger, along with co-authors Judith Curry, professor and chair of the School of Earth and Atmospheric Sciences at Tech, and research scientist Carlos Hoyos, decided to see if they could create a model using the more reliable tornado record that’s existed since 1995.

The model that they developed for hurricane-induced tornadoes uses four factors that serve as good predictors of tornado activity: size, intensity, track direction and whether there’s a strong gradient of moisture at midlevels in the storm's environment.

“The size of a tropical cyclone basically sets the domain over which tornadoes can form. So a larger storm that has more exposure over land has a higher propensity for producing tornadoes than a smaller one, on average,” said Belanger.

The team looked at 127 tropical cyclones from 1948 up to the 2008 hurricane season and went further back to 1920, modifying their model to account for the type of data collected at that time. They found that since 1995 there has been a 35 percent increase in the size of tropical cyclones from the Gulf compared to the previous active period of storms from 1948-1964, which has led to a doubling in the number of tornadoes produced per storm. The number of hurricane-induced tornadoes during the 2004 and 2005 hurricane seasons is unprecedented in the historical record since 1920, according to the model.

“The beauty of the model is that not only can we use it to reconstruct the observational record, but we can also use it as a forecasting tool,” said Belanger.

To test how well it predicted the number of tornadoes associated with a given hurricane, they input the intensity of the storm at landfall, its size, track and moisture at mid-levels, and were able to generate a forecast of how many tornadoes formed from the hurricane. They found that for Hurricane Ike in 2008, their model predicted exactly the number of tornadoes that occurred, 33. For Hurricane Katrina in 2005, the model predicted 56 tornadoes, and 58 were observed.
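
The article does not give the model's functional form, so the Python sketch below only illustrates the general shape of such a forecast: a count model driven by the four predictors, with invented coefficients. The function name and every number are placeholders, not the authors' fitted values.

    import math

    # Placeholder sketch of a hurricane-tornado count forecast built on the four
    # predictors named in the article. Every coefficient here is invented.
    def forecast_tornado_count(size_km, intensity_kt, favourable_track, moisture_gradient):
        log_rate = (-2.0
                    + 0.005 * size_km             # larger storms expose more land to tornado formation
                    + 0.015 * intensity_kt        # stronger winds at landfall
                    + 0.300 * favourable_track    # 1 if the track direction favours tornadoes, else 0
                    + 0.500 * moisture_gradient)  # strength of the mid-level moisture gradient, 0 to 1
        return math.exp(log_rate)                 # expected number of tornadoes

    # Hypothetical storm: 700 km across, 100-knot winds at landfall.
    print(round(forecast_tornado_count(700, 100, 1, 0.8)))   # about 40 with these made-up coefficients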

The team’s next steps are to examine how hurricane size, not just intensity (as indicated by the Saffir-Simpson scale), affects the damage experienced by residents.

“Storm surge, rain and flooding are all connected to the size of the storm,” said Curry. “Yet, size is an underappreciated factor associated with damage from hurricanes. So it’s important to develop a better understanding of what controls hurricane size and how size influences hurricane damage. The great damage in Galveston from Hurricane Ike in 2008 was inconsistent with Category 2 wind speeds at landfall, but it was the large size that caused the big storm surge that did most of the damage.”

(Photo: NOAA)

Georgia Institute of Technology

SCIENTISTS DISCOVER NEW GENETIC VARIATION THAT CONTRIBUTES TO DIABETES

0 comentarios

Scientists have identified a genetic variation in people with type 2 diabetes that affects how the body's muscle cells respond to the hormone insulin, in a new study published in Nature Genetics. The researchers, from Imperial College London and other international institutions, say the findings highlight a new target for scientists developing treatments for diabetes.

Previous studies have identified several genetic variations in people with type 2 diabetes that affect how insulin is produced in the pancreas. Today's study shows for the first time a genetic variation that seems to impair the ability of the body's muscle cells to use insulin to help them make energy.

People with type 2 diabetes can have problems with the body not producing enough insulin and with cells in the muscles, liver and fat becoming resistant to it. Without sufficient insulin, or if cells cannot use insulin properly, cells are unable to take glucose from the blood and turn it into energy. Until now, scientists had not been able to identify the genetic factors contributing to insulin resistance in type 2 diabetes.

In the new research, scientists from international institutions including Imperial College London, McGill University, Canada, CNRS, France, and the University of Copenhagen, Denmark, looked for genetic markers in over 14,000 people and identified four variations associated with type 2 diabetes. One of these was located near a gene called IRS1, which makes a protein that tells the cell to start taking in glucose from the blood when it is activated by insulin. The researchers believe that the variant they have identified interrupts this process, impairing the cells' ability to make energy from glucose. The researchers hope that scientists will be able to target this process to produce new treatments for type 2 diabetes.

Professor Philippe Froguel, one of the corresponding authors of today's study from the Department of Genomic Medicine at Imperial College London, said: "We are very excited about these results - this is the first genetic evidence that a defect in the way insulin works in muscles can contribute to diabetes. Muscle tissue needs to make more energy using glucose than other tissues. We think developing a treatment for diabetes that improves the way insulin works in the muscle could really help people with type 2 diabetes.

"It is now clear that several drugs should be used together to control this disease. Our new study provides scientists developing treatments with a straightforward target for a new drug to treat type 2 diabetes," added Professor Froguel.

The researchers carried out a multistage association study to identify the new gene. First, they looked at genome-wide association data from 1,376 French individuals and identified 16,360 single-nucleotide polymorphisms (SNPs), or genetic variations, associated with type 2 diabetes. The researchers then studied these variations in 4,977 French individuals.

Next, the team selected the 28 most strongly associated SNPs and looked for them in 7,698 Danish individuals. Finally, the researchers identified four SNPs strongly associated with type 2 diabetes. The most significant of these variations was located near the insulin receptor substrate 1, or IRS1, gene.
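
In outline, that multistage design is a progressive filter: a large pool of candidate SNPs is tested in one cohort, the strongest associations are carried forward, and only those that remain significant in independent cohorts survive. The schematic Python sketch below illustrates such a filter; the SNP names, thresholds and p-values are placeholders, not data from the study.

    # Schematic multistage SNP filter (placeholder thresholds and p-values, not the study's).
    # Each stage keeps only the SNPs whose association p-value clears the cut-off.
    def filter_stage(candidates, pvalues, threshold):
        return [snp for snp in candidates if pvalues.get(snp, 1.0) < threshold]

    # Hypothetical p-values for three illustrative SNPs in two follow-up cohorts.
    stage2_p = {"rs_candidate_1": 1e-6, "rs_candidate_2": 0.02, "rs_candidate_3": 3e-5}
    stage3_p = {"rs_candidate_1": 5e-9, "rs_candidate_3": 0.40}

    candidates = ["rs_candidate_1", "rs_candidate_2", "rs_candidate_3"]   # initial genome-wide hits
    candidates = filter_stage(candidates, stage2_p, threshold=1e-4)       # first replication cohort
    candidates = filter_stage(candidates, stage3_p, threshold=1e-7)       # second replication cohort
    print(candidates)   # ['rs_candidate_1'] survives every stage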

To test their findings, the team analysed biopsies of skeletal muscle from Danish twins, one of whom had type 2 diabetes. They found that the twin with diabetes had the variation near IRS1 and this variation resulted in insulin resistance in the muscle. They also noted that the variation affected the amount of protein produced by the gene IRS1, suggesting that the SNP controls the IRS1 gene.

(Photo: ICL)

Imperial College London
