Thursday, August 26, 2010



“Love stinks!” the J. Geils Band told the world in 1980, and while you can certainly argue whether or not this tender and ineffable spirit of affection has a downside, working hard to find it does. It may even shorten your life.

A new study shows that ratios between males and females affect human longevity. Men who reach sexual maturity in a context in which they far outnumber women live, on average, three months less than men whose competition for a mate isn’t as stiff. The steeper the gender ratio (also known as the operational sex ratio), the sharper the decline in life span.

“At first blush, a quarter of a year may not seem like much, but it is comparable to the effects of, say, taking a daily aspirin, or engaging in moderate exercise,” says Nicholas Christakis, senior author on the study and professor of medicine and medical sociology at Harvard Medical School as well as professor of sociology at Harvard University’s Faculty of Arts and Sciences. “A 65-year-old man is typically expected to live another 15.4 years. Removing three months from this block of time is significant.”
These results are published in the August issue of the journal Demography.
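As a back-of-envelope check (ours, not the study's), the three-month penalty can be compared directly against the 15.4 years of expected remaining life at age 65:

```python
# Back-of-envelope check: what fraction of a 65-year-old man's expected
# remaining life does a three-month reduction represent?
remaining_years = 15.4   # expected remaining life at age 65 (from the article)
loss_years = 3 / 12      # the roughly three-month penalty

fraction_lost = loss_years / remaining_years
print(f"fraction of remaining life lost: {fraction_lost:.1%}")  # → 1.6%
```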

An association between gender ratios and longevity had been established through studies of animals before, but never in humans. To search for a link in people, Christakis collaborated with researchers from the Chinese University of Hong Kong, the University of Wisconsin, and Northwestern University. The researchers looked at two distinct datasets.

First, they examined information from the Wisconsin Longitudinal Study, a long-term project involving individuals who graduated from Wisconsin high schools in 1957. The researchers calculated the gender ratios of each high school graduating class, then ascertained how long the graduates went on to live. After adjusting for a multitude of factors, they discovered that, 50 years later, men from classes with an excess of boys did not live as long as men whose classes were gender-balanced. By one measurement, mortality for a 65-year-old who had experienced a steeper sex ratio decades earlier as a teenager was 1.6 percent higher than for a man who hadn’t faced such stiff competition for female attention.

Next, the research team compared Medicare claims data with census data for a complete national sample of more than 7 million men throughout the United States and arrived at similar results. (For technical reasons, the study was unable to evaluate results for women who outnumbered men at sexual maturity.)

Much attention has been paid to the deleterious social effects of gender imbalances in countries such as China and India, where selective abortion, internal migration, and other factors have in some areas resulted in men outnumbering women by up to 20 percent. Such an environment, already associated with a marked increase in violence and human trafficking, appears to shorten life as well.

The researchers have not investigated mechanisms that might account for this phenomenon, but Christakis suspects that it arises from a combination of social and biological factors. After all, finding a mate can be stressful, and stress as a contributor to health disorders has been well-documented.

Says Christakis, “We literally come to embody the social world around us, and what could be more social than the dynamics of sexual competition?”

(Photo: Kristyn Ulanday/Harvard Staff Photographer)

Harvard University


A team led by Boston University College of Arts & Sciences Professor of Psychology Michael Hasselmo has won a $1.5 million grant from the Department of Defense's Office of Naval Research to study rat brains to learn how to help military robots navigate.

The team will develop biologically inspired algorithms for robotic navigation based on recent data on grid cells recorded in the entorhinal cortex of the rat. In contrast to robots, rodents are highly effective at exploring an environment and returning to rewarding locations. This behavior may depend on neural activity selective to location, including the activity of recently discovered grid cells in the entorhinal cortex.

An active area of robotics research concerns the ability of a robot to navigate toward selected goals in the environment, and the capacity for a human operator to communicate with a robot about locations and goals. This includes the requirement that a robot learn a representation of the environment during exploration while accurately recognizing its own location, a problem termed simultaneous localization and mapping (SLAM).

Grid cells are neurons recorded as a rat explores an environment. The cells respond in an array of locations that can be described as the vertices of tightly packed equilateral triangles, or as a hexagonal grid. Recent models have shown how grid cells can code location based on self-motion information provided by neurons that code head direction or running speed, and have shown how grid cells could arise from oscillations in the entorhinal cortex. Recent imaging data indicates that grid cells may exist in the human cortex.
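As an illustrative sketch (ours, not the researchers' actual algorithm), the hexagonal firing map described above can be generated by summing three cosine gratings whose orientations differ by 60 degrees, a construction used in several published grid-cell models; the function name and spacing parameter here are our own:

```python
import numpy as np

# Illustrative grid-cell firing map: sum three cosine gratings at 60-degree
# orientation offsets, then rectify. Peaks fall on a hexagonal lattice.
def grid_cell_rate(x, y, spacing=0.5, phase=(0.0, 0.0)):
    k = 4 * np.pi / (np.sqrt(3) * spacing)  # wave number for the chosen grid spacing
    rate = 0.0
    for theta in (0.0, np.pi / 3, 2 * np.pi / 3):  # three orientations, 60 degrees apart
        kx, ky = k * np.cos(theta), k * np.sin(theta)
        rate += np.cos(kx * (x - phase[0]) + ky * (y - phase[1]))
    return np.maximum(rate, 0.0)  # rectify: firing rates cannot be negative

# Evaluate over a 1 m x 1 m environment
xs, ys = np.meshgrid(np.linspace(0, 1, 100), np.linspace(0, 1, 100))
rates = grid_cell_rate(xs, ys)
```

Plotting `rates` as an image shows the characteristic array of firing fields at the vertices of tightly packed equilateral triangles.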

The researchers on this grant will further develop the models based on biological data and use them to guide the development of algorithms for robotic navigation, and for communication of information about spatial location between human operators and robots.

Boston University



Human impact is causing lower oxygen and higher carbon dioxide levels in coastal water bodies. Increased levels of carbon dioxide make the water more acidic, with dramatic effects on the lifestyles of the wildlife that call these regions home. The problems are expected to worsen if steps aren’t taken to reduce greenhouse gas emissions and minimize nutrient-rich run-off from developed areas along our coastlines.

The ocean is filled with a soup of bacteria and viruses. The animals living in these environments are constantly under assault by pathogens and need to be able to mount an immune response to protect themselves from infection, especially if they have an injury or wound that is openly exposed to the water.

Louis Burnett, professor of biology and director of the Grice Marine Laboratory of the College of Charleston, and Karen Burnett, research associate professor at Grice Marine Laboratory of the College of Charleston, study the effects of low oxygen and high carbon dioxide on organisms’ immune systems. They have found that organisms in these conditions can’t fight off infections as well as animals living in oxygen rich, low carbon dioxide environments.

The researchers examined fish, oysters, crabs and shrimp, and showed that all these animals have a decreased ability to fight off infection by Vibrio bacteria when subjected to low oxygen, high carbon dioxide conditions. It takes only about half as many bacteria to deliver a lethal dose to a creature in a low oxygen, high carbon dioxide environment.

“Our approach is exciting because traditionally physiologists haven’t considered bacteria or disease as a natural environmental barrier, so it’s a pretty open field,” says Louis Burnett.

Apparently, if marine animals are challenged with a pathogen, a large number of their blood cells disappear within a few minutes. The blood cells clump up to attack the pathogen, but also lodge in the gills (the sea critter version of lungs), where the body gets its oxygen. The scientists see evidence that sea animals fighting off infection lower their metabolism, which slows down other important processes like making new proteins.

“Everything we see points to the fact that if an animal mounts a successful immune response, then its gill function and ability to exchange oxygen are reduced by about 40 percent, which is why they seem to be having such problems living in low oxygen conditions,” says Karen Burnett. “If you add high carbon dioxide to that, it gets worse.”

The researchers are now using microarrays to measure changes in gene expression in marine organisms that are exposed to bacteria under low oxygen, high carbon dioxide conditions.

“After exposure to these conditions for only a day, animals at the molecular level have given up in trying to adapt to the situation, and they are going into molecular pathways that indicate cell death,” says Karen Burnett.

The coastal animals the Burnetts study live in environments where natural levels of oxygen and carbon dioxide fluctuate. Theoretically, these animals are already adapted for varied environments, and yet they still struggle with these changing conditions. It’s alarming, then, that deep-water animals may be much more affected by ocean acidification, since they are not used to the ebb and flow of oxygen and carbon dioxide levels.

“Some of the models for how the coastal organisms adapt may help researchers predict how deep water organisms are going to be affected by overall climate change too,” says Louis Burnett.

(Photo: Louis and Karen Burnett)

The American Physiological Society



That age-old question, "where did life on Earth start?" now has a new answer. If the life between the mica sheets hypothesis is correct, life would have originated between sheets of mica that were layered like the pages in a book.

The so-called "life between the sheets" mica hypothesis was developed by Helen Hansma of the University of California, Santa Barbara, with funding from the National Science Foundation (NSF). This hypothesis was originally introduced by Hansma at the 2007 annual meeting of the American Society for Cell Biology, and is now fully described by Hansma in the September 7, 2010 issue of Journal of Theoretical Biology.

According to the "life between the sheets" mica hypothesis, structured compartments that commonly form between layers of mica--a common mineral that cleaves into smooth sheets--may have sheltered molecules that were the progenitors to cells. Provided with the right physical and chemical environment in the structured compartments to survive and evolve, the molecules eventually reorganized into cells, while still sheltered between mica sheets.

Mica chunks embedded in rocks could have provided the right physical and chemical environment for pre-life molecules and developing cells because:

1. Mica compartments could have held, protected and sheltered molecules, and thereby promoted their survival. Also, mica could have provided enough isolation for molecules to evolve without being disturbed and still allow molecules to migrate towards one another and eventually bond together to form large organic molecules. And mica compartments may have provided something akin to a template for the production of a life form composed of compartments, which are now known as cells.

2. Mica sheets are held together by potassium. If the sheets donated some of this potassium to developing cells, that could account for the high levels of potassium currently found in human cells.
3. Mica chunks embedded in rocks that were sitting in an early ocean would have received an endless supply of energy from waves, the sun, and the occasional sloshing of water into the spaces between the mica sheets. This energy could have pushed the mica sheets into up-and-down motions that could have pushed together molecules sitting between mica sheets, thereby enabling them to bond together.

Because mica surfaces are hospitable to living cells and to all the major classes of large biological molecules, including proteins, nucleic acids, carbohydrates and fats, the "between the sheets" mica hypothesis is consistent with other well-known hypotheses that propose that life originated as RNA, fatty vesicles or primitive metabolisms. Hansma says a "mica world" might have sheltered all the ancient metabolic and fat-vesicle and RNA "worlds."

Hansma also says that mica would provide a better substrate for developing cells than other minerals that have been considered for that role. Why? Because most other minerals would probably have tended to intermittently become either too wet or too dry to support life. By contrast, the spaces between mica sheets would probably have undergone more limited wet/dry cycles that would support life without reaching killing extremes. In addition, many clays that have been considered as potential surfaces for life's origins respond to exposure to water by swelling. By contrast, mica resists swelling and would therefore provide a relatively stable environment for developing cells and biological molecules, even when it did get wet.

Hansma sums up her hypothesis by observing that "mica would provide enough structure and shelter for molecules to evolve but also accommodate the dynamic, ever-changing nature of life."

What's more, Hansma says that "mica is old." Some micas are estimated to be over 4 billion years old. And micas such as biotite have been found in regions containing evidence of the earliest life-forms, which are believed to have existed about 3.8 billion years ago.

Hansma's passion for mica evolved gradually--starting when she began conducting pioneering, NSF-funded research in former husband Paul K. Hansma's lab to develop techniques for imaging DNA and other biological molecules in the atomic force microscope (AFM)--a high-resolution imaging technique that allows researchers to observe and manipulate molecular- and atomic-level features.

Says Helen Hansma, "Mica sheets are atomically flat, so we can see DNA molecules on the mica surface without having to cover the DNA with something that makes it look bigger and easier to see. Sometimes we can even see DNA molecules swimming on the surface of mica, under water, in the AFM. Mica sheets are so thin (one nanometer) that there are a million of them in a millimeter-thick piece of mica."

Hansma's "life between the sheets" hypothesis first struck her a few years ago, after she and family members had collected some mica from a Connecticut mine. When she put water on a piece of the mica under her dissecting microscope, she noticed a greenish organic 'crud' at some step edges in the mica. "It occurred to me that this might be a good place for the origins of life--sheltered within these stacks of sheets that can move up and down in response to flowing water, which could have provided the mechanical energy for making and breaking chemical bonds," says Hansma.

Hansma says that recent advancements in imaging techniques, including the AFM, made possible her recent research, leading to her "between mica sheets" hypothesis. She adds that direct support for her hypothesis might be obtained from additional studies involving mica sheets in an AFM, subjected to its push-and-pull forces while sitting in liquids resembling an early ocean.

(Photo: Helen Greenwood Hansma, University of California, Santa Barbara)

National Science Foundation



Most infectious diseases infect multiple host species, but to date, efforts to quantify the frequency and outcome of cross-species transmission (CST) of these diseases have been severely limited.

This lack of information represents a major gap in knowledge of how diseases emerge, and from which species they will emerge.

A paper published in the journal Science by a team of researchers led by Daniel Streicker of the University of Georgia has begun to close that gap.

Results of a study, conducted by Streicker and co-authors from the U.S. Centers for Disease Control and Prevention, the University of Tennessee-Knoxville, and Western Michigan University, provide some of the first estimates for any infectious disease of how often CST happens in complex, multi-host communities--and the likelihood of disease in a new host species.

"Some of the deadliest human diseases, including AIDS and malaria, arose in other species and then jumped to humans," said Sam Scheiner of the National Science Foundation (NSF)'s Division of Environmental Biology, which co-funded the research with NSF's Directorate for Geosciences through the joint NIH-NSF Ecology of Infectious Diseases Program.

"Understanding that process," said Scheiner, "is key to predicting and preventing the next big outbreak."

Rabies is an ideal system to answer these questions, believes Streicker.

The disease occurs across the country, affects many different host species, and is known to mutate frequently. Although cases of rabies in humans are rare in the U.S., bats are the most common source of these infections.

To determine the rate of CST, and what outcomes those transmissions had, Streicker and his colleagues used a large dataset, unprecedented in its scope, containing hundreds of rabies viruses from 23 North American bat species.

They sequenced the nucleoprotein gene of each virus sample and used tools from population genetics to quantify how many CST events were expected to occur from any infected individual.

Their analysis showed that, depending on the species involved, a single infected bat may infect between 0 and 1.9 members of a different species; and that, on average, CST occurs only once for every 72.8 transmissions within the same species.
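The reported figures lend themselves to a rough calculation (ours, and only illustrative, since the per-bat rate varies by species) of just how rare cross-species jumps are:

```python
# Rough illustration of the reported rates. The article gives, on average,
# one cross-species transmission (CST) per 72.8 within-species transmissions.
within_per_cst = 72.8
cst_share = 1 / (1 + within_per_cst)  # CST's share of all transmission events
print(f"share of transmissions that cross species: {cst_share:.2%}")

# Expected cross-species infections seeded by a group of infected bats,
# assuming a per-bat CST rate r (the article reports 0 to 1.9 by species).
def expected_cst_cases(n_infected_bats, r_per_bat):
    return n_infected_bats * r_per_bat

print(expected_cst_cases(100, 0.5))  # e.g. 50 expected cross-species infections
```

Even at roughly one jump per 74 transmission events, a large and persistently infected bat population generates cross-species exposures at a steady rate.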

"What's really important is that molecular sequence data, an increasingly cheap and available resource, can be used to quantify CST," said Streicker.

Scientist Sonia Altizer of UGA agrees.

"This is a breakthrough," said Altizer. "The team defined, for the first time, a framework for quantifying the rates of CST across a network of host species that could be applied to other wildlife pathogens, and they developed novel methods to do it."

The researchers also looked at the factors that could determine the frequency of CST, using extensive data about each bat species, such as foraging behavior, geographic range and genetics.

"There's a popular idea that because of their potential for rapid evolution, the emergence of these types of viruses is limited more by ecological constraints than by genetic similarity between donor and recipient hosts," said Streicker. "We wanted to see if that was the case."

He found, instead, that rabies viruses are much more likely to jump between closely related bat species than between ones that diverged in the distant past.

Overlapping geographic range was also associated with CST, but to a lesser extent.

"CST and viral establishment do not occur at random, but instead are highly constrained by host-associated barriers," Streicker said. "Contrary to popular belief, rapid evolution of the virus isn't enough to overcome the genetic differences between hosts."

Streicker believes that what he and colleagues have learned about bat rabies will be influential in understanding the ecology, evolution and emergence of many wildlife viruses of public health and conservation importance.

"The basic knowledge we've gained will be key to developing new intervention strategies for diseases that can jump from wildlife to humans."

Streicker is continuing his work with rabies and bats with funding for a three-year study from NSF.

He and Altizer, in collaboration with investigators at the CDC, University of Michigan and the Peruvian Ministries of Health and Agriculture, will explore how human activities affect the transmission of the rabies virus in vampire bats in Peru--and how those changes might alter the risk of rabies infection for humans, domesticated animals, and wildlife.

(Photo: Ivan Kuzmin)

National Science Foundation


If the eyes are the window to the soul, psychologists hoping to solve the mystery of why our neural impulses do not always trigger an immediate response could find the answer in the flick of the eye.

The reasons why the speed of human responses to a given event can be so variable - even in laboratory controlled conditions where determining factors such as alertness and vigilance are a constant - remain a mystery to scientists studying the connection between our brains and behaviour.

For example, when driving, the onset of a red traffic light does not always lead to as rapid a response as one might expect from a driver when pressing the brake.

Now BBSRC-funded researchers from the University of Bristol are to provide a comprehensive account of response time variability, in a bid to explain at a functional level where the variability originates and what neural processes and anatomical structures are involved.

Professor Iain Gilchrist from Bristol is a cognitive neuroscientist who has just been awarded a BBSRC Research Development fellowship to explore the neural basis of response time variability.

"In my research group we focus on the study of the eye movement response," says Prof Gilchrist, also Director of Bristol Neuroscience. "Eye movements are interesting for a number of reasons. First, they are important for visual perception: we only see fine detailed information when the eyes point directly at a location. Second, eye movements are ubiquitous: we make more eye movements in a day than heartbeats. Third, we have a detailed knowledge of the neurophysiology of eye movement control which allows us to link functional and neural explanations."

For the last ten years, researchers at Bristol University have focused on identifying functional explanations for eye movement responses using mathematical models and behavioural experiments.

The next major step for this research will be to investigate the brain processes that account for this variability. Functional magnetic resonance imaging (fMRI) and magnetoencephalography (MEG) are both methods for imaging the human brain while participants carry out a task. This allows the brain areas involved in the task to be identified.

The fellowship will allow Prof Gilchrist and his team to introduce these methods to his research to study the neural basis of response time variability. The fMRI work will also be one of the first projects to be carried out at the new Clinical Research and Imaging Centre, a pioneering collaboration between the University of Bristol and University Hospitals Bristol NHS Foundation Trust providing a £6.4M facility focused on world-class translational research across the scientific disciplines.

The behavioural experiments will be simple; human participants will be asked to move their eyes as quickly as possible to look directly at suddenly appearing visual targets. The movements of their eyes will be measured to determine when they responded. At the same time brain activity will be recorded to determine the time course and pattern of changes in the brain that determine response time variability.
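One standard account of saccadic latency variability is Carpenter's LATER model, in which a decision signal rises toward a threshold at a rate drawn afresh on each trial from a normal distribution. The sketch below is our illustration of that idea, not necessarily the model the Bristol group will use, and all parameter values are invented:

```python
import numpy as np

# LATER-style sketch: latency = distance-to-threshold / rise rate, where the
# rise rate is resampled on every trial. Identical stimuli thus yield a broad
# distribution of response times, as in the traffic-light example.
rng = np.random.default_rng(0)

def later_latencies(n_trials, mean_rate=5.0, sd_rate=1.0, threshold=1.0):
    rates = rng.normal(mean_rate, sd_rate, n_trials)  # per-trial rise rates (1/s)
    rates = rates[rates > 0]                          # discard non-rising trials
    return threshold / rates                          # latencies in seconds

lat = later_latencies(10_000)
print(f"mean latency ~ {lat.mean() * 1000:.0f} ms, sd ~ {lat.std() * 1000:.0f} ms")
```

The long right tail of the resulting latency distribution (slow trials from unusually shallow rise rates) is exactly the kind of variability the fMRI and MEG work aims to trace back to its neural origin.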

Biotechnology and Biological Sciences Research Council



Sure, dogs are special. You might not be aware, however, that studying their genomes can lead to advances in human health. So next time you gaze soulfully into a dog’s eyes or scratch behind its ears, take note of the length of its nose or the size of its body. Although such attributes can vary wildly among different breeds, a team of investigators co-led by researchers at Stanford University School of Medicine, Cornell University and the National Human Genome Research Institute has found that they are determined by only a few genetic regions.

The discovery shows how studying genetic differences among dog breeds may ultimately help us understand human biomedical traits, such as height, hair color and body weight that are usually influenced by the net impact of hundreds of different genes in our species. The key idea is that identifying the dozen regions where dogs harbor genetic switches among breeds will provide critical clues as to where researchers could find mutations important to human health and disease.

The study, the most comprehensive genetic analysis of dogs to date, genotyped more than 900 individual dogs, assessed nearly 60 specific physical traits, and found that only a few genetic regions determine much of a dog’s appearance.

“We’ve found that only six or seven locations in the dog genome are necessary to explain about 80 percent of the differences in height and weight among dog breeds,” said Carlos Bustamante, PhD, professor of genetics at Stanford. “In humans these are controlled by hundreds if not thousands of variants.”

The research is published in the Aug. 10 Public Library of Science-Biology. Bustamante is a co-senior author of the study; Stanford research associate Adam Boyko, PhD, is one of three co-first authors. Elaine Ostrander, PhD, chief of the Cancer Genetics Branch of the National Human Genome Research Institute is the other senior author. Bustamante and Boyko began the work while they were at Cornell.

The work is a product of an intensive collaboration called the CanMap project, which involves several groups around the country including NHGRI, Cornell, the University of California-Los Angeles and now Stanford. The CanMap groups are using the dog as a model system to identify genomic regions responsible for many key physical characteristics. Although a few individual relationships, including an association between small body size and a gene called IGF-1, have been previously reported by the groups, many others were identified for the first time in this new analysis.

Dogs have been our companions and protectors for thousands of years. During this time, dogs adapted to living near human settlements largely through natural selection for being able to survive among people. But recently we humans decided to take things into our own hands. Driven sometimes by a love of novelty and other times by usefulness, our relentless breeding campaigns have left us with the Great Dane and the Chihuahua, the collie and the bulldog, and many more. As a result of our meddling, the dog is now the most physically diverse land animal.

“This dizzying array of morphological variants has happened extraordinarily quickly in terms of evolutionary timescales, due to extraordinarily strong selection by humans,” said Bustamante. “Most dog breeds are only a couple of hundred years old.”

All told, about 57 phenotypic traits were used to visually differentiate one breed from another, including body size, snout length and ear type. The CanMap project set out to identify which regions of the dog genome contributed to each of these traits. The researchers didn’t know whether the differences in appearance from breed to breed resulted from many genetic mutations, each making a small contribution to a dog’s appearance, or from only a few, powerful changes.

To answer the question, the NHGRI team genotyped more than 60,000 single genetic changes called SNPs (for single nucleotide polymorphisms) in 915 dogs. The dogs included representatives of 80 domestic breeds, 83 wild canids such as wolves, foxes and coyotes, and 10 Egyptian village dogs — domesticated but of no particular breed.

The CanMap researchers used the SNPs to identify chunks of DNA shared among individual dogs of the same breed. They found that while purebred dogs tended to share large stretches of DNA with other members of their breed, the wild dogs and village mongrels were more variable. They then looked to see which regions varied with specific physical traits from breed to breed.

The researchers found that — in contrast to humans — many physical traits in dogs are determined by very few genetic regions. For example, a dog with version A of the “snout length” region may have a long, slender muzzle, while version B confers a more standard nose and C an abnormally short schnoz. And let’s say X, Y and Z in the “leg length” region bestow a range of heights from short to tall. That would mean that in this example an A/X dog would have a slender muzzle and short legs like a dachshund. C/Y might be a bulldog, while B/Z would be more like a Labrador. This mixing and matching of chunks of DNA is how breeders were able to come up with so many different breeds in a relatively short amount of time.
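The mix-and-match logic above can be made concrete with a small sketch (ours; the allele labels and breed matches are the article's hypothetical examples, not real genotypes):

```python
# Hypothetical illustration: a handful of discrete "versions" of a few genomic
# regions combine to yield many distinct body plans, mirroring the article's
# A/B/C snout and X/Y/Z leg example.
snout = {"A": "long, slender muzzle", "B": "standard nose", "C": "very short snout"}
legs = {"X": "short legs", "Y": "medium legs", "Z": "long legs"}

def body_plan(snout_allele, leg_allele):
    return f"{snout[snout_allele]} + {legs[leg_allele]}"

print(body_plan("A", "X"))  # dachshund-like: long, slender muzzle + short legs
print(body_plan("B", "Z"))  # Labrador-like: standard nose + long legs

# Just these two regions already allow 3 x 3 = 9 body plans; a dozen such
# regions with a few versions each would allow thousands of combinations.
n_plans = len(snout) * len(legs)
```

This combinatorial structure is why breeders could produce so many distinct breeds in only a couple of hundred years.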

Determining the differences between dog breeds may seem inconsequential, but it has important implications for human health.

“Understanding the genetic bases of complex traits in humans is difficult because many different genes can influence a particular trait,” explained Bustamante. “Having model systems, such as mice and dogs, is critical for making sense of the biology. For example, one of the strongest associations in human genetics is between a common variant in a gene called HMGA2 and height. In our study, we also see a strong association with body size and HMGA2 (just as we see at IGF-1 in humans, mice and dogs and body-size variation within each species). This suggests that studying what underlies the HMGA2 association in dogs could help us understand the relationship in humans. In this way, dogs are a fantastic model system since they complement mouse and human genetics.”

In the future, the researchers plan to investigate whether dog behavioral traits can be linked to specific genomic regions, and how these regions may be important in mammalian behavior.

(Photo: Stanford U.)

Stanford University


Under the microscope, the bacteria start dividing normally, two cells become four and then eight and so on. But then individual cells begin "popping," like circus balloons being struck by darts.

This phenomenon, which surprised the Duke University bioengineers who captured it on video, turns out to be an example of a more generalized occurrence that must be considered by scientists creating living, synthetic circuits out of bacteria. Even when given the same orders, no two cells will behave the same.

The researchers believe this accidental finding of a circuit they call "ePop" can help increase the efficiency and power of future synthetic biology circuits.

Synthetic circuits are created by genetically altering colonies of bacteria to produce a myriad of useful proteins, enzymes or chemicals in a coordinated way. The circuits can even be reprogrammed to deliver different types of drugs or to selectively kill cancer cells. Scientists in this emerging field of synthetic biology have operated under the assumption that when identical snippets of engineered DNA - known as plasmids -- are inserted into cells, each cell will respond in the same way.

"In the past, synthetic biologists have often assumed that the components of the circuit would act in a predictable fashion every time and that the cells carrying the circuit would just serve as a passive reactor," said Lingchong You, an assistant professor of biomedical engineering and member of Duke's Institute for Genome Sciences & Policy. "In essence, they have taken a circuit-centric view for the design and optimization process. This notion is helpful in making the design process more convenient."

But the cells in this study unexpectedly began popping when the colony reached a certain density of cells because of an unintended consequence of introducing plasmids.

Biochemistry graduate student Philippe Marguet said the research team looked at many factors to try to explain how the bacteria sensed the size of their colonies. "In the end, it turns out that the (number of copies of) plasmid increases with cell density. This is the critical link that enables the cells to sense their density and to commit suicide at sufficiently high densities."

"We ran computer models and experiments to show that this is indeed the case," Marguet said. "Our results underscore the importance of the amount of plasmids and the potential impact of hidden interactions on the behavior of engineered gene circuits."

The results of the team's experiments were published online Aug. 9 in the journal PLoS One.

Researchers can reprogram populations of genetically altered bacteria to direct their actions in much the same way that a computer program directs a computer. In this analogy, the plasmids are the software, the cell the computer. One of these plasmids tells cells to commit suicide if the number of cells in a population gets too high.

However, in the ePop circuit, which was built in the common bacterium Escherichia coli (E. coli), the cell death, or popping, took place without the suicide gene. The researchers believe that when the plasmid is inserted into bacteria, it can be expressed at different levels in different cells. When it is over-expressed in a particular cell, it leads to that cell's demise. When enough cells are affected, the population of the colony decreases.

"Perhaps the confluence of the conditions for significant plasmid amplification was not seen in previous experiments," You said. "In this regard, ePop can be valuable as a probe of cell physiology to find out what environmental and genetic conditions lead to this amplification. As a probe, ePop has the advantage of being easily observable and highly sensitive and it has the ability to provide new information on complex interactions between the plasmid and the host cell."

The goal, You said, is to get to the point where scientists have a complete understanding of each component of a circuit, so that when a new plasmid is added, all of its effects can be observed.

Duke University



What if trains, planes and automobiles all were powered simply by the air through which they move? What if their exhaust and by-products helped the environment?

Such an energy-efficient, self-propelling mechanism already exists in nature.

The salp, a small, barrel-shaped organism that resembles a streamlined jellyfish, gets everything it needs from ocean waters to feed and propel itself.

Scientists believe its waste material may help remove carbon dioxide (CO2) from the upper ocean and the atmosphere.

Now researchers at the Woods Hole Oceanographic Institution (WHOI) and MIT have found that the half-inch to 5-inch-long creatures are even more efficient than had been believed.

"This innovative research is providing an understanding of how a key organism in marine food webs affects important biogeochemical processes," said David Garrison, director of the National Science Foundation (NSF)'s biological oceanography program, which funded the research.

Reporting in the journal Proceedings of the National Academy of Sciences (PNAS), the scientists have found that mid-ocean-dwelling salps are capable of capturing and eating extremely small organisms as well as larger ones, rendering them even hardier--and perhaps more plentiful--than had been believed.

"We had long thought that salps were about the most efficient filter-feeders in the ocean," said Larry Madin, WHOI Director of Research and one of the paper's authors.

"But these results extend their impact down to the smallest available size fraction, showing they consume particles spanning four orders of magnitude in size. This is like eating everything from a mouse to a horse."

Salps capture food particles, mostly phytoplankton, with an internal mucus filter net. Until now, it was thought that the net captured only particles larger than the 1.5-micron-wide holes in its mesh; smaller particles would slip through.

But a mathematical model suggested salps somehow might be capturing food particles smaller than that, said Kelly Sutherland, who co-authored the PNAS paper after her PhD research at MIT and WHOI.

In the laboratory at WHOI, Sutherland and her colleagues offered salps food particles of three sizes: smaller, around the same size as, and larger than the mesh openings.

"We found that more small particles were captured than expected," said Sutherland, now a post-doctoral researcher at Caltech. "When exposed to ocean-like particle concentrations, 80 percent of the particles that were captured were the smallest particles offered in the experiment."

The finding helps explain how salps--which can exist either singly or in "chains" that may contain a hundred or more--are able to survive in the open ocean where the supply of larger food particles is low.

"Their ability to filter the smallest particles may allow them to survive where other grazers can't," said Madin.

Perhaps most significantly, the result enhances the importance of the salps' role in carbon cycling. As they eat small, as well as large, particles, "they consume the entire 'microbial loop' and pack it into large, dense fecal pellets," Madin says.

The larger and denser the carbon-containing pellets, the sooner they sink to the ocean bottom. "This removes carbon from the surface waters," said Sutherland, "and brings it to a depth where you won't see it again for years to centuries."

And the more carbon that sinks to the bottom, the more space there is for the upper ocean to accumulate carbon, hence limiting the amount that rises into the atmosphere as CO2, said paper co-author Roman Stocker of MIT.

"The most important aspect of this work is the very effective shortcut that salps introduce in the process of particle aggregation," Stocker said. "Typically, aggregation of particles proceeds slowly, by steps, from tiny particles coagulating into slightly larger ones."

"Now, the efficient foraging of salps on particles as small as a fraction of a micrometer introduces a substantial shortcut in this process, since digestion and excretion package these tiny particles into much larger particles, which thus sink a lot faster."

This process starts with the mesh made of fine mucus fibers inside the salp's hollow body.

Salps, which can live for weeks or months, swim and eat in rhythmic pulses, each of which draws seawater in through an opening at the front end of the animal. The mesh captures the food particles, then rolls into a strand and goes into the gut, where it is digested.

"It was assumed that very small cells or particles were eaten mainly by other microscopic consumers, like protozoans, or by a few specialized metazoan grazers like appendicularians," said Madin.

"This research indicates that salps can eat much smaller organisms, like bacteria and the smallest phytoplankton, organisms that are numerous and widely distributed in the ocean."

The work, also funded by the WHOI Ocean Life Institute, "implies that salps are more efficient vacuum cleaners than we thought," said Stocker.

"Their amazing performance relies on a feat of bioengineering--the production of a nanometer-scale mucus net--the biomechanics of which remain a mystery."

(Photo: Kelly Sutherland and Larry Madin, WHOI)

National Science Foundation



By 2100, only 18% to 45% of the plants and animals making up ecosystems in the world's humid tropical forests may remain as we know them today, according to a new study led by Greg Asner at the Carnegie Institution’s Department of Global Ecology.

The research combined new deforestation and selective logging data with climate-change projections. It is the first study to consider these combined effects for all humid tropical forest ecosystems and can help conservationists pinpoint where their efforts will be most effective. The study is published in the August 5, 2010, issue of Conservation Letters.

“This is the first global compilation of projected ecosystem impacts for humid tropical forests affected by these combined forces,” remarked Asner. “For those areas of the globe projected to suffer most from climate change, land managers could focus their efforts on reducing the pressure from deforestation, thereby helping species adjust to climate change, or enhancing their ability to move in time to keep pace with it. On the flip side, regions of the world projected to suffer fewer effects from climate change could be targeted for restoration.”

Tropical forests hold more than half of all the plant and animal species on Earth. But the combined effects of climate change, forest clear-cutting, and logging may force them to adapt, move, or die. The scientists looked at land use and climate change by integrating global deforestation and logging maps from satellite imagery and high-resolution data with projected future vegetation changes from 16 different global climate models. They then ran scenarios on how different types of species could be geographically reshuffled by 2100. They used the reorganization of plant classes, such as tropical broadleaf evergreen trees, tropical drought-deciduous trees, and different kinds of grasses, as surrogates for biodiversity changes.

For Central and South America, climate change could alter about two-thirds of the humid tropical forests’ biodiversity—the variety and abundance of plants and animals in an ecosystem. When that scenario is combined with current patterns of land-use change, the Amazon Basin alone could see changes in biodiversity across more than 80% of the region.

Most of the changes in the Congo are likely to come from selective logging and climate change, which together could negatively affect between 35% and 74% of that region. At the continental scale, about 70% of Africa’s tropical forest biodiversity would likely be affected if current practices are not curtailed.

In Asia and the central and southern Pacific islands, deforestation and logging are the primary drivers of ecosystem changes. Model projections suggest that climate change might play a lesser role there than in Latin America or Africa. That said, the research showed that between 60% and 77% of the area is susceptible to biodiversity losses via massive ongoing land-use changes in the region.

“This study is the strongest evidence yet that the world’s natural ecosystems will undergo profound changes—including severe alterations in their species composition—through the combined influence of climate change and land use,” remarked Daniel Nepstad, senior scientist at the Woods Hole Research Center. “Conservation of the world’s biota, as we know it, will depend upon rapid, steep declines in greenhouse gas emissions.”

(Photo: Carnegie I.)

Carnegie Institution



For decades, physicists have been trying to reconcile the two major theories that describe physical behavior. The first, Einstein’s theory of general relativity, uses gravity — forces of attraction — to explain the behavior of objects with large masses, such as falling trees or orbiting planets. However, at the atomic and subatomic level, particles with negligible masses are better described using another theory: quantum mechanics.

A “theory of everything” that marries general relativity and quantum mechanics would encompass all physical interactions, no matter the size of the object. One of the most popular candidates for a unified theory is string theory, first developed in the late 1960s and early 1970s.

String theory holds that electrons and quarks (the building blocks of larger particles) are one-dimensional oscillating strings, not the zero-dimensional point particles they are traditionally thought to be.

Physicists are divided on whether string theory is a viable theory of everything, but many agree that it offers a new way to look at physical phenomena that have otherwise proven difficult to describe. In the past decade, physicists have used string theory to build a connection between quantum and gravitational mechanics, known as gauge/gravity duality.

MIT physicists, led by Hong Liu and John McGreevy, have now used that connection to describe a specific physical phenomenon — the behavior of a type of high-temperature superconductor, or a material that conducts electricity with no resistance. The research, published in the Aug. 5 online edition of Science, is one of the first to show that gauge/gravity duality can shed light on a material’s puzzling physical behavior.

So far, the team has described a few aspects of the behavior of a class of superconducting materials called cuprates. However, the researchers hope their work could lead to more general theories describing other materials, and eventually to predicting their behavior. “That’s the ultimate theoretical goal, and we haven’t really achieved that,” says Liu.

MIT graduate student Nabil Iqbal and recent PhD recipients Thomas Faulkner and David Vegh are also authors of the paper.

In 1986, physicists discovered that cuprates (ceramic compounds that contain copper) can superconduct at relatively high temperatures (up to about 135 kelvins above absolute zero).

At the atomic level, cuprates are classified as a “many-body system” — essentially a vast collection of electrons that interact with each other. Such systems are usually described using quantum mechanics. However, so far, physicists have found it difficult to describe cuprates, because their behavior is so different from other materials. Understanding that behavior could help physicists find new materials that superconduct at even higher temperatures. These new materials would have potentially limitless applications.

Unlike most materials, cuprates do not obey Fermi-liquid theory, a set of quantum-mechanical principles that governs the microscopic behavior of metals at very low temperatures (close to absolute zero, or -273 degrees Celsius). Instead, cuprates become superconductors. Just above the temperature at which they begin to superconduct, they enter a state called the “strange metal” state.

In this study, the researchers focused on two properties that distinguish those cuprate strange metals from Fermi liquids. In ordinary Fermi liquids, electrical resistivity and the rates of electron scattering (deflection from their original course caused by interactions with each other) are both proportional to the temperature squared. However, in cuprates (and other superconducting non-Fermi liquids), electron scattering and resistivity are proportional to the temperature. “There’s really no theory of how to explain that,” says Liu.
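The contrast can be made concrete with the two scaling laws themselves: Fermi-liquid resistivity grows as the temperature squared, while strange-metal resistivity grows linearly in temperature. The prefactors in this sketch are arbitrary placeholders; only the exponents carry the physics.

```python
# Two scaling laws: in an ordinary Fermi liquid, resistivity (and the
# electron scattering rate) grows as T^2; in the cuprate "strange metal"
# phase it grows linearly in T. Prefactors here are arbitrary.

def fermi_liquid_resistivity(temp_k, a=1.0):
    return a * temp_k ** 2      # Fermi-liquid scaling: rho ~ T^2

def strange_metal_resistivity(temp_k, b=1.0):
    return b * temp_k           # strange-metal scaling: rho ~ T

# Doubling the temperature quadruples Fermi-liquid resistivity but only
# doubles strange-metal resistivity, regardless of the starting point.
for t1, t2 in [(50.0, 100.0), (100.0, 200.0)]:
    fl = fermi_liquid_resistivity(t2) / fermi_liquid_resistivity(t1)
    sm = strange_metal_resistivity(t2) / strange_metal_resistivity(t1)
    print(f"{t1:.0f} K -> {t2:.0f} K: Fermi liquid x{fl:.0f}, strange metal x{sm:.0f}")
```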

Using gauge/gravity duality — the connection between quantum and gravitational mechanics — the MIT team identified a system that has the same unusual properties as strange metals, but could be explained by gravitational mechanics. In this case, the model they used was a gravitational system with a black hole. “It’s a mathematical abstraction which we hope may shed light on the physics of the real system,” says Liu. In their model, they can study behavior at high and low energy (determined by how the excitation energy of a single electron compares to the average energy of an electron in the system), and it turns out that at low energy, the black-hole model exhibits many of the same unusual traits seen in non-Fermi liquids such as cuprates.

For example, in both systems, when an electron at the lowest possible energy level is excited (by a photon or another particle), the resulting interaction between the electron and the hole left behind cannot be described as a quasiparticle (as it can in ordinary metals), because the electron excitation decays so quickly. (The electrons decay so quickly because their scattering rate is proportional to the temperature.) Furthermore, the electrical resistance of the black-hole system is directly proportional to temperature — just as it is in cuprates.

Gauge/gravity duality offers a “map” that correlates certain features of the black-hole model to corresponding features of strange metals. Therefore, once the physicists calculated the features of the model, using general relativity, those values could be translated to the corresponding values in the strange-metal system. For example, the value of an electromagnetic field in the gravitational system could correspond to the density of electrons in the quantum system.

Physicists have previously used gauge/gravity duality to describe some characteristics of quark gluon plasma, the “hot soup” of elementary particles that existed in the first millionths of a second after the Big Bang. However, this is the first time it has been used to give insight into a type of condensed matter (solids and liquids are condensed matter).

For that reason, the paper should have a significant impact in theoretical physics, says Joseph Polchinski, a theoretical physicist at the University of California at Santa Barbara. “Whenever people have systems they can’t understand in other ways, this might be a tool to try to understand it,” he says.

The MIT team believes the approach could shed light on a group of rare metal compounds known as heavy fermion metals, whose electrons behave as if their masses were 100 to 1,000 times greater than those in ordinary metals. They also display some of the same non-Fermi liquid behavior seen in the strange metal phase of cuprates.

(Photo: Wikimedia commons)




People with type 1 diabetes must keep a careful eye on their blood glucose levels: Too much sugar can damage organs, while too little deprives the body of necessary fuel. Most patients must prick their fingers several times a day to draw blood for testing.

To minimize that pain and inconvenience, researchers at MIT’s Spectroscopy Laboratory are working on a noninvasive way to measure blood glucose levels using light.

First envisioned by Michael Feld, the late MIT professor of physics and former director of the Spectroscopy Laboratory, the technique uses Raman spectroscopy, a method that identifies chemical compounds based on the frequency of vibrations of the bonds holding the molecule together. The technique can reveal glucose levels by simply scanning a patient’s arm or finger with near-infrared light, eliminating the need to draw blood.

Spectroscopy Lab graduate students Ishan Barman and Chae-Ryon Kong are developing a small Raman spectroscopy machine, about the size of a laptop computer, that could be used in a doctor’s office or a patient’s home. Such a device could one day help some of the nearly 1 million people in the United States, and millions more around the world, who suffer from type 1 diabetes.

Researchers in the Spectroscopy Lab have been developing this technology for about 15 years. One of the major obstacles they have faced is that near-infrared light penetrates only about half a millimeter below the skin, so it measures the amount of glucose in the fluid that bathes skin cells (known as interstitial fluid), not the amount in the blood. To overcome this, the team came up with an algorithm that relates the two concentrations, allowing them to predict blood glucose levels from the glucose concentration in interstitial fluid.

However, this calibration becomes more difficult immediately after the patient eats or drinks something sugary, because blood glucose soars rapidly, while it takes five to 10 minutes to see a corresponding surge in interstitial fluid glucose levels. During that lag, interstitial fluid measurements do not give an accurate picture of what’s happening in the bloodstream.

To address that lag time, Barman and Kong developed a new calibration method, called Dynamic Concentration Correction (DCC), which incorporates the rate at which glucose diffuses from the blood into the interstitial fluid. In a study of 10 healthy volunteers, the researchers used DCC-calibrated Raman spectroscopy to significantly boost the accuracy of blood glucose measurements — an average improvement of 15 percent, and up to 30 percent in some subjects.
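The published DCC algorithm is more involved, but the core idea of correcting for a diffusion lag can be sketched with a generic two-compartment model: interstitial fluid (ISF) glucose relaxes toward blood glucose with a time constant tau, and inverting that relation (blood ≈ ISF + tau × dISF/dt) recovers the leading signal from the lagging one. The time constant, glucose profile, and sampling interval below are made-up illustrative values, not the study's parameters.

```python
# Two-compartment lag sketch: ISF glucose follows blood glucose as
#   dC_isf/dt = (C_blood - C_isf) / tau,
# so the correction C_blood ~ C_isf + tau * dC_isf/dt undoes the lag.
# All numbers are illustrative, not the published DCC parameters.

import math

TAU = 8.0  # assumed blood-to-ISF equilibration time constant, minutes

def simulate_isf(blood, dt=0.5):
    """Forward-simulate ISF glucose lagging behind blood glucose."""
    isf = [blood[0]]
    for b in blood[1:]:
        isf.append(isf[-1] + dt * (b - isf[-1]) / TAU)
    return isf

def correct(isf, dt=0.5):
    """Estimate blood glucose from ISF readings: C_isf + tau * dC_isf/dt."""
    est = [isf[0]]
    for i in range(1, len(isf)):
        deriv = (isf[i] - isf[i - 1]) / dt
        est.append(isf[i] + TAU * deriv)
    return est

# Blood glucose rising after a sugary drink (mg/dL), sampled every 30 s.
t = [i * 0.5 for i in range(121)]
blood = [90 + 60 * (1 - math.exp(-x / 15.0)) for x in t]
isf = simulate_isf(blood)
est = correct(isf)

raw_err = max(abs(b - i) for b, i in zip(blood, isf))
dcc_err = max(abs(b - e) for b, e in zip(blood, est))
print(f"worst-case error, raw ISF: {raw_err:.1f} mg/dL; "
      f"lag-corrected: {dcc_err:.1f} mg/dL")
```

In this synthetic example the raw ISF reading trails the rising blood glucose by roughly the lag the article describes, while the derivative correction shrinks the worst-case error by an order of magnitude.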

The researchers described the new calibration method and results in the July 15 issue of the journal Analytical Chemistry. In addition to Feld, Barman and Kong, authors include Ramachandra Rao Dasari, associate director of the Spectroscopy Lab, and former postdoctoral associate Gajendra Pratap Singh.

Michael Morris, professor of chemistry at the University of Michigan, says the group appears to have solved a problem that has long stymied researchers. “Getting optical glucose measurements of any sort is something people have been trying to do since the 1980s,” says Morris, who was not involved in this study. “Usually people report that they can get good measurements one day, but not the next, or that it only works for a few people. They can’t develop a universal calibration system.”

Morris says the noninvasive nature of Raman spectroscopy could help boost quality of life for diabetes patients, but that to be practical, any device would need to become more affordable and very simple to use. The Spectroscopy Lab researchers believe that the smaller machine they are now developing should substantially drive down costs by miniaturizing and reducing the complexity of the instrument.

Barman and Kong plan to launch a clinical study to test the DCC algorithm in healthy volunteers this fall. Their work is funded by the National Institutes of Health and National Center for Research Resources.

In October, Barman will receive the Tomas A. Hirschfeld Award at the Federation of Analytical Chemistry and Spectroscopy Societies Conference, for his work on improving spectroscopy-based glucose measurements.

(Photo: Patrick Gillooly)





Selected Science News. Copyright 2008 All Rights Reserved