Friday, December 31, 2010

BRAIN GENE A TRIGGER FOR DETERMINING GENDER


University of Adelaide researchers are a step closer to unraveling the mysteries of human sexual development, following genetic studies that show male mice can be created without a Y chromosome – through the activation of an ancient brain gene.

Males usually have one Y chromosome and one X chromosome, while females have two X chromosomes. A single gene on the Y, called SRY, triggers testes development in the early embryo, and once these begin to form, the rest of the embryo also becomes male.

However, Adelaide researchers have discovered a way of creating a male mouse without a Y chromosome by activating a single gene, called SOX3, in the developing fetus. SOX3 is known to be important for brain development but has not previously been shown to be capable of triggering the male pathway.

In a major international collaborative study, they also have shown for the first time that changes in the human version of the same gene are present in some patients with disorders of sexual development.

The results of this work are published online today in the Journal of Clinical Investigation, and will be published in the journal's print version in January 2011.

"The Y chromosome contains a gene called SRY that functions as a genetic switch to activate the male pathway during embryonic development," says Associate Professor Paul Thomas from the University of Adelaide's School of Molecular & Biomedical Science.

"The SRY genetic switch is unique to mammals and is thought to have evolved from the SOX3 gene during early mammalian evolution."

Associate Professor Thomas and his colleagues have generated male mice with two X chromosomes by artificially activating the SOX3 gene in the developing gonads.

"These XX male 'sex reversed' mice are completely male in appearance, reproductive structures and behavior, but are sterile due to an inability to produce sperm," he says.

"We have suspected for a long time that SOX3 is the evolutionary precursor gene for SRY. By showing that SOX3 can activate the male pathway in the same way as SRY, we now believe this to be true."

This work is a longstanding collaboration between Associate Professor Thomas and Dr Robin Lovell-Badge at the Medical Research Council National Institute for Medical Research in London, who discovered the SRY gene in mice more than 20 years ago.

Dr Lovell-Badge says he's excited about the findings: "SOX3 normally functions in the development of the nervous system, but it is now clear that a mutation that makes it active in the early gonad can turn it into the switch that makes testes develop.

"It is now very likely that something similar to what has happened in the XX male mice and humans we describe also occurred in our early mammalian ancestors, and this led to the evolution not only of SRY, but of the X and Y chromosomes. Just think of all the trouble this little gene has caused!" he says.

Further collaborative research with Professor Andrew Sinclair at the Murdoch Children's Research Institute in Melbourne and Professor Eric Vilain at UCLA (University of California Los Angeles) has also shown that changes in the human SOX3 gene are present in some individuals who are XX males.

"From a genetic perspective, cases of XX male sex reversal are particularly intriguing and are poorly understood," Associate Professor Thomas says.

"This discovery provides new insight into the genetic causes of disorders of sexual development, which are relatively common in the community.

"For the future, this discovery will impact on the molecular diagnosis of these disorders and, ultimately, help us to develop therapies or technologies to improve clinical outcomes," he says.

(Photo: Sandra Piltz)

University of Adelaide

PLACEBOS WORK -- EVEN WITHOUT DECEPTION

For most of us, the "placebo effect" is synonymous with the power of positive thinking; it works because you believe you're taking a real drug. But a new study rattles this assumption.

Researchers at Harvard Medical School's Osher Research Center and Beth Israel Deaconess Medical Center (BIDMC) have found that placebos work even when administered without the seemingly requisite deception.

The study was published on December 22 in PLoS ONE.

Placebos—or dummy pills—are typically used in clinical trials as controls for potential new medications. Even though they contain no active ingredients, patients often respond to them. In fact, data on placebos is so compelling that many American physicians (one study estimates 50 percent) secretly give placebos to unsuspecting patients.

Because such "deception" is ethically questionable, HMS associate professor of medicine Ted Kaptchuk teamed up with colleagues at BIDMC to explore whether or not the power of placebos can be harnessed honestly and respectfully.

To do this, the researchers divided 80 patients suffering from irritable bowel syndrome (IBS) into two groups: one group, the controls, received no treatment, while the other group received a regimen of placebos—honestly described as "like sugar pills"—which they were instructed to take twice daily.

"Not only did we make it absolutely clear that these pills had no active ingredient and were made from inert substances, but we actually had 'placebo' printed on the bottle," says Kaptchuk. "We told the patients that they didn't have to even believe in the placebo effect. Just take the pills."

For a three-week period, the patients were monitored. By the end of the trial, nearly twice as many patients treated with the placebo reported adequate symptom relief as compared to the control group (59 percent vs. 35 percent). Also, on other outcome measures, patients taking the placebo doubled their rates of improvement to a degree roughly equivalent to the effects of the most powerful IBS medications.

"I didn't think it would work," says senior author Anthony Lembo, HMS associate professor of medicine at BIDMC and an expert on IBS. "I felt awkward asking patients to literally take a placebo. But to my surprise, it seemed to work for many of them."

The authors caution that this study is small and limited in scope and simply opens the door to the notion that placebos are effective even for the fully informed patient—a hypothesis that will need to be confirmed in larger trials.

"Nevertheless," says Kaptchuk, "these findings suggest that rather than mere positive thinking, there may be significant benefit to the very performance of medical ritual. I'm excited about studying this further. Placebo may work even if patients know it is a placebo."

Public Library of Science (PLoS)

PSYCHOLOGISTS FIND SKILL IN RECOGNIZING FACES PEAKS AFTER AGE 30


Scientists have made the surprising discovery that our ability to recognize and remember faces peaks at age 30 to 34, about a decade later than most of our other mental abilities.

Researchers Laura T. Germine and Ken Nakayama of Harvard University and Bradley Duchaine of Dartmouth College will present their work in a forthcoming issue of the journal Cognition.

While prior evidence had suggested that face recognition might be slow to mature, Germine says few scientists had suspected that it might continue building for so many years into adulthood. She says the late-blooming nature of face recognition may simply be a case of practice making perfect.

"We all look at faces, and practice face-watching, all the time," says Germine, a Ph.D. student in psychology at Harvard. "It may be that the parts of the brain we use to recognize faces require this extended period of tuning in early adulthood to help us learn and remember a wide variety of different faces."

Germine, Duchaine, and Nakayama used the web-based Cambridge Face Memory Test -- available at www.testmybrain.org -- to test recognition of computer-generated faces among some 44,000 volunteers ages 10 to 70. They found that skill at other mental tasks, such as remembering names, maxes out at age 23 to 24, consistent with previous research.

But on a face-recognition task, skill rose sharply from age 10 to 20, then continued increasing more slowly throughout the 20s, reaching a peak of 83 percent correct responses in the cohort ages 30 to 34.

A follow-up experiment involving computer-generated children's faces found a similar result, with the best face recognition seen among individuals in their early 30s. After this, skill in recognizing faces declined slowly, with the ability of 65-year-olds roughly matching that of 16-year-olds.

"Research on cognition has tended to focus on development, to age 20, and aging, after age 55," Germine says. "Our work shows that the 35 years in between, previously thought to be fairly static, may in fact be more dynamic than many scientists had expected."

(Photo of Laura Germine: Stephanie Mitchell)

Harvard University

Thursday, December 30, 2010

HUMAN NETWORKING THEORY GIVES PICTURE OF INFECTIOUS DISEASE SPREAD


It's colds and flu season, and as any parent knows, colds and flu spread like wildfire, especially through schools.

New research using human-networking theory may give a clearer picture of just how, exactly, infectious diseases such as the common cold, influenza, whooping cough and SARS can spread through a closed group of people, and even through populations at large.

With the help of 788 volunteers at a high school, Marcel Salathé, a biologist at Penn State University, developed a new technique to count the number of possible disease-spreading events that occur in a typical day.

The results are published in the journal Proceedings of the National Academy of Sciences.

The research was funded by the National Science Foundation (NSF) and the National Institutes of Health (NIH).

"Contact networks, which are shaped by social and cultural processes, are keys to the spread of information and infection," says Deborah Winslow, NSF program director for cultural anthropology and the ecology of infectious diseases. "Before this research, the study of contact networks had been hampered by the lack of good data on their formation and structure."

"This setting proved a closed population in which the whole network could be determined. By collecting real-time network data, the researchers improved significantly on the usual error-prone techniques that depend on asking informants to recall their interactions."

Every day people come into contact with many other people; their interactions vary in length; and each contact is an opportunity for a disease to spread, Salathé said.

"But it's not like you can take a poll and ask people, 'How many different people have breathed on you today, and for how long?' We knew we had to figure out the number of person-to-person contacts systematically."

Using a population of high-school students, teachers and staff members as a model for a closed group of people, Salathé and his team designed a method to count how many times possible disease-spreading interactions occurred during a typical day.

Volunteers were asked to spend one school day wearing matchbox-sized sensor devices--called motes--on lanyards around their necks.

Like a cell phone, each mote was equipped with its own unique tracking number, and each mote was programmed to send and receive radio signals at 20-second intervals to record the presence of other nearby motes.

Volunteers then were asked to go about their day by attending classes, walking through the halls, and chatting with other people.

At the end of the day, Salathé's team collected the motes and recorded how many mote-to-mote interactions had occurred, and how long each interaction had lasted.

"An interaction isn't necessarily a conversation," Salathé said.

"Even when people aren't talking, they might be sneezing and coughing in each other's direction, bumping into each other, and passing around pathogens."

To record even these non-conversational events--any kind of spatial closeness that would be enough to spread a contagious disease--each mote used a 3-meter maximum signaling range, extending outward from the front of the person's body.

Defining a single interaction as any 20-second or longer event of mote-to-mote proximity, Salathé and his team found that the total number of close-proximity events was 762,868.

"The same two people may have had many very brief interactions," Salathé said. "Still, we have to count each brief interaction individually, even between the same two people."

"From a pathogen's point of view, each interaction is another chance to jump from person to person."

In addition, and not surprisingly, the team found that interactions peaked between classes, when mote-wearing volunteers were physically closer to one another, moving around in the halls on their way to the next class.

Salathé and his team found that, at the end of the day, most people had experienced a fairly high number of person-to-person interactions, but they also found very little variation among individuals.

Strikingly, they did not find any individuals who had an extraordinarily high number of contacts when compared with the rest of the group. Such individuals--called super-spreaders--are known to be very important in the dynamics of disease spread.

"For example, in sexual-contact networks, one often finds a group of people with a much higher potential to contract and spread a virus such as HIV," Salathé said.

"This potential is due to these individuals' extremely high number of interactions. But in our experiment, while there may have been kids with a few more interaction events, for the most part, everyone had about the same high level of interaction."

Salathé explained that while schools may indeed be "hotbeds" for colds and the flu, individual students do not seem to vary with regard to exposure risk due to their contact patterns.

Data from the motes also confirmed an important social-networking theory--that contact events are not random because many "closed triangles" exist within a community.

"If person A has contact with person B, and person B has contact with person C, chances are that persons A and C also have contact with each other," Salathé said.

"Real data illustrating these triangles provide just one more piece of information to help us track how a disease actually spreads."

Salathé also said that networking data such as his may help guide public-health initiatives such as vaccination strategies and prevention education.

(Photo: Kristen Devlin)

The National Science Foundation

WHAT "PINE" CONES REVEAL ABOUT THE EVOLUTION OF FLOWERS


From southern Africa's pineapple lily to Western Australia's swamp bottlebrush, flowering plants are everywhere. Also called angiosperms, they make up 90 percent of all land-based plant life.

New research published in the Proceedings of the National Academy of Sciences provides new insights into their genetic origin, an evolutionary innovation that quickly gave rise to many diverse flowering plants more than 130 million years ago. Moreover, a flower with genetic programming similar to a water lily may have started it all.

"Water lilies and avocado flowers are essentially 'genetic fossils' still carrying genetic instructions that would have allowed the transformation of gymnosperm cones into flowers," said biologist Doug Soltis, co-lead researcher at the University of Florida in Gainesville.

Gymnosperms are a group of seed-bearing plants that include conifers and cycads that produce "cones" as reproductive structures, one example being the well-known pine cone. "We show how the first flowering plants evolved from pre-existing genetic programs found in gymnosperm cones and then developed into the diversity of flowering plants we see today," he said. "A genetic program in the gymnosperm cone was modified to make the first flower."

But herein lies the riddle. How can flowers that contain both male and female parts develop from plants that produce cones when individual cones are either male or female? The solution, say researchers, is that a male gymnosperm cone has almost everything a flower has in terms of its genetic wiring.

Somehow a genetic change took place allowing a male cone to produce female organs as well--and, perhaps more importantly, allowed it to produce showy petal-like organs that enticed new interactions with pollination agents such as bees.

Analyzing genetic information encoded in a diverse array of evolutionarily distant flowers--water lily, avocado, California poppy and a small flowering plant frequently used by scientists as a model, Arabidopsis--researchers discovered support for the single cone theory.

A non-flowering seed plant, a cycad named Zamia, which makes pine cone-like structures instead of flowers, was also examined in the study.

"We extracted an essential genetic material, RNA, from the flowers' specific floral organs and in the case of Zamia, its cones, to see which genes were active," said co-lead investigator Pam Soltis, a curator at the Florida Museum of Natural History and an evolutionary geneticist at the University of Florida.

Researchers then compared the organs' profiles to a range of species representing ancient and more recent lineages of flowering plants. "This comparison allowed us to see aspects of the floral genetic program that are shared with gymnosperms, where they came from and also which aspects are shared among different groups of flowering plants and which differ," she explained.

The flowers of most angiosperms have four distinct organs: sepals, typically green; petals, typically colorful; stamens, male organs that produce pollen; and carpels, female organs that produce eggs. However, the flowers of more ancient lineages of angiosperms have organs that intergrade, or merge into one another through a gradual series of evolutionary reforms. For example, a stamen of a water lily produces pollen but it may also be petal-like and colorful and there is often no distinction between sepals and petals--instead, early flowers have organs called tepals.

The research team found a very significant degree of genetic overlap among intergrading floral organs in water lilies and avocado but less overlap in poppy and Arabidopsis. "In other words, the boundaries between the floral organs are not all that sharp in the early angiosperm groups -- the organs are still being sorted out in a sense," said Doug Soltis.

The finding challenged researcher expectations that each floral organ in early angiosperms would have a unique set of genetic instructions as is the case in the evolutionarily derived Arabidopsis. Instead, the finding increased the likelihood that a single male cone was responsible for the world's first flowering plants owing to the elasticity of their genetic structure.

"In early flowers, a stamen is not much different genetically speaking than a tepal," said Doug Soltis. "The clearly distinct floral organs we all know and love today came later in flowering plant evolution--not immediately."

Researchers say better understanding of these genetic switches in early angiosperm flowers could one day help scientists in other disciplines such as medicine or agriculture.

(Photo: Randolph Femmer/life.nbii.gov)

The National Science Foundation

PEOPLE WHO BELIEVE IN JUSTICE ALSO SEE A VICTIM'S LIFE AS MORE MEANINGFUL AFTER TRAGEDY

Seeing bad things happen to other people is scary. One way to respond to this is to blame the victim—to look for some reason why it happened to them. But there’s another common response, according to a new study published in Psychological Science, a journal of the Association for Psychological Science. The researchers found that people who believe in justice in the world also believe that a tragedy gives the victim’s life more meaning.

“A lot of the time when people see someone else suffering, and helping them isn’t an option, people will instead justify the fact that something negative is happening to them. Because it’s scary for something negative to happen to a good person—that means it could happen to you,” says Joanna E. Anderson of the University of Waterloo, who cowrote the study with her colleagues Aaron C. Kay and Gráinne M. Fitzsimons. Anderson suspected that there was another way to feel better about someone else’s tragic experience: to believe that the negative experience is balanced by positive outcomes.

In an experiment, volunteers read a scenario in which someone was injured playing soccer in high school. The soccer player ends up with a broken leg, has back problems, undergoes multiple surgeries, and can’t go to school with their peers. Everything is resolved by the end of high school; in the scenario, the person is now happily married and is thinking about starting a family. Each volunteer also filled out a survey that determined how strong their “justice motive” is—their need to see the world as just or fair. Then they were asked how much meaning they think the person’s life has.

People who had a strong need to see the world as just were more likely to rate the victim's life as more meaningful than the life of a person who hasn't experienced a tragedy. This also held true in another experiment, in which the researchers manipulated the participants' feelings about justice by having them read an article about how CEOs make a lot of money, but are hired because of personal connections rather than merit. The people who'd read about undeserving CEOs had a stronger justice motive and were more likely to see the injured soccer player's later life as meaningful.

The results show that people who have a strong need to believe the world is fair may be motivated to find positive outcomes—”silver linings”—from tragedies. “I think that this is probably a more positive reaction” than blaming the victim, Anderson says. “But I do think that either reaction shows that you’re focusing so much on yourself and your own need to make sure that this can’t happen to you that you’re not really thinking about the other person at all.”

Association for Psychological Science

EYE SEE YOU


Scientists have found a way of deterring litterbugs, in an experiment which could also aid the fight against other anti-social behaviour.

Researchers at Newcastle University alternated hanging posters of staring human faces and posters of flowers on the walls of a cafe. They then counted the number of people who cleared away their plates and rubbish after finishing their meal in both situations.

In a paper published online in the American journal Evolution and Human Behavior, the research team, led by Dr Melissa Bateson and Dr Daniel Nettle of the Centre for Behaviour and Evolution, describe their findings.

During periods when the posters of faces were on the walls, watching over the diners, twice as many people cleaned up, compared to the periods when the pictures of flowers were overlooking the diners, when more litter was left for cafe workers to clear away.

In a previous study in 2006 the same scientists looked at the impact of images of eyes on contributions to an honesty box in a tea room. They found that people put nearly three times more money in the box when there were eyes compared with flowers.

For this follow-on experiment, psychology student, Max Ernest-Jones, eager to explore whether the honesty box findings would extend to other forms of cooperation, spent many hours sitting inconspicuously in the corner of the café recording customers’ littering behaviour.

Dr Bateson, who led the research, said: “These findings reinforce the conclusion from our previous research, that the presence of eye images can encourage co-operative behaviour. We think that the images of eyes work by making people feel watched. We care what other people think about us, and hence we behave better when we feel we are being observed.

“We found that the impact of the posters was a lot greater at times when the cafe was quiet. This makes total sense, because we would expect real people to have the greatest effect on the feeling of being watched and hence swamp the effect of the posters during busy times.

“This study has implications for the fight against anti-social behaviour. For example if signs for CCTV cameras used pictures of eyes instead of cameras they could be more effective.”

The study is based on the theory of ‘nudge psychology’ which suggests that people may behave better if the best option in a given situation is highlighted for them, but all other options are still left open, so the person isn’t forced into one particular action. In effect you ‘nudge’ people into doing the right thing.

Dr Bateson added: “This study confirms that the display of images of eyes has broad potential as a ‘nudge’, not just because eyes grab attention, but because of more fundamental connections between the feeling of being watched and cooperative behaviour.

“Even painting a pair of eyes on a wall may be useful for preventing anti-social behaviour in quiet locations.”

(Photo: U. Newcastle)

Newcastle University

NEW STUDY SHOWS EARTHLY GOLD CAME FROM 'ALIEN' BOMBARDMENT


Prices for gold are at seemingly unworldly highs these days, and maybe that's fitting in light of a new study that says almost all the gold we humans hold so dear was likely delivered to Earth by massive planetoids that crashed into our planet late in its formation some 4.5 billion years ago.

Published in the journal Science, the findings provide weighty new evidence that the gold, platinum, palladium and other iron-loving elements found in the crusts and mantles of Earth, the Moon and Mars arrived on mini-planet-sized impactors during the final phase of planet formation in our solar system. These massive collisions occurred within tens of millions of years after the even bigger impact that produced our Moon, say the authors, a team of researchers from the University of Maryland, the Southwest Research Institute, the Massachusetts Institute of Technology, and the Scripps Institution of Oceanography.

"Our understanding of the formation of Earth and other planets with iron cores and silicate mantles suggests that iron-loving elements are pulled into the planet cores as they form," said University of Maryland Geology Professor Richard Walker, one of the authors of the new study. "Thus, we should have an Earth that essentially has no gold or other iron-loving metal ores in its crust for us to mine."

The fact that we do, Walker said, has long suggested that something must have happened to bring new iron-loving elements to Earth after completion of the separation of the metallic core and silicate mantle. What scientists didn't know until now was whether this late accretion of material occurred in big chunks over a relatively short period of time or as a 'rain' of smaller pieces of material over a longer time.

To determine the answer, Walker and colleagues James Day of the University of Maryland and the Scripps Institution of Oceanography, William Bottke and David Nesvorny from the Southwest Research Institute and Linda Elkins-Tanton from MIT, used numerical models to see what size objects would best match the needed criteria.

These criteria included (1) providing the right amount of iron-loving metals to the Earth, Moon and Mars; (2) being large enough to breach the crusts and mantles of these bodies, creating local molten rock ponds from their impact energy and efficiently mixing into the mantle; (3) not being so large as to cause a fragmenting and reformation of the planet cores. The latter would have resulted in most of the newly added iron-loving elements being pulled down into the cores as well.

The researchers showed that they could best reproduce these results if the late accretion population was dominated by a very limited number of massive projectiles. Their results indicate the largest Earth impactor was 1500-2000 miles in diameter, roughly the size of Pluto, while those hitting the Moon were only 150-200 miles across.

"These impactors are thought to be large enough to produce the observed enrichments in highly siderophile [iron-loving] elements, but not so large that their fragmented cores joined with the planet's core," said Southwest Research Institute's Bottke, who was the lead author of the Science paper.

The team also reports that their predicted projectile sizes are consistent with physical evidence such as the size distributions of today's asteroids and of ancient Martian impact scars.

(Photo: U. Maryland)

University of Maryland

HIGH ACTIVITY STAVES OFF POUNDS

People will gain significantly less weight by middle age – especially women – if they engage in moderate to vigorous activity nearly every day of the week starting as young adults, according to new Northwestern Medicine research.

Women particularly benefitted from high activity over 20 years, gaining an average of 13 pounds less than those with low activity, while men with high activity gained about 6 pounds less than their low-activity peers. High activity included recreational exercise such as basketball, running, brisk walking or an exercise class or daily activities such as housework or construction work.

“Everyone benefits from high activity, but I was surprised by the gender differences,” said lead author Arlene Hankinson, M.D., an instructor in preventive medicine at Northwestern University Feinberg School of Medicine. “It wasn’t that activity didn’t have an effect in men, but the effect was greater in women. Now women should be especially motivated.”

The study was published Dec. 14 in the Journal of the American Medical Association.

There could be several reasons for the gender difference, Hankinson said. Women are less likely than men to overestimate their activity, according to previous studies. “Men may not be getting as much activity as they report,” Hankinson explained.

In addition, men in the high-activity group compensated by eating more than their low-activity counterparts, which could have led to more weight gain. The highly active women didn’t eat more than low-activity women in the study.

There were many ways to achieve the study’s definition of high-activity levels, Hankinson noted. One way was 150 minutes of moderate to vigorous activity a week.

The study participants -- 1,800 women and nearly 1,700 men -- are part of the Coronary Artery Risk Development in Young Adults (CARDIA) Study, a multi-center, longitudinal and population-based observational study designed to describe the development of risk factors for coronary heart disease in young black and white adults.

“This paper is another example of how the CARDIA study has contributed to our knowledge about the importance of initiating healthy habits early in life and vigilantly maintaining them,” said paper coauthor Stephen Sidney, M.D., associate director for clinical research at the Kaiser Permanente Division of Research. “Common medical problems such as heart disease, diabetes and obesity have their origins in childhood and can generally be prevented by maintaining a normal weight, not smoking, exercising regularly and eating a healthy diet throughout life.”

Hankinson’s research is the first to measure the impact of high activity over 20 years between young adulthood and middle age and to frequently examine participants (seven times) over that period. Study participants are more likely to remember and accurately report their behavior with regular exams, she said.

Previous studies, Hankinson said, looked at a single exercise intervention's effect on weight for a short period of time or examined participants in longer studies at only two points in time -- the beginning and the end.

“We wanted to see if people’s activity levels during their youth were enough to help them keep weight off in middle age, or if they needed to up the ante,” Hankinson said. “It’s difficult to avoid gaining weight as you age. Our metabolic rate goes down. We develop conditions or have lifestyles that make it harder to maintain a high level of activity.”

“The study reinforces that everyone needs to make regular activity part of their lifestyles throughout their lives,” she said. “Not many people actually do that.” The active group in the study comprised only 12 percent of the participants.

Lower levels of activity had a negligible effect on weight gain in the study. “High activity was the only kind that made a significant difference,” Hankinson noted.

Northwestern University

EATING AT A SCREEN CAN LEAD TO LATER SNACK ATTACKS


Eating while playing a computer game or simply working through lunch could increase your food intake later in the day.

Researchers from the Nutrition and Behaviour Unit in the School of Experimental Psychology have been exploring ways in which memory and attention influence our appetite and food intake.

In a recent study they assessed the effect of eating while playing a computer game. Participants were split into two groups. One group ate a lunch that comprised nine different foods while playing ‘Solitaire’ – a computerised card-sorting game. The second group ate the same lunch, but without distraction.

The researchers found that participants who played Solitaire felt less full after lunch. Moreover, the effects of distraction were long lasting. Thirty minutes later, the distracted participants ate around twice as many snacks as did non-distracted participants. Finally, at the end of the test session, the participants tried to remember the food items that they had been given for lunch. Distracted participants had a poorer memory.

Together, these findings highlight an important role for memory of recent eating and they show that distraction can lead to increased food intake later in the day.

Previously, similar observations have been made in people who eat while watching TV. This study extends these findings by showing how other ‘screen-time activities’ can influence our food intake in unexpected ways. This is important, because it reveals another mechanism by which sedentary screen-time activities might promote obesity.

Dr Jeff Brunstrom, Reader in Behavioural Nutrition and one of the authors of this paper, remarked:

‘This work adds to mounting evidence from our lab and others that cognition, and memory and attention in particular, play a role in governing appetite and meal size in humans.’

(Photo: Bristol U.)

Bristol University

METEORITE JUST ONE PIECE OF AN UNKNOWN CELESTIAL BODY


Scientists from all over the world are taking a second, more expansive, look at the car-sized asteroid that exploded over Sudan's Nubian Desert in 2008. Initial research was focused on classifying the meteorite fragments that were collected two to five months after they were strewn across the desert and tracked by NASA's Near Earth Object astronomical network. Now in a series of 20 papers for a special double issue of the journal Meteoritics and Planetary Science, published on December 15, researchers have expanded their work to demonstrate the diversity of these fragments, with major implications for the meteorite's origin.

In the first round of research, Doug Rumble of Carnegie's Geophysical Laboratory, in collaboration with Muawia Shaddad of the University of Khartoum, examined one fragment of the asteroid, called 2008 TC3, and determined that it fell into a very rare category of meteorite called ureilites. Ureilites have a very different composition from most other meteorites. It has been suggested that all members of this meteorite family might have originated from the same source, called the ureilite parent body, which could have been a proto-planet.

Now Rumble has expanded his work to examine 11 meteorite fragments, focusing on the presence of oxygen isotopes. Isotopes are atoms of the same element that differ in the number of neutrons in their nuclei.

Rumble explains: "Oxygen isotopes can be used to identify the meteorite's parent body and determine whether all the fragments indeed came from the same source. Each parent body of meteorites in the Solar System, including the Moon, Mars, and the large asteroid Vesta, has a distinctive signature of oxygen isotopes that can be recognized even when other factors, such as chemical composition and type of rock, are different."

Rumble and his team prepped tiny crumbs of these 11 meteorite fragments and loaded them into a reaction chamber, where they were heated with a laser and underwent chemical reactions to release oxygen. The team then used another device, called a mass spectrometer, to measure the concentrations of the oxygen isotopes. Results showed that the full range of oxygen isotopes known to be present in ureilites was also present in the studied fragments.

"It was already known that the fragments in the Nubian Desert came from the same asteroid. Taking that into account, these new results demonstrate that the asteroid’s source, the ureilite parent body, also had a diversity of oxygen isotopes," says Rumble.
The diversity of oxygen isotopes found in ureilites probably arises from the circumstances of the parent this body's formation. Rumble theorizes that the rock components of this parent body were heated to the point of melting and then cooled into crystals so quickly that the oxygen isotopes present could not come to an equilibrium distribution throughout.

Together, the collection of 20 papers published in Meteoritics and Planetary Science offers enormous insight into the formation and composition of ureilites and their hypothesized parent body.

(Photo: Carnegie I.)

Carnegie Institution

THE BRAIN: LIGHT-CONTROLLED NEO-NEURONS


Researchers at the Institut Pasteur in association with the CNRS have just shown, in an experimental model, that newly formed neurons in the adult brain can be stimulated by light.

A novel technique combining optical and genetic tools allows neurobiologists to render neo-neurons photo-excitable. For the first time they have prompted, observed, and specifically recorded the activity of these new nerve cells in the olfactory system. Using this technique the scientists have revealed the nature of signals emitted from new neurons across neuronal circuits in the brain. This work represents an essential step towards better understanding the role of new nerve cells and developing therapeutic applications, most notably in the realm of neurodegenerative diseases.

Pierre-Marie Lledo and his team in the Perception and Memory unit at the Institut Pasteur (CNRS, URA 2182) have just shown, for the first time in an animal model, the possibility of using light to stimulate and specifically study new neurons which form in the adult brain. Until now the existing methods of stimulation did not permit this. Electrical stimulation affects all cells without discrimination and chemical stimulation concerns only neurons mature enough to have surface receptors for active molecules.

By introducing and inducing expression of photo-sensitive proteins in new neurons, the scientists have been able to control their activity with the use of luminescent flashes. Using this technique, researchers at the Institut Pasteur and the CNRS have been able to observe, stimulate, and specifically record the activity of new nerve cells. They have brought proof that new neurons formed in the olfactory bulb of the adult brain are integrated into preexisting nervous circuits. They have also shown that, against all expectations, the number of contacts between young cells and their target cells greatly increased over several months.

This work constitutes an essential step in characterizing the functions fulfilled by new neurons. It opens new avenues of investigation for understanding the connectivity between “newly formed” neurons and their host circuits. This is a crucial step toward the eventual use of stem cells within the framework of new therapeutic protocols for repairing brain damage, notably in the realm of neurodegenerative diseases.

(Photo: © Institut Pasteur)

CNRS

Wednesday, December 29, 2010

SCIENTIST SHOWS LINK BETWEEN DIET AND ONSET OF MENTAL ILLNESS

Changes in diet have been linked to a reduction of abnormal behaviors in mentally ill people or animals, but a Purdue University study shows that diet might also trigger the onset of mental illness in the first place.

Joseph Garner, an associate professor of animal sciences, fed mice a diet high in sugar and tryptophan that was expected to reduce abnormal hair-pulling. Instead, mice that were already ill worsened their hair-pulling behaviors or started a new self-injurious scratching behavior, and the seemingly healthy mice developed the same abnormal behaviors.

"This strain of mouse is predisposed to being either a scratcher or a hair-puller. Giving them this diet brought out those predispositions," said Garner, whose results were published in the December issue of the journal Nutritional Neuroscience. "They're like genetically at-risk people."

Garner studies trichotillomania, an impulse-control disorder in which people pull out their hair. The disorder, which disproportionately occurs in women, is thought to affect between 2 percent and 4 percent of the population.

Mice that barber, or pull their hair out, have been shown to have low levels of serotonin activity in the brain. That neurotransmitter is known to affect mood and impulses. Garner hypothesized that increasing serotonin activity in the brain might cure or reduce barbering and possibly trichotillomania.

Serotonin is manufactured in the brain from the amino acid tryptophan, which is consumed in diets. The problem is that tryptophan often doesn't make it across the barrier between blood and the brain because other amino acids can get through more easily and essentially block the door for tryptophan.

Garner modified a mouse diet to increase simple carbohydrates, or sugars, and tryptophan. The sugars trigger a release of insulin, which causes muscles to absorb those other amino acids and gives tryptophan a chance to make it to the brain.

Using eight times as much sugar and four times as much tryptophan, Garner observed a doubling of serotonin activity in the brain. But the mice that barbered did not get better.

"We put them on this diet, and it made them much, much worse," Garner said.

A second experiment divided the mice into three groups: those that were seemingly normal, others that had some hair loss due to barbering and a group that had severe hair loss. All the mice soon got worse, with conditions escalating over time.

"Three-quarters of the mice that were ostensibly healthy developed one of the behaviors after 12 weeks on the new diet," Garner said.

Some of the mice developed ulcerated dermatitis, a fatal skin condition thought to be caused by an unidentified pathogen or allergen. Garner saw that the only mice that contracted the condition were the scratchers.

"What if ulcerated dermatitis, like skin-picking, another common behavioral disorder, is not really a skin disease at all?" Garner said. "We now have evidence that it may be a behavioral disorder instead."

When taken off the new diet, the negative behaviors stopped developing in the mice. When control mice were switched to the new diet, they started scratching and barbering.

Garner's study raises questions of how diet might be affecting other behavioral or mental illnesses such as autism, Tourette syndrome, trichotillomania and skin-picking. He said that before now, a link between diet and the onset of mental disorders hadn't been shown.

"What if the increase of simple sugars in the American diet is contributing to the increase of these diseases?" Garner said. "Because we fed the mice more tryptophan than in the typical human diet, this experiment doesn't show that, but it certainly makes it a possibility."

Garner next wants to refine the experiments to better imitate human dietary habits, including the amount of tryptophan people consume. Internal Purdue funding paid for his work.

Purdue University

TRACKS SHOW DINOSAURS ROAMED ALASKA IN JURASSIC PERIOD


Until last summer, discoveries of dinosaur bones and tracks in Alaska had been restricted to the Cretaceous Period.

That changed when a team including University of Alaska Fairbanks scientists documented fossilized tracks in Southwest Alaska that appear to date from the Jurassic Period, which stretches from 150 to 200 million years ago.

“In one fell swoop we pushed the record of dinosaurs in Alaska back about 50 million years,” said Patrick Druckenmiller, earth sciences curator at the University of Alaska Museum of the North and assistant professor in the UAF geology and geophysics department.

In 1975, geologists mapping the rocks near Chignik Bay discovered what appeared to be three-toed dinosaur tracks on a sandstone cliff. The group photographed the site but did not collect any other data. Thirty-five years later, Druckenmiller and a team of scientists set out to find the location in that photo and fully document the site. The team included Kevin May from the museum, UAF geologists Sarah Fowell and Paul McCarthy, and invertebrate paleontologist Robert Blodgett of Anchorage.

Planning the expedition presented logistical challenges. The field area is in remote and mountainous terrain famous for its high density of coastal brown bears. The precise location of the tracks was also uncertain, so Druckenmiller received permission to work on both Chignik Lagoon Native Corporation land and in the Alaska Peninsula National Wildlife Refuge. The work was based out of Chignik Bay, Alaska.

“It was great to land in a community that was very receptive and accommodating to the field work we had to do,” said Druckenmiller.

Supported by helicopter pilot Sam Egli of King Salmon, the team established a remote field camp and set to work. May said they found the site after only two days of searching. “After staring at the 1975 photograph for so long, it was a real thrill to finally see it in real life.”

The layer of tracks was tilted nearly vertically and could only be reached with the use of climbing equipment. Once they reached the site, Druckenmiller and May made replicas of each track for study and exhibit back at the museum.

Druckenmiller said the trip netted a surprising amount of information.

“Based on their size and shape we can tell that the tracks were made by a human-sized, meat-eating (theropod) dinosaur,” he said. “We could even see impressions from the tips of their claws. That makes these tracks especially rare.”

The rest of the team examined the rocks for additional clues and were able to establish that these dinosaurs walked on sand in a beach-type environment during the Late Jurassic Period, long before modern Alaska took shape.

Druckenmiller said the findings provide an entirely new chapter in the story of the life that once existed in Alaska and he hopes to return to the site in the near future. “We are pretty sure there are other surprises waiting for us out there.”

(Photo: Kevin May, UA Museum of the North)

University of Alaska Fairbanks

ASSESSING THE ENVIRONMENTAL EFFECTS OF TIDAL TURBINES


Harnessing the power of ocean tides has long been imagined, but countries are only now putting it into practice. A demonstration project planned for Puget Sound will be the first tidal energy project on the west coast of the United States, and the first array of large-scale turbines to feed power from ocean tides into an electrical grid.

University of Washington researchers are devising ways to site the tidal turbines and measure their environmental effects. Brian Polagye, UW research assistant professor of mechanical engineering, presented recent findings in an invited talk at the American Geophysical Union's annual meeting in San Francisco.

Polagye and colleagues are involved in environmental monitoring before and during a planned deployment of two 30-foot-wide turbines in Admiralty Inlet, the main entrance to Washington state's Puget Sound.

"There really isn't that much information, anywhere, about the environmental effects of tidal turbines," Polagye said.

Although European countries have more experience with tidal energy devices, they are not as far ahead on environmental monitoring, Polagye said. He believes the Pacific Northwest installation will have the most comprehensive environmental monitoring of any tidal project so far.

"The results of this pilot project will help decide if this is an industry that has potential for going forward at the commercial scale, or if it stops at the pilot stage," Polagye said.

The Snohomish County Public Utility District, just north of Seattle, received a $10 million grant from the Energy Department for the tidal project now in the final phase of obtaining permits. The turbines would generate an average of 100 kilowatts of electricity, enough to power 50-100 Washington homes during the pilot phase.
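
As a back-of-the-envelope check on that figure (a sketch only; the household consumption number below is an assumed typical value, not one given in the article):

# Rough arithmetic sketch: does 100 kW of average output plausibly cover
# 50-100 homes? The household consumption figure is an assumed typical
# value, not one given in the article.
avg_output_kw = 100
hours_per_year = 24 * 365
energy_per_year_kwh = avg_output_kw * hours_per_year       # 876,000 kWh per year
assumed_home_use_kwh = 11_000                               # assumed annual use per home
homes_powered = energy_per_year_kwh / assumed_home_use_kwh
print(round(homes_powered))                                 # about 80 homes, inside the 50-100 range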

"We want to monitor the effects of this particular project, but also understand the processes so we can apply the findings to other potential tidal energy sites," Polagye said.

To do this, the UW team must assess a new technology that operates in a little-explored environment.

"There's surprisingly little known about the oceanography of these very fast waters," said collaborator Jim Thomson, a UW assistant professor of civil and environmental engineering and an oceanographer in the UW's Applied Physics Laboratory. "These kinds of tidal channels where water is going very fast only happen in a few areas, and have not been well studied. The currents are so fast that it's hard to operate vehicles and maintain equipment. And it's too deep for conventional scuba diving."

The pilot site lies roughly 200 feet below the surface of Admiralty Inlet, where the UW team has measured currents of up to 8 knots, or 9 miles per hour.

One area of concern is how underwater noise generated by the turbines could affect marine mammals that use auditory cues to navigate and communicate with each other. Strong currents complicated the task of measuring how sound travels in the channel.

"When currents were more than about 2 knots the instruments are hearing considerable self-noise," Polagye said. "It's similar to when you're bicycling downhill and the air rushes past your ears." Chris Bassett, a UW doctoral student in mechanical engineering, is testing approaches that would allow underwater microphones to work in fast-moving water.

UW researchers used sound from a Washington state ferry to learn how turbine noise would spread from the project site. The data suggest that Admiralty Inlet tends to lessen sound. This reduces the effect on animals' hearing, which is good, but it also means marine mammals have less noise to help them detect and avoid the turbines.

The UW team has been measuring currents continuously at the proposed site for almost two years, using a monitoring tripod the size of a small refrigerator. With added ballast for stability, the device weighs 850 pounds in water. Even so, it can barely stay put on the ocean floor.

The monitoring tripod holds instruments to track water quality, ambient noise, currents, temperature and salinity, and to record marine mammal calls and electronic tags on passing fish. This observational data will help determine precisely where to put the tidal turbines, and establish potential environmental effects once they are in the water.

So far, researchers say, the data support the notion that the Admiralty Inlet is well suited for a tidal energy installation from an engineering perspective. Once the turbines are in the water, likely in 2013, researchers will monitor environmental effects.

The Admiralty Inlet characterization is being conducted by the Northwest National Marine Renewable Energy Center, in which the UW leads research on tidal energy. Polagye and Thomson lead research on characterizing the physical attributes, such as currents and sound propagation. UW fisheries scientists recently received funding to test instruments for monitoring fish at the site, UW mechanical engineers are using computer models to see how pressure changes caused by tidal turbines could affect sediments and fish, and UW oceanographers are calculating when turbines would begin to affect the Sound's tides and currents.

The Washington state deployment is among three U.S. tidal energy pilot projects now in the works (the others are in Maine and Alaska). An array of smaller turbines was operated during a pilot project in New York City's East River.

(Photo: OpenHydro Technology Ltd.)

University of Washington

Tuesday, December 28, 2010

SIPPING GREEN TEA REGULARLY CAN ALTER HOW WE PERCEIVE FLAVOR


While trying to figure out what makes certain beverages cloudy, Cornell researchers made the startling discovery that certain chemicals in green tea -- and perhaps red wine -- react with saliva in ways that can alter how we perceive flavors.

Specifically, regular consumption of the polyphenol-rich drinks can boost astringent sensations and our sensitivity to acids, reports Karl Siebert, professor of food science, in an article published online in Food Quality and Preference Sept. 21 and in print January 2011.

Siebert also discovered that we all have varying levels of polyphenols already stored in our systems.

Siebert, who worked for 18 years in a brewery before becoming an academic, stumbled upon the finding while studying the relationship between polyphenols -- chemical compounds found in plants -- and protein chains in such drinks as beer and apple juice.

It was well known that the two combine to form complexes. The larger these complexes grow, the less soluble they become, until they become visible to the human eye in the form of haze or turbidity.

Siebert's group discovered the strong effect of pH on haze formation, peaking at a pH level near 4. More acidic beverages like grape juice don't get as cloudy. Higher pHs also lead to less haze.

These findings led Siebert to question whether the same thing happened in people's mouths.

We perceive astringency, which is a tactile (touch) sensation, similarly to how we perceive the cooling of menthol and the heating (or pain) of capsaicin, the hot pepper compound. The traditional thinking was that the astringency was caused by a loss of lubrication when polyphenols reacted with proteins in saliva.

Siebert wondered if pH levels made a difference there too, and if so, why.

"We had this idea because of what we had seen before about the protein effect in beverages, and we knew that acid together with polyphenols tastes more astringent than either alone," Siebert said.

He presented several dilute solutions of acid to a group of panelists, who rated the intensity of astringency. While most reported a mild difference, others had more dramatic sensitivity. Digging deeper, he discovered the most sensitive had been regular green tea drinkers prior to the start of the study.

He then measured the polyphenol levels in saliva of people on days before, during and after they consumed several cups of green tea. This showed that saliva normally contains polyphenols, and there are large differences among individuals. Regular red wine and green tea drinkers had the highest levels. Drinking green tea was shown to elevate the saliva polyphenol levels.

"I would expect that red wine drinking would also, but we didn't demonstrate this," Siebert said.

The polyphenol level in saliva returns to an individual's baseline level within half an hour after consuming such beverages as coffee or tea; but over time, the underlying baseline level gradually rises with the continued consumption of tea.

"It appears that there is a metabolic pool of polyphenol that is influenced by dietary habits, and that the salivary polyphenol level influences perception of astringency caused by acids," Siebert said.

"This was the first demonstration that you normally have polyphenols in your saliva," he added. "That has some other implications, because the liquid in your saliva comes from the blood. So the long-term build-up must be in the blood."

This may help explain what has been labeled "the French paradox" -- the observation that French people have a relatively low incidence of heart disease, despite their diet rich in saturated fats. Some scientists believe it is due to their increased consumption of red wine, and have attributed it to the antioxidant benefits of polyphenols.

(Photo: Cornell U.)

Cornell University

PERIODIC TABLE REVAMP

For the first time in history, a change will be made to the atomic weights of some elements listed on the periodic table of the chemical elements posted on walls of chemistry classrooms and on the inside covers of chemistry textbooks worldwide.

The new table, outlined in a report released this month, will express atomic weights of 10 elements - hydrogen, lithium, boron, carbon, nitrogen, oxygen, silicon, sulfur, chlorine and thallium - in a new manner that will reflect more accurately how these elements are found in nature.

“For more than a century and a half, many were taught to use standard atomic weights — a single value — found on the inside cover of chemistry textbooks and on the periodic table of the elements. As technology improved, we have discovered that the numbers on our chart are not as static as we have previously believed,” says Dr. Michael Wieser, an associate professor at the University of Calgary, who serves as secretary of the International Union of Pure and Applied Chemistry’s (IUPAC) Commission on Isotopic Abundances and Atomic Weights. This organization oversees the evaluation and dissemination of atomic-weight values.

Modern analytical techniques can measure the atomic weight of many elements precisely, and these small variations in an element’s atomic weight are important in research and industry. For example, precise measurements of the abundances of isotopes of carbon can be used to determine purity and source of food, such as vanilla and honey. Isotopic measurements of nitrogen, chlorine and other elements are used for tracing pollutants in streams and groundwater. In sports doping investigations, performance-enhancing testosterone can be identified in the human body because the atomic weight of carbon in natural human testosterone is higher than that in pharmaceutical testosterone.

The atomic weights of these 10 elements will now be expressed as intervals, with upper and lower bounds, to convey this variation in atomic weight more accurately. The changes to be made to the Table of Standard Atomic Weights have been published in Pure and Applied Chemistry (http://iupac.org/publications/pac/asap/PAC-REP-10-09-14/) and a companion article in Chemistry International (http://www.iupac.org/publications/ci/2011/3302/1_coplen.html).

For example, sulfur is commonly known to have a standard atomic weight of 32.065. However, its actual atomic weight can be anywhere between 32.059 and 32.076, depending on where the element is found. “In other words, knowing the atomic weight can be used to decode the origins and the history of a particular element in nature,” says co-author Wieser of the Department of Physics and Astronomy.
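To make the notation concrete, here is a minimal sketch in Python (the class and helper names are our own, not IUPAC's) that stores the interval quoted above for sulfur and collapses it to a single value, which is roughly what a student will have to do when a calculation needs one number:

from dataclasses import dataclass

@dataclass
class AtomicWeight:
    lower: float   # lower bound of the standard atomic weight interval
    upper: float   # upper bound of the standard atomic weight interval

    def midpoint(self) -> float:
        # One simple way to pick a single value out of the interval.
        return (self.lower + self.upper) / 2.0

sulfur = AtomicWeight(lower=32.059, upper=32.076)
print(sulfur.midpoint())   # 32.0675, close to the familiar single value of 32.065

Taking the midpoint is only one convention; the point is that any single number now stands in for a documented range rather than a fixed constant of nature.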

Elements with only one stable isotope do not exhibit variations in their atomic weights. For example, the standard atomic weights for fluorine, aluminum, sodium and gold are constant, and their values are known to better than six decimal places.

“Though this change offers significant benefits in the understanding of chemistry, one can imagine the challenge now to educators and students who will have to select a single value out of an interval when doing chemistry calculations,” says Dr. Fabienne Meyers, associate director of IUPAC.

“We hope that chemists and educators will take this challenge as a unique opportunity to encourage the interest of young people in chemistry and generate enthusiasm for the creative future of chemistry.”

The University of Calgary has contributed, and continues to contribute, substantially to the study of atomic weight variations. Professor H. Roy Krouse created the Stable Isotope Laboratory in the Department of Physics and Astronomy in 1971. Early work by Krouse established the wide natural range in the atomic weight of significant elements including carbon and sulfur. Currently, researchers at the University of Calgary in physics, environmental science, chemistry and geoscience are exploiting variations in atomic weights to elucidate the origins of meteorites, to determine sources of pollutants in air and water, and to study the fate of injected carbon dioxide in geological media.

This fundamental change in the presentation of the atomic weights is based upon work between 1985 and 2010 supported by IUPAC, the University of Calgary and other contributing Commission members and institutions.

University of Calgary

ANCIENT FOREST EMERGES MUMMIFIED FROM THE ARCTIC

0 comentarios

The northernmost mummified forest ever found in Canada is revealing how plants struggled to endure a long-ago global cooling.

Researchers believe the trees -- buried by a landslide and exquisitely preserved 2 to 8 million years ago -- will help them predict how today’s Arctic will respond to global warming.

They also suspect that many more mummified forests could emerge across North America as Arctic ice continues to melt. As the wood is exposed and begins to rot, it could release significant amounts of methane and carbon dioxide into the atmosphere -- and actually boost global warming.

Joel Barker, a research scientist at Byrd Polar Research Center and the School of Earth Sciences at Ohio State University and leader of the team that is analyzing the remains, described early results at the American Geophysical Union meeting in San Francisco on Friday, December 17.

Over the summer of 2010, the researchers retrieved samples from broken tree trunks, branches, roots, and even leaves -- all perfectly preserved -- from Ellesmere Island National Park in Canada.

“Mummified forests aren’t so uncommon, but what makes this one unique is that it’s so far north. When the climate began to cool 11 million years ago, these plants would have been the first to feel the effects,” Barker said. “And because the trees’ organic material is preserved, we can get a high-resolution view of how quickly the climate changed and how the plants responded to that change.”

Barker found the deposit in 2009, when he was camping on Ellesmere Island for an unrelated research project. He followed a tip from a national park warden, who had noticed some wood sticking out of the mud next to a melting glacier. This summer, he returned with colleagues for a detailed study of the area.

Analysis of the remains has only just begun, but will include chemical and DNA testing.

For now, the researchers have identified the species of the most common trees at the site -- spruce and birch. The trees were at least 75 years old when they died, but spindly, with very narrow growth rings and under-sized leaves that suggest they were suffering a great deal of stress when they were alive.

“These trees lived at a particularly rough time in the Arctic,” Barker explained. “Ellesmere Island was quickly changing from a warm deciduous forest environment to an evergreen environment, on its way to the barren scrub we see today. The trees would have had to endure half of the year in darkness and in a cooling climate. That’s why the growth rings show that they grew so little, and so slowly.”

Colleagues at the University of Minnesota identified the wood from the deposit, and pollen analysis at a commercial laboratory in Calgary, Alberta revealed that the trees lived approximately 2 to 8 million years ago, during the Neogene Period. The pollen came from only a handful of plant species, which suggests that Arctic biodiversity had begun to suffer during that time as well.

The team is now working to identify other mummified plants at the site, scanning the remains under microscopes to uncover any possible seeds or insect remains.

Now that the forest is exposed, it’s begun to rot, which means that it’s releasing carbon into the atmosphere, where it can contribute to global warming.

Team member David Elliot, professor emeritus of earth sciences at Ohio State, said that the mummified forest on Ellesmere Island doesn’t pose an immediate threat to the environment, though.

“I want to be clear -- the carbon contained in the small deposit we’ve been studying is trivial compared to what you produce when you drive your car,” he said. “But if you look at this find in the context of the whole Arctic, then that is a different issue. I would expect other isolated deposits to be exposed as the ice melts, and all that biomass is eventually going to return to carbon dioxide if it’s exposed to the air.”

“It’s a big country, and unless people decide to walk all across the Canadian Arctic, we won’t know how many deposits are out there,” he added.

(Photo: Joel Barker, courtesy of Ohio State University)

Ohio State University

CITIZEN SCIENTISTS JOIN SEARCH FOR EARTH-LIKE PLANETS

0 comentarios

Web users around the globe will be able to help professional astronomers in their search for Earth-like planets thanks to a new online citizen science project called Planet Hunters, which launches Dec. 16 at www.planethunters.org.

Planet Hunters, which is the latest in the Zooniverse citizen science project collection, will ask users to help analyze data taken by NASA’s Kepler mission. The space telescope has been searching for planets beyond our own solar system—called exoplanets—since its launch in March 2009.

“The Kepler mission has given us another mountain of data to sort through,” said Kevin Schawinski, a Yale University astronomer and Planet Hunters co-founder. Schawinski also helped create the Galaxy Zoo citizen science project several years ago, which enlisted hundreds of thousands of web users around the world to help sort through and classify a million images of galaxies taken by a robotic telescope.

The Kepler space telescope is continually monitoring nearly 150,000 stars in the nearby constellation Cygnus, recording their brightness over time. Astronomers analyze these brightness records, looking for any stars that show a slight dimming. This dimming could represent a planet passing in front of its host star, blocking a tiny fraction of its light as seen from Kepler's vantage point in space. Those stars that periodically dim are the best candidates for hosting relatively small planets that tightly orbit their stars, similar to Earth.
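The search described above can be sketched in a few lines of code. The example below is a toy illustration in Python (the function name, threshold and synthetic light curve are our own assumptions, not part of the Kepler pipeline or the Planet Hunters site); it simply flags points in a light curve that dip noticeably below the star's typical brightness, the kind of repeated signature volunteers are asked to spot:

import statistics

def flag_dips(flux, depth_threshold=0.99):
    # Return indices where brightness drops below a fraction of the median.
    baseline = statistics.median(flux)
    return [i for i, f in enumerate(flux) if f < depth_threshold * baseline]

# Toy light curve: steady brightness with two shallow, repeated dips,
# roughly what a small transiting planet would produce.
flux = [1.00] * 30
for start in (5, 20):              # two transits of the same "planet"
    for i in range(start, start + 3):
        flux[i] = 0.985            # a dip of about 1.5 percent

print(flag_dips(flux))             # [5, 6, 7, 20, 21, 22]

A real analysis also has to detrend the data, fold it on trial orbital periods and rule out signals that merely mimic a transit, which is exactly where human pattern recognition is meant to help.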

“The Kepler mission will likely quadruple the number of planets that have been found in the last 15 years, and it’s terrific that NASA is releasing this amazing data into the public domain,” said Debra Fischer, a Yale astronomer and leading exoplanet hunter. Although Planet Hunters is not tied directly to the Kepler mission, the website will serve as a complement to the work being done by the Kepler team to analyze the data.

Because of the huge amount of data being made available by Kepler, astronomers rely on computers to help them sort through the data and search for possible planet candidates. “But computers are only good at finding what they’ve been taught to look for,” said Meg Schwamb, another Yale astronomer and Planet Hunters co-founder. “Whereas the human brain has the uncanny ability to recognize patterns and immediately pick out what is strange or unique, far beyond what we can teach machines to do.”

After the success of the Galaxy Zoo project, the Yale team decided to enlist web users once again to create what they hope will become a global network of human computing power.

When users log on to the Planet Hunters website, they’ll be asked to answer a series of simple questions about one of the stars’ light curves—a graph displaying the amount of light emitted by the star over time—to help the Yale astronomers determine whether it displays a repetitive dimming of light, identifying it as an exoplanet candidate.

“The great thing about this project is that it gives the public a front row seat to participate in frontier scientific research,” Schwamb said.

The possibility of Earth-like planets beyond our own solar system has captured the collective human imagination for centuries. Today, astronomers have discovered more than 500 planets orbiting stars other than the Sun—yet almost all of these so-called exoplanets are large gas giants, similar to Jupiter, which bear little resemblance to Earth. Ever since the first exoplanet was discovered in 1995, astronomers have raced to find ever smaller planets closer to our own world.

“The search for planets is the search for life,” Fischer said. “And at least for life as we know it, that means finding a planet similar to Earth.” Scientists believe Earth-like planets are the best place to look for life because they are the right size and orbit their host stars at the right distance to support liquid water, an essential ingredient for every form of life found on Earth.

Yet Fischer is quick to caution that, even with the exceptional data from the Kepler telescope, it will be extremely difficult to pick out the weak signal created by such a small planet as it dims its host star. “Planet Hunters is an experiment—we’re looking for the needle in the haystack,” she said.

Still, Galaxy Zoo proved that ordinary people can make extraordinary discoveries. Several Galaxy Zoo users, most of whom had no prior background in astronomy, were listed as co-authors on more than 20 published scientific papers that resulted from the citizen science project.

“The point of citizen science is to actively involve people in real research,” Schawinski said. “When you join Planet Hunters, you’re contributing to actual science—and you might just make a real discovery.”

(Photo: Haven Giguere/Yale)

Yale University

STUDY SHOWS CAFFEINE NEGATIVELY AFFECTS CHILDREN

0 comentarios
Caffeine consumption in children is often blamed for sleep problems and bedwetting. Information on childhood caffeine consumption is limited, and many parents may not know the amount or effects of their child's caffeine consumption. In a study published in The Journal of Pediatrics, researchers found that 75% of children surveyed consumed caffeine on a daily basis, and the more caffeine the children consumed, the less they slept.

Dr. William Warzak and colleagues from the University of Nebraska Medical Center surveyed the parents of over 200 children 5 to 12 years old during routine clinical visits at an urban pediatric clinic. Parents were asked to report the types and amounts of snacks and beverages their child consumed on a daily basis.

According to Dr. Warzak, "Some children as young as 5 years old were consuming the equivalent of a can of soda a day." The authors also noticed that the older children drank more caffeinated beverages. "Children between the ages of 8 and 12 years consumed an average of 109 mg a day," Dr. Warzak explains, "the equivalent of almost 3 12-ounce cans of soda."
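As a rough check on that arithmetic (our own back-of-the-envelope estimate, not a figure from the study), a typical 12-ounce cola contains roughly 35 to 40 mg of caffeine, so three cans at about 37 mg each come to roughly 110 mg, in line with the reported average of 109 mg a day.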

Researchers found, however, that caffeine was not linked to bedwetting in these children. "Contrary to popular belief," Dr. Evans, coauthor and statistician, clarifies, "children were not more likely to wet the bed if they consumed caffeine, despite the fact that caffeine is a diuretic."

The study authors stress the importance of parental awareness regarding their child's caffeine consumption. "Parents should be aware of the potentially negative influence of caffeine on a child's sleep quality and daily functioning," Dr. Warzak asserts. The authors suggest that primary care pediatricians may be able to help by screening patients for caffeine consumption and educating parents about the potentially harmful effects of caffeine.

Elsevier Health Sciences

PLASMA THERAPY: AN ALTERNATIVE TO ANTIBIOTICS?

0 comentarios
Cold plasma jets could be a safe, effective alternative to antibiotics to treat multi-drug resistant infections, says a study published in the January issue of the Journal of Medical Microbiology on 15 December.

The team of Russian and German researchers showed that a ten-minute treatment with low-temperature plasma was not only able to kill drug-resistant bacteria causing wound infections in rats but also increased the rate of wound healing. The findings suggest that cold plasmas might be a promising method to treat chronic wound infections where other approaches fail.

The team from the Gamaleya Institute of Epidemiology and Microbiology in Moscow tested a low-temperature plasma torch against bacterial species including Pseudomonas aeruginosa and Staphylococcus aureus. These species are common culprits of chronic wound infections and are able to resist the action of antibiotics because they can grow together in protective layers called biofilms. The scientists showed not only that plasma was lethal to up to 99% of bacteria in laboratory-grown biofilms after five minutes, but also that plasma killed, on average, about 90% of the bacteria infecting skin wounds in rats after ten minutes.

Plasmas are known as the fourth state of matter after solids, liquids and gases and are formed when high-energy processes strip atoms of their electrons to produce ionized gas flows at high temperature. They have an increasing number of technical and medical applications and hot plasmas are already used to disinfect surgical instruments.

Dr Svetlana Ermolaeva who conducted the research explained that the recent development of cold plasmas with temperatures of 35-40°C makes the technology an attractive option for treating infections. “Cold plasmas are able to kill bacteria by damaging microbial DNA and surface structures without being harmful to human tissues. Importantly we have shown that plasma is able to kill bacteria growing in biofilms in wounds, although thicker biofilms show some resistance to treatment.”

Plasma technology could eventually represent a better alternative to antibiotics, according to Dr Ermolaeva. “Our work demonstrates that plasma is effective against pathogenic bacteria with multiple-antibiotic resistance - not just in Petri dishes but in actual infected wounds,” she said. “Another huge advantage to plasma therapy is that it is non-specific, meaning it is much harder for bacteria to develop resistance. It’s a method that is contact free, painless and does not contribute to chemical contamination of the environment.”

Society for General Microbiology

BERING SEA WAS ICE-FREE AND FULL OF LIFE DURING LAST WARM PERIOD, STUDY FINDS

0 comentarios

Deep sediment cores retrieved from the Bering Sea floor indicate that the region was ice-free all year and biological productivity was high during the last major warm period in Earth's climate history.

Christina Ravelo, professor of ocean sciences at the University of California, Santa Cruz, presented the new findings in a talk on December 13 at the fall meeting of the American Geophysical Union (AGU) in San Francisco. Ravelo and co-chief scientist Kozo Takahashi of Kyushu University, Japan, led a nine-week expedition of the Integrated Ocean Drilling Program (IODP) to the Bering Sea last summer aboard the research vessel JOIDES Resolution. The researchers drilled down 700 meters through rock and sludge to retrieve sediments deposited during the Pliocene Warm Period, 3.5 to 4.5 million years ago.

"Evidence from the Pliocene Warm Period is relevant to studies of current climate change because it was the last time in our Earth's history when global temperatures were higher than today," Ravelo said.

Carbon dioxide levels during the Pliocene Warm Period were also comparable to levels today, and average temperatures were a few degrees higher, she said. Climate scientists are interested in what this period may tell us about the effects of global warming, particularly in the polar regions. Current observations show more rapid warming in the Arctic compared to other places on Earth and compared to what was expected based on global climate models.

Ravelo's team found evidence of similar amplified warming at the poles during the Pliocene Warm Period. Analysis of the sediment samples indicated that average sea surface temperatures in the Bering Sea were at least 5 degrees Celsius warmer than today, while average global temperatures were only 3 degrees warmer than today.

Samples from the expedition showed evidence of consistently high biological productivity in the Bering Sea throughout the past five million years. The sediments contain fossils of plankton, such as diatoms, that suggest a robust ecology of organisms persisting from the start of the Pliocene Warm Period to the present. In addition, samples from the Pliocene Warm Period include deep-water organisms that require more oxygenated conditions than exist today, suggesting that the mixing of water layers in the Bering Sea was greater than it is now, Ravelo said.

"We usually think of the ocean as being more stratified during warm periods, with less vertical movement in the water column," she said. "If the ocean was actually overturning more during a period when it was warmer than today, then we may need to change our thinking about ocean circulation."

Today, the Bering Sea is ice-free only during the summer, but the sediment samples indicate it was ice-free year-round during the Pliocene Warm Period. According to Ravelo, the samples showed no evidence of the pebbles and other debris that ice floes carry from the land out to sea and deposit on the seafloor as they melt. In addition, the researchers didn't find any of the microorganisms typically associated with sea ice, she said.

"The information we found tells us quite a bit about what things were like during the last period of global warming. It should benefit the scientists today who are sorting out how ocean circulation and conditions at the poles change as the Earth warms," Ravelo said.

The expedition led by Ravelo and Takahashi was part of an ongoing program conducted by the IODP with funding from the National Science Foundation and support from the United States, Japan, and the European Union. The JOIDES Resolution is the only ship operated by the United States capable of taking undisturbed core samples at the depths required to study conditions during the Pliocene Warm Period. The current program will end in 2013, and planning for the next phase of ocean drilling is now under way.

(Photo: Carlos Alvarez Zarikian, IODP/TAMU)

University of California, Santa Cruz

IAPETUS, SATURN'S STRANGE WALNUT MOON

0 comentarios
As space-based probes and telescopes reveal new and unimaginable features of our universe, a geological landmark on Saturn's moon Iapetus is among the most peculiar.

Images provided by NASA's Cassini spacecraft in 2005 reveal an almost straight-line equatorial mountain range that towers upwards of 12 miles and spreads as wide as 60 miles, encircling more than 75 percent of Iapetus, the ringed planet's third-largest moon, and causing it to resemble a walnut.

"There's nothing else like it in the solar system," said Andrew Dombard, associate professor of earth and environmental sciences at the University of Illinois at Chicago. "It's something we've never seen before and didn't expect to see."

Some scientists have hypothesized that Iapetus's mountains were formed by internal forces such as volcanism. Dombard and his collaborators think otherwise: the mountains, they argue, resulted from icy debris raining down from a sub-satellite, or mini-moon, orbiting Iapetus that broke apart under the larger moon's tidal forces. His co-authors are Andrew Cheng, chief scientist in the space department at the Johns Hopkins University Applied Physics Laboratory; William McKinnon, professor of earth and planetary sciences at Washington University in St. Louis; and Jonathan Kay, a UIC graduate student studying with Dombard.

"Imagine all of these particles coming down horizontally across the equatorial surface at about 400 meters per second -- the speed of a rifle bullet, one after another, like frozen baseballs," said McKinnon. "At first the debris would have made holes to form a groove that eventually filled up."

Dombard and his collaborators think the phenomenon is the result of what planetary scientists call a giant impact, where crashing and coalescing debris during the solar system's formation more than 4 billion years ago created satellites such as the Earth's Moon and Pluto's largest satellite Charon.

They've done a preliminary analysis demonstrating the plausibility of impact formation and subsequent evolution of Iapetus's sub-satellite. Dombard said Iapetus is the solar system's moon with the largest "hill sphere" -- the zone surrounding a moon within which the moon's own gravity dominates over that of the planet it circles.
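For reference, a standard approximation from celestial mechanics (not a figure given by the researchers) puts the radius of the hill sphere at roughly

r_H ≈ a × (m / (3M))^(1/3)

where a is the moon's distance from its planet, m is the moon's mass and M is the planet's mass. A moon that is both far from its planet and relatively massive, as Iapetus is relative to Saturn, therefore commands the largest region in which it could hold on to a sub-satellite of its own.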

"It is the only moon far enough from its planet, and large enough relative to its planet, that a giant impact may be able to form a sub-satellite," said Cheng.

This lends plausibility to the rain of debris along the equator hypothesis, Dombard said, but he adds that more sophisticated computer modeling and analysis is planned in the coming years to back it up.

Other explanations have been proposed by scientists as to what caused this odd formation of mountains on Iapetus, but Dombard said they all have shortcomings.

"There are three critical observations that you need to explain," he said. "Why the mountains sit on the equator, why it's found only on the equator, and why only on Iapetus? Previous models address maybe one or two of those critical observations. We think we can explain all three."

University of Illinois at Chicago

UF STUDY PROVIDES NEW INSIGHT INTO ORIGIN, EVOLUTION OF FLOWERING PLANTS

0 comentarios
A new University of Florida study to be published online in the Proceedings of the National Academy of Sciences presents the deepest insight yet into the genes that made up the first flower, the common ancestor of all flowering plants, and how those genes have changed over time.

“Our survival depends on products we get from the flower — grains, fruits and many other materials,” said Doug Soltis, UF distinguished professor of biology and project co-investigator. “Crop improvement is so important, but you don’t understand how a flower is put together unless you have a reference point – you can’t modify what you can’t understand.”

After nearly 10 years of research funded by the National Science Foundation, scientists from the Florida Museum of Natural History, the UF department of biology, and the UF Genetics Institute are bringing the study to a close.

“There are 350,000 species of flowering plants (or angiosperms), and they serve as the foundation of nearly all of Earth’s ecosystems, yet we don’t know how the flower originated,” said Pam Soltis, UF distinguished professor, Florida Museum of Natural History curator and project co-investigator. “We now know the origin of many of the genes responsible for making a flower and how those genes have changed during the history of angiosperms.”

A 2009 UF study traced the origin of flowers using genetic data for the avocado (a representative of one of the early lineages of flowering plants) and a well-known plant in genetics research, Arabidopsis thaliana. The new study includes additional comparisons with a water lily, California poppy and cycad (a gymnosperm or non-flowering seed plant) and shows how the first flowers evolved from pre-existing genetic programs in gymnosperm cones.

“We have a much better understanding of the flower than we did 10 years ago and it’s a huge improvement,” Doug Soltis said. “We don’t know every pathway, but we have a much better handle on what makes those parts tick.”

Typical angiosperms have flowers with four organs: sepals (typically green), petals (typically colorful), stamens (male organs, which produce pollen) and carpels (female organs, which produce eggs). But in the earliest flowers, the distinct borders between their floral organs fade to a blur. The flowers of early angiosperms have organs that merge into each other – for example, a stamen of a water lily produces pollen but it may also be petal-like and colorful.

“Our study found that the floral organs of basal angiosperms merge not only in appearance, but also in their underlying genetic pathways,” Pam Soltis said. “During evolution, the timing and location of where these genes act have become restricted, ultimately producing flowers with separate and distinguishable flower parts.”

“These missing links are incredibly important,” Doug Soltis said. “They are our key to the past.”

The study was a collaboration of researchers at UF, The Pennsylvania State University, the University of Georgia, and the University at Buffalo. The first author on the paper is Andre Chanderbali, a postdoctoral researcher at the Florida Museum and the UF department of biology.

“Flowers are the defining feature of angiosperms, the dominant vegetation of our world,” said Stanford University biology professor Virginia Walbot. “The new PNAS article by Chanderbali et al. represents a breakthrough in understanding the origin and evolutionary trajectories of the separate male and female floral parts.”

University of Florida

Monday, December 27, 2010

GEOLOGIST'S DISCOVERIES RESOLVE DEBATE ABOUT OXYGEN IN EARTH'S MANTLE

0 comentarios
While there continues to be considerable debate among geologists about the availability of oxygen in the Earth's mantle, recent discoveries by a University of Rhode Island scientist are bringing resolution to the question.

Analysis of erupted rock from Agrigan volcano in the western Pacific near Guam found it to be highly oxidized as a result of its exposure to oxygen when it formed in the Earth's mantle. When, over millions of years, seafloor rocks are transported back into the Earth's mantle at subduction zones – sites on the seafloor where tectonic plates have collided, forcing one plate beneath the other – they deliver more oxygen into the mantle.

The results of the research were presented at a meeting of the American Geophysical Union in San Francisco.

"The cycling of oxygen at the Earth's surface is central to the life and activity that takes place at the surface, but it is equally essential in the Earth's mantle," said URI Assistant Professor Katherine Kelley. "The availability of oxygen to the mantle is in part controlled by the oxygen at the surface."

Kelley said that this discovery is important because the availability of oxygen to the mantle controls what minerals are found there, how certain elements behave, and what kind of gasses might be expelled from volcanoes.

"The most primitive samples of lava we can identify are the most oxidized," she said. "That oxidation comes off the subducted plate at depth in the mantle and makes its way into volcanic magma sources that then erupt."

According to Kelley, some scientists have argued that the availability of oxygen to the mantle hasn't changed since the Earth was formed. However, if plate tectonics carry this oxidized material into the mantle, as she has demonstrated, then it is adding oxygen to the mantle. It also suggests that what takes place at the surface of the Earth probably influences what happens deep beneath the surface as well.

At Brookhaven National Laboratory, Kelley analyzed tiny olivine crystals that contain naturally formed glass from the early histories of magmas, in which are found dissolved gases from volcanic eruptions. By analyzing the glass she determined the oxidation state of iron in rocks and related it to the dissolved gases, which are elevated in subduction zone magmas.

This work follows a related study by Kelley that found that material from subduction zones is more oxidized than material from mid-ocean ridges, where the plates are pulling apart. That study was published in the journal Science in 2009.

"These are important processes to understand, but they are hard to get a clear picture of because they take place over such long periods of time," Kelley said. "It's one piece of the big puzzle of Earth's evolution and how it continues to change."

The University of Rhode Island

FIGHTER PILOTS' BRAINS ARE 'MORE SENSITIVE'

0 comentarios

Cognitive tests and MRI scans have shown significant differences in the brains of fighter pilots when compared to a control group, according to a new study led by scientists from UCL.

The study, published in the Journal of Neuroscience, compares the cognitive performance of 11 front-line RAF (Royal Air Force) Tornado fighter pilots to a control group of a similar IQ with no previous experience of piloting aircraft. All the participants completed two 'cognitive control' tasks which were used to investigate rapid decision making. Diffusion tensor imaging (DTI), a type of MRI brain scan, was then used to examine the structure of white matter connections between brain regions associated with cognitive control.

The researchers found that fighter pilots have superior cognitive control, showing significantly greater accuracy on one of the cognitive tasks, despite being more sensitive to irrelevant, distracting information. The MRI scans revealed differences between pilots and controls in the microstructure of white matter in the right hemisphere of the brain.

Senior author Professor Masud Husain, UCL Institute of Neurology and UCL Institute of Cognitive Neuroscience, said: "We were interested in the pilots because they're often operating at the limits of human cognitive capability – they are an expert group making precision choices at high speed.

"Our findings show that optimal cognitive control may surprisingly be mediated by enhanced responses to both relevant and irrelevant stimuli, and that such control is accompanied by structural alterations in the brain. This has implications beyond simple distinctions between fighter pilots and the rest of us because it suggests expertise in certain aspects of cognition are associated with changes in the connections between brain areas. So, it's not just that the relevant areas of the brain are larger – but that the connections between key areas are different. Whether people are born with these differences or develop them is currently not known."

The study tasks were designed to assess the influence of distracting information and the ability to update a response plan in the presence of conflicting visual information. In the first task, participants had to press a right or left arrow key in response to the direction of an arrow on a screen in front of them, which was flanked by other distracting arrows pointing in different directions. In the second task, they had to respond as quickly as possible to a 'go' signal, unless they were instructed to change their plan before they had even made a response.
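A toy version of the first task is easy to picture in code. The sketch below (Python; illustrative only, not the researchers' actual stimuli or software) builds a five-arrow display in which the flanking arrows may conflict with the central target, the feature that makes the task a test of resisting irrelevant information:

import random

def make_trial():
    # Build a 5-arrow display; the middle arrow is the target.
    target = random.choice(["<", ">"])
    flanker = random.choice(["<", ">"])            # may point the other way
    display = flanker * 2 + target + flanker * 2   # e.g. "<<><<"
    return display, target

def is_correct(response, target):
    # A response is correct if it matches the central arrow's direction.
    return response == target

display, target = make_trial()
print("Display:", display)
print(is_correct(">", target))   # example response: the right-arrow key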

The results of the first task showed that the expert pilots were more accurate than age-matched volunteers, with no significant difference in reaction time: the pilots performed the task at the same speed but with significantly higher accuracy. In the second task, there was no significant difference between the pilots and volunteers, which the authors say suggests that expertise in cognitive control may be highly specialised and particular to specific tasks, rather than reflecting an overall enhancement in performance.

These findings suggest that in humans some types of expert cognitive control may be mediated by enhanced response gain to both relevant and irrelevant stimuli and accompanied by structural alterations in the white matter of the brain.

(Photo: UCL)

UCL
