Monday, August 31, 2009



Plastic that conducts electricity holds promise for cheaper, thinner and more flexible electronics. This technology is already available in some gadgets -- the new Sony Walkman introduced earlier this summer and the Microsoft Zune HD music player released recently both incorporate organic light-emitting diode (OLED) displays. Until now, however, circuits built with organic materials have allowed only one type of charge to move through them. New research from the University of Washington makes charges flow both ways.

The cover article in an upcoming issue of the journal Advanced Materials describes an approach to organic electronics that allows transport of both positive and negative charges.

"The organic semiconductors developed over the past 20 years have one important drawback. It's very difficult to get electrons to move through," said lead author Samson Jenekhe, a UW professor of chemical engineering. "By now having polymer semiconductors that can transmit both positive and negative charges, it broadens the available approaches. This would certainly change the way we do things."

Co-authors are Felix Kim, a doctoral student working with Jenekhe, and graduate student Xugang Guo and assistant professor Mark Watson at the University of Kentucky. The research was funded by the National Science Foundation, the Department of Energy and the Ford Foundation.

Silicon Valley got its name for a reason: Silicon is the "workhorse" of today's electronics industry, Jenekhe said. However, silicon is fairly expensive, requires costly manufacturing and, because of its rigid crystal form, does not allow flexible devices.

About 30 years ago it was discovered that some plastics, or polymers, can conduct electricity. Since then researchers have been working to make them more efficient. Organic, or carbon-based, electronics are now used in such things as laptop computers, car audio systems and mp3 players.

A major drawback with existing organic semiconductors is that most transmit only positive charges (called "holes" because the moving areas of positive charge are actually places where an electron is missing). In the last decade a few organic materials have been developed that can transport only electrons. But making a working organic circuit has meant carefully layering two complicated patterns on top of one another: one that transports electrons and another that transports holes.

"Because current organic semiconductors have this limitation, the way they're currently used has to compensate for that, which has led to all kinds of complex processes and complications," Jenekhe said.

For more than a decade Jenekhe's lab has been a leader in developing organic semiconductors that can transmit electrons. Over the past few years the group has created polymers with a donor and an acceptor part, and carefully adjusted the strength of each one. In collaboration with Watson's lab, they have now developed an organic molecule that works to transport both positive and negative charges.

"What we have shown in this paper is that you don't have to use two separate organic semiconductors," Jenekhe said. "You can use one material to create electronic circuits."

The material would allow organic transistors and other information-processing devices to be built more simply, in a way that is more similar to how inorganic circuits are now made.

The group used the new material to build a transistor designed in the same way as a silicon model; the results show that both electrons and holes move through the device quickly.

The results represent the best performance ever seen in a single-component organic polymer semiconductor, Jenekhe said. Electrons moved five to eight times faster through the UW device than in any other such polymer transistor. A circuit, which consists of two or more integrated devices, generated a voltage gain two to five times greater than previously seen in a polymer circuit.

"We expect people to use this approach," Jenekhe said. "We've opened the way for people to know how to do it."

(Photo: University of Washington)


Researchers at the University of Oklahoma Health Sciences Center have shown the first link between a newly discovered anti-aging gene and high blood pressure. The results, which appear this month in the journal Hypertension, offer new clues on how we age and how we might live longer.

Persistent hypertension, or high blood pressure, is a risk factor for stroke, heart attack, heart failure and arterial aneurysm, and is the leading cause of chronic kidney failure. Even a modest elevation of arterial blood pressure leads to shortened life expectancy.

Researchers, led by principal investigator Zhongjie Sun, tested the effect of an anti-aging gene called klotho on reducing hypertension. They found that by increasing the expression of the gene in laboratory models, they not only stopped blood pressure from continuing to rise, but succeeded in lowering it. Perhaps most impressive was the complete reversal of kidney damage, which is associated with prolonged high blood pressure and often leads to kidney failure.

“One single injection of the klotho gene can reduce hypertension for at least 12 weeks and possibly longer. Klotho is also available as a protein and, conceivably, we could ingest it as a powder much like we do with protein drinks,” said Sun, M.D., Ph.D., a cardiovascular expert at the OU College of Medicine.

Scientists have been working with the klotho gene and its link to aging since 1997 when it was discovered by Japanese scientists. This is the first study showing that a decline in klotho protein level may be involved in the progression of hypertension and kidney damage, Sun said. With age, the klotho level decreases while the prevalence of hypertension increases.

Researchers used one injection of the klotho gene in hypertensive research models and were able to markedly reduce blood pressure by the second week. It continued to decline steadily for the length of the project – 12 weeks. The klotho gene was delivered with a safe viral vector that is currently used for gene therapy. The virus is already approved by the U.S. Food and Drug Administration for use in humans.

Researchers are studying the gene’s effect for longer periods to test its ability to return blood pressure levels to normal. They also are looking at whether klotho can prevent hypertension.

University of Oklahoma Health Sciences Center



It's been a mystery: how can our teeth withstand such an enormous amount of pressure, over many years, when tooth enamel is only about as strong as glass? A new study by Prof. Herzl Chai of Tel Aviv University's School of Mechanical Engineering and his colleagues at the National Institute of Standards and Technology and George Washington University gives the answer.

The researchers applied varying degrees of mechanical pressure to hundreds of extracted teeth, and studied what occurred on the surface and deep inside them. The study, published in the May 5, 2009, issue of the Proceedings of the National Academy of Sciences, shows that it is the highly sophisticated structure of our teeth that keeps them in one piece — and that structure holds promising clues for aerospace engineers as they build the aircraft and space vehicles of the future.

"Teeth are made from an extremely sophisticated composite material which reacts in an extraordinary way under pressure," says Prof. Chai. "Teeth exhibit graded mechanical properties and a cathedral-like geometry, and over time they develop a network of micro-cracks which help diffuse stress. This, and the tooth's built-in ability to heal the micro-cracks over time, prevents it from fracturing into large pieces when we eat hard food, like nuts."

The automotive and aviation industries already use sophisticated materials to prevent break-up on impact. For example, airplane bodies are made from composite materials — layers of glass or carbon fibers — held together by a brittle matrix.

In teeth, though, fibers aren't arranged in a grid, but are "wavy" in structure. There are hierarchies of fibers and matrices arranged in several layers, unlike the single-thickness layers used in aircraft. Under mechanical pressure, this architecture presents no clear path for the release of stress. Therefore, "tufts" — built-in micro-cracks — absorb pressure in unison to prevent splits and major fractures. As Prof. Chai puts it, tooth fractures "have a hard time deciding which way to go," making the tooth more resistant to cracking apart. Harnessing this property could lead to a new generation of much stronger composites for planes.

Prof. Chai, himself an aerospace engineer, suggests that if engineers can incorporate tooth enamel's wavy hierarchy, micro-cracking mechanism, and capacity to heal, lighter and stronger aircraft and space vehicles can be developed. And while creating a self-healing airplane is far in the future, this significant research on the composite structure of teeth can already begin to inspire aerospace engineers — and, of course, dentists.

Dental specialists looking for new ways to engineer that picture-perfect Hollywood smile can use Prof. Chai's basic research to help invent stronger crowns, better able to withstand oral wear-and-tear. "They can create smart materials that mimic the properties found in real teeth," he says.

In natural teeth, there may not be any way to speed up the self-healing ability of tooth enamel, which the Tel Aviv University research found is accomplished by a glue-like substance that fills in micro-cracks over time. But fluoride treatments and healthy brushing habits can help to fill in the tiny cracks and keep teeth strong.

(Photo: TAU)

Tel Aviv University

Friday, August 28, 2009


Organisms are genetically programmed to cheat the system and have to be policed to stop them putting their needs ahead of society and thus threatening its survival, say scientists.

University of Manchester researchers have shown that even the simplest organisms have complex social behaviours. Dr Chris Thompson and Dr Jason Wolf’s study of slime moulds has shown that these microscopic organisms – which share many of their genes with humans – respond to competition, trying to get the upper hand with a variety of strategies including cheating. However, the shift in behaviour is extremely complex. Individuals can cheat by promoting their own self-interest or can coerce others to perform the altruistic act. Ultimately this balance may mean the species – or society – survives.

By illuminating general principles of how organisms cheat, their study could help us understand what drives – and what limits – selfish behaviour such as MPs fiddling their expenses.

Dr Thompson, of the Faculty of Life Sciences, explains: “Using slime mould allows us to look at social behaviour in its most basic form. They are single cell organisms that just divide; there is no experience, their social behaviour is simply genetically controlled.

“However they do work together and we have now shown for the first time they do have a complex social life that involves both cheating and coercion, which ensures the survival of the species. We are now working to identify the genes behind this.

“Since humans share many of the same genes, they will behave the same way.”

The paper, published in the latest Current Biology (23 July 2009), is the latest in a series that asks: why are organisms social? Why do they cooperate with one another when, according to natural selection, they should not? They should be fighting to get ahead.

Dr Thompson adds: “It was one of Darwin’s biggest challenges. If individuals did cheat and put themselves first all the time, the species would collapse.”

In slime mould, some amoebae make spores – thus gaining the reproductive advantage – while others make stalks and die. Making stalks is an altruistic act. So why do some make stalks, even though they do not enjoy the reproductive advantage? The trouble is that if everyone cheated, there would be no stalk, and everyone would suffer because fitness would be reduced.

Dr Thompson says: “This latest paper looks at whether organisms are cheating or just choosing the best strategy. If you use the analogy of two men in a sinking boat, with one man bailing more slowly than the other, it may be that he is cheating and allowing the other to do most of the work. Or it may be that he has a better or equally good strategy, as bailing more slowly allows him to conserve energy and actually bail for a longer time.

“We looked at how slime moulds behaved when alone and found some were making more spores. So they were not cheating after all, they were simply following their chosen strategy.

“However we then looked at how these slime moulds behaved when they were mixed with others and found that they recognised that they were mixed with foreigners and changed their strategy: they did respond to competition.

“It is amazing how complex their ability is to recognise foreigners and shift their behaviour. Sometimes if one is making more spores then the other will make more spores in what we term self promotion. But if everyone did this, then over time you end up with no stalks – everyone is trying to make themselves better and better and better until it becomes spiteful and bloody minded. If everyone is making more spores and no stalks then the system collapses. You need policing or coercion to stop that happening. Somehow some cells are forced to make stalks.

“Now we want to know how organisms recognise foreigners and how they then force others to do something that benefits the species more than themselves.”

He adds: “Working with slime mould is fantastic. It allows us to look at social behaviour in its most basic form. We can use this to understand how organisms work together and form colonies. For example, tooth decay is caused by colony-forming bacteria, and organisms form biofilms and secrete group products to protect against antibiotics. So our findings have a wide application, from the practical – why it can be difficult to stop tooth decay – to bigger issues such as evolution on the planet as we know it.

“People might wonder why bother studying slime mould but it could lead to a greater understanding of human behaviour. We know that human behaviour, at least in part, is influenced by our genes, so studying behaviour at a cellular level can improve our understanding of why some genes are associated with cooperation and others with conflict. Cooperation is a major driving force in evolution and understanding it is a huge challenge in biology. In society, people help each other; they work together within a social structure for a common good even if that means individual effort or sacrifice. I'm interested in finding out what keeps things fair and how cooperation is stabilized in the face of selfish cheats.”

University of Manchester



Bacterial infection is a major health threat to patients with severe burns and other kinds of serious wounds such as traumatic bone fractures. Recent studies have identified an important new weapon for fighting infection and healing wounds: insulin.

Now, using tiny nanodiamonds, researchers at Northwestern University have demonstrated an innovative method for delivering and releasing the curative hormone at a specific location over a period of time. The nanodiamond-insulin clusters hold promise for wound-healing applications and could be integrated into gels, ointments, bandages or suture materials.

Localized release of a therapeutic is a major challenge in biomedicine. The Northwestern method takes advantage of a condition typically found at a wound site -- skin pH can become very basic during the repair and healing process. The researchers found that the insulin, bound firmly to the tiny carbon-based nanodiamonds, is released when it encounters basic pH levels, similar to those commonly observed in bacterially infected wounds. These basic pH levels are significantly greater than the physiological pH level of 7.4.

The results of the study were published online July 26 by the journal Biomaterials.

“This study introduces the concept of nanodiamond-mediated release of therapeutic proteins,” said Dean Ho, assistant professor of biomedical engineering and mechanical engineering at the McCormick School of Engineering and Applied Science. Ho led the research. “It’s a tricky problem because proteins, even small ones like insulin, bind so well to the nanodiamonds. But, in this case, the right pH level effectively triggers the release of the insulin.”

A substantial amount of insulin can be loaded onto the nanodiamonds, which have a high surface area. The nanodiamond-insulin clusters, by releasing insulin in alkaline wound areas, could accelerate the healing process and decrease the incidence of infection. Ho says this ability to release therapeutics from the nanodiamonds on demand represents an exciting strategy towards enhancing the specificity of wound treatment.

In their studies, Ho and his colleagues showed that the insulin was very tightly bound to the nanodiamonds when in an aqueous solution near the normal physiological pH level. Measurements of insulin function revealed that the protein was virtually inactive when bound to the nanodiamonds -- a beneficial property for preventing excess or unnecessary drug release.

Upon increasing the pH to the basic levels commonly observed in the skin during severe burns, the researchers confirmed the insulin was released from the nanodiamond clusters and retained its function. Exploiting this pH-mediated release mechanism may provide unique advantages for enhanced drug delivery methods.

The researchers also found the insulin slowly and consistently released from the nanodiamond clusters over a period of several days.

Insulin accelerates wound healing by acting as a growth hormone. It encourages skin cells to proliferate and divide, restores blood flow to the wound, suppresses inflammation and fights infection. Earlier investigations have confirmed an increase in alkalinity of wound tissue, due to bacterial colonization, to levels as high as pH 10.5, the pH level that promoted insulin release from the nanodiamonds in the Northwestern study.

Ho’s group next will work on integrating the nanodiamond-insulin complexes into a gel and conducting preclinical studies. The researchers also will investigate different areas of medicine in which the nanodiamond-insulin clusters could be used.

Nanodiamonds have many advantages for biomedical applications. The large surface area allows a large amount of therapeutic to be loaded onto the particles. They can be functionalized with nearly any type of therapeutic, including small molecules, proteins and antibodies. They can be suspended easily in water, an important property in biomedicine. The nanodiamonds, each being four to six nanometers in diameter, are minimally invasive to cells, biocompatible and do not cause inflammation, a serious complication. And they are very scalable and can be produced in large quantities in uniform sizes.

By harnessing the unique surface properties of the nanodiamonds, Ho and his colleagues have demonstrated that the nanodiamonds serve as platforms that can successfully bind, deliver and release several classes of therapeutics, which could impact a broad range of medical needs.

Ho’s research group also has studied nanodiamonds for applications in cancer therapy. They demonstrated that nanodiamonds are capable of releasing the chemotherapy agent Doxorubicin in a sustained and consistent manner. (Ho is a member of the Robert H. Lurie Comprehensive Cancer Center of Northwestern University.)

In addition to using the nanodiamonds in their particle form, Ho’s group has developed devices that harness the slow drug-release capabilities of the nanodiamonds. More recently, his team has shown that nanodiamonds are effective in dispersing insoluble drugs in water, boosting their potential for broader applications in medicine.

(Photo: Andrew Campbell)

Northwestern University



Scientists at the Universities of Liverpool and Birmingham have found that orang-utans move through the canopy of tropical forests in a completely different way to all other tree-dwelling primates.

Movement through a complex meshwork of small branches at the heights of tropical forests presents a unique challenge to animals wanting to forage for food safely. It can be particularly dangerous for large animals where a fall of up to 30m could be fatal. Scientists found that dangerous tree vibrations can be countered by the orang-utan’s ability to move with an irregular rhythm.

Professor Robin Crompton, from the University of Liverpool’s School of Biomedical Sciences, explained that these challenges were similar to the difficulties engineers encountered with London’s ‘wobbly’ Millennium Bridge: “The problems with the Millennium Bridge were caused by large numbers of people walking in sync with the slight sideways motion of the bridge. This regular pattern of movement made the swaying motion of the bridge even worse. We see a similar problem in the movement of animals through the canopy of tropical forests, where there are highly flexible branches.

“Most animals, such as the chimpanzee, respond to these challenges by flexing their limbs to bring their body closer to the branch. Orang-utans, however, are the largest arboreal mammal and so they are likely to face more severe difficulties due to weight. If they move in a regular fashion, like their smaller relatives, we get a ‘wobbly bridge’ situation, whereby the movement of the branches increases.”

Dr Susannah Thorpe, from the University of Birmingham’s School of Biosciences, added: “Orang-utans have developed a unique way of coping with these problems; they move in an irregular way which includes upright walking, four-limbed suspension from branches and tree-swaying, whereby they move branches backwards and forwards, with increasing magnitude, until they are able to cross large gaps between trees.”

The team studied orang-utans in Sumatra, where the animal is predicted to be the first great ape to become extinct. This new research could further understanding of the way orang-utans use their habitat, which could support new conservation programmes.

Dr Thorpe continued: “If the destruction of forest land does not slow down, the Sumatran orang-utan could be extinct within the next decade. Now that we know more about how they move through the trees and the unique way that they adapt to challenges in their environment we can better understand their needs. This could help with reintroducing rescued animals to the forests and efforts to conserve their environment.”

(Photo: U. Liverpool)

University of Liverpool


A new study has concluded that one key part of the immune system, the ability of vitamin D to regulate antimicrobial proteins, is so important that it has been conserved through almost 60 million years of evolution and is shared only by primates, including humans – but no other known animal species.

The fact that this vitamin-D mediated immune response has been retained through millions of years of evolutionary selection, and is still found in species ranging from squirrel monkeys to baboons and humans, suggests that it must be critical to their survival, researchers say.

Even though the "cathelicidin antimicrobial peptide" has several different biological activities in addition to killing pathogens, it's not clear which one, or combination of them, makes vitamin D so essential to its regulation.

The research also provides further evidence of the biological importance of adequate levels of vitamin D in humans and other primates, even as some studies and experts suggest that more than 50 percent of the children and adults in the U.S. are deficient in "the sunshine vitamin."

"The existence and importance of this part of our immune response makes it clear that humans and other primates need to maintain sufficient levels of vitamin D," said Adrian Gombart, an associate professor of biochemistry and a principal investigator with the Linus Pauling Institute at Oregon State University.

In a new study in the journal BMC Genomics, researchers from OSU and the Cedars-Sinai Medical Center describe the presence of a genetic element that's specific to primates and involved in the innate immune response. They found it not only in humans and their more recent primate ancestors, such as chimpanzees, but also primates that split off on the evolutionary tree tens of millions of years ago, such as old world and new world primates.

The genetic material – called an Alu short interspersed element – is part of what used to be thought of as "junk DNA," which makes up more than 90 percent of the human genome. Such genetic material, however, is now understood to often play important roles in regulating and "turning on" the expression of other genes.

In this case, the genetic element is believed to play a major role in the proper function of the "innate" immune system in primates – an ancient, first line of defense against bacteria, viruses and other pathogens, in which the body recognizes something that probably doesn't belong there, even though the specific pathogen may never have been encountered before.

"Many people are familiar with the role of our adaptive immune system, which is what happens when we mount a defense against a new invader and then retain antibodies and immunity in the future," Gombart said. "That's what makes a vaccine work. But also very important is the innate immune system, the almost immediate reaction your body has, for instance, when you get a cut or a skin infection."

In primates, this action of "turning on" an optimal response to microbial attack only works properly in the presence of adequate vitamin D, which is actually a type of hormone that circulates in the blood and signals to cells through a receptor. Vitamin D is produced in large amounts as a result of sun exposure, and is available in much smaller amounts from dietary sources.

Vitamin D prevents the "adaptive" immune response from over-reacting and reduces inflammation. The function of the new genetic element this research explored, however, allows vitamin D to boost the innate immune response by turning on an antimicrobial protein. The overall effect may help to prevent the immune system from overreacting.

"It's essential that we have both an innate immune response that provides an immediate and front line of defense, but we also have protection against an overreaction by the immune system, which is what you see in sepsis and some autoimmune or degenerative diseases," Gombart said. "This is a very delicate balancing act, and without sufficient levels of vitamin D you may not have an optimal response with either aspect of the immune system."

After years of research, scientists are continuing to find new roles that vitamin D plays in the human body. It can regulate the actions of genes that are important to bone health, calcium uptake, and inhibition of cell growth. It helps regulate cell differentiation and, of course, immune function.

"The antimicrobial peptide that we're studying seems to be involved not just in killing bacteria, but has other biological roles," Gombart said. "It recruits other immune cells and sort of sounds the alarm that something is wrong. It helps promote development of blood vessels, cell growth and healing of wounds. And it seems to have important roles in barrier tissues such as skin and the digestive system. Vitamin D is very important for the health of the skin and digestive system, and putting the cathelicidin antimicrobial peptide gene under its regulation may be important in this function."

Any one of those biological roles, or some combination of them, may be why vitamin D-mediated regulation of the antimicrobial peptide has been conserved in every primate species ever examined for its presence, researchers said, and did not disappear long ago through evolutionary variation and mutation. The evolution of primates into many different families and hundreds of species has been carefully tracked through genetic, molecular sequence and fossil studies, but the presence of this regulatory element in primates is still largely the same as it's been for more than 50 million years.

The evolutionary survival of this genetic element and the placement of the cathelicidin antimicrobial peptide gene under the regulation of the vitamin D pathway "may enable suppression of inflammation while potentiating innate immunity, thus maximizing the overall immune response to a pathogen and minimizing damage to the host," the researchers wrote in their conclusion.

Vitamin D deficiency is an issue of growing concern among many scientists, due to changing lifestyle or cultural trends in which many people around the world get less sun exposure and often inadequate dietary levels of the vitamin. It's a special problem for the elderly, who often have reduced exposure to sunlight and less ability to produce vitamin D in their skin – and, at least partly as a result, are more susceptible to bone fractures, chronic inflammation and infectious disease.

Oregon State University



When Planet Earth was just cooling down from its fiery creation, the sun was faint and young. So faint that it should not have been able to keep the oceans of earth from freezing. But fortunately for the creation of life, water was kept liquid on our young planet. For years scientists have debated what could have kept earth warm enough to prevent the oceans from freezing solid. Now a team of researchers from Tokyo Institute of Technology and University of Copenhagen's Department of Chemistry have coaxed an explanation out of ancient rocks, as reported in this week's issue of PNAS.

"The young sun was approximately 30 percent weaker than it is now, and the only way to prevent earth from turning into a massive snowball was a healthy helping of greenhouse gas," Associate Professor Matthew S. Johnson of the Department of Chemistry explains. And he has found the most likely candidate for an Archean atmospheric blanket: Carbonyl Sulphide, a product of the sulphur disgorged during millennia of volcanic activity.

"Carbonyl Sulphide is and was the perfect greenhouse gas. Much better than Carbon Dioxide. We estimate that a blanket of Carbonyl Sulphide would have provided about 30 percent extra energy to the surface of the planet. And that would have compensated for what was lacking from the sun", says Professor Johnson.
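
The arithmetic behind the faint-young-sun problem can be sketched with a standard no-greenhouse energy balance. The snippet below is an illustrative back-of-envelope calculation, not the paper's model: it assumes a modern solar constant of 1361 W/m² and a planetary albedo of 0.3, and uses Stefan-Boltzmann equilibrium to estimate surface temperature without any greenhouse blanket.

```python
import math

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S_NOW = 1361.0     # assumed modern solar constant, W m^-2
ALBEDO = 0.3       # assumed planetary albedo

def equilibrium_temp(solar_constant, albedo=ALBEDO):
    """No-greenhouse equilibrium temperature of a planet (kelvin)."""
    absorbed = solar_constant * (1.0 - albedo) / 4.0  # average absorbed flux
    return (absorbed / SIGMA) ** 0.25

t_now = equilibrium_temp(S_NOW)           # roughly 255 K even with today's sun
t_young = equilibrium_temp(0.70 * S_NOW)  # Archean sun ~30 percent weaker
print(round(t_now, 1), round(t_young, 1))
```

Even with today's sun the no-greenhouse temperature is about 255 K, already well below freezing; a sun 30 percent weaker lowers it by a further 20 K or so, which is why a strong greenhouse blanket is needed to explain liquid oceans on the early earth.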

To discover what could have helped the faint young sun warm early earth, Professor Johnson and his colleagues in Tokyo examined the ratio of sulphur isotopes in ancient rocks. What they saw was a strange signal: a mix of isotopes that could not very well have come from geological processes.

"There is really no process in the rocky mantle of earth that would explain this distribution of isotopes. You would need something happening in the atmosphere," says Johnson. The question was what. Painstaking experimentation helped them find a likely atmospheric process. By irradiating sulphur dioxide with different wavelengths of sunlight, they observed that sunlight passing through Carbonyl Sulphide gave them the wavelengths that produced the weird isotope mix.

"Shielding by Carbonyl Sulphide is really a pretty obvious candidate once you think about it, but until we looked, everyone had missed it," says Professor Johnson, and he continues.

"What we found is really an archaic analogue to the current ozone layer. A layer that protects us from ultraviolet radiation. But unlike ozone, Carbonyl Sulphide would also have kept the planet warm. The only problem is: It didn't stay warm".

As life emerged on earth it produced increasing amounts of oxygen. In an increasingly oxidizing atmosphere, the sulphur emitted by volcanoes was no longer converted to Carbonyl Sulphide; instead it was converted to sulphate aerosols, a powerful climate coolant. Johnson and his co-workers created a computer model of the ancient atmosphere. The model, in conjunction with laboratory experiments, suggests that the fall in Carbonyl Sulphide levels and the rise of sulphate aerosols would together have been responsible for creating snowball earth, the planet-wide ice age hypothesised to have taken place near the end of the Archean eon, 2,500 million years ago. To Johnson, the implications are alarming:

"Our research indicates that the distribution and composition of atmospheric gasses swung the planet from a state of life supporting warmth to a planet-wide ice-age spanning millions of years. I can think of no better reason to be extremely cautious about the amounts of greenhouse gasses we are currently emitting to the atmosphere".

(Photo: U. Copenhagen)



In a new approach to an effective "electronic tongue" that mimics human taste, scientists in Illinois are reporting development of a small, inexpensive, lab-on-a-chip sensor that quickly and accurately identifies sweetness — one of the five primary tastes. It can identify with 100 percent accuracy the full sweep of natural and artificial sweet substances, including 14 common sweeteners, using easy-to-read color markers.

This sensory "sweet-tooth" shows special promise as a simple quality-control test that food processors can use to ensure that soda pop, beer, and other beverages taste great, with a consistent, predictable flavor. The study was described at the American Chemical Society's 238th National Meeting.

The new sensor, which is about the size of a business card, can also identify sweeteners used in solid foods such as cakes, cookies, and chewing gum. In the future, doctors and scientists could use modified versions of the sensor for a wide variety of other chemical-sensing applications ranging from monitoring blood glucose levels in people with diabetes to identifying toxic substances in the environment, the researchers say.

"We take things that smell or taste and convert their chemical properties into a visual image," says study leader Kenneth Suslick, Ph.D., of the University of Illinois at Urbana-Champaign. "This is the first practical "electronic tongue" sensor that you can simply dip into a sample and identify the source of sweetness based on its color."

Researchers have tried for years to develop "electronic tongues" or "electronic noses" that rival or even surpass the sensitivity of the human tongue and nose. But these devices generally have difficulty distinguishing one chemical flavor from another, particularly in a complex mixture. Those drawbacks have limited the practical applications of prior technology.

Suslick's team has spent a decade developing "colorimetric sensor arrays" that may fit the bill. The "lab-on-a-chip" consists of a tough, glass-like container with 16 to 36 tiny printed dye spots, each the diameter of a pencil lead. The chemicals in each spot react with sweet substances in a way that produces a color change. The colors vary with the type of sweetener present, and their intensity varies with the amount of sweetener.
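Conceptually, the readout reduces to pattern matching on a vector of per-spot color changes. The article does not describe the team's actual algorithm, and the spot signatures below are invented; the sketch only illustrates the idea of classifying a sweetener by its nearest reference pattern:

```python
import math

# Hypothetical trained color-change signatures: one number per dye spot.
REFERENCE_PATTERNS = {
    "sucrose":   [12.0, 3.0, 40.0, 8.0],
    "aspartame": [2.0, 35.0, 5.0, 20.0],
    "saccharin": [30.0, 30.0, 1.0, 2.0],
}

def classify(reading):
    """Return the sweetener whose reference pattern is closest (Euclidean
    distance) to the measured color-change vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(REFERENCE_PATTERNS, key=lambda name: dist(reading, REFERENCE_PATTERNS[name]))

print(classify([11.0, 4.0, 38.0, 9.0]))  # prints "sucrose"
```

In the real device the vector would have 16 to 36 entries (one per printed dye spot), and the signature intensities would also encode concentration, as the article notes.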

To the scientists' delight, the sensor identified 14 different natural and artificial sweeteners, including sucrose (table sugar), xylitol (used in sugarless chewing gum), sorbitol, aspartame, and saccharin with 100 percent accuracy in 80 different trials.

Many food processors use a test called high-pressure liquid chromatography to measure sweeteners for quality control. But it requires an instrument the size of a desk that costs tens of thousands of dollars and needs a highly trained technician to operate. The process is also relatively slow, taking up to 30 minutes. The new sensor, in contrast, is small, inexpensive, disposable, and produces results in about 2 minutes.

Those minutes can be critical. Suslick noted that the food and beverage industry takes great care to ensure consistent quality of the many products that use sweeteners. At present, when a product's taste falls below specifications, samples must be taken to the lab for analysis. Meanwhile, the assembly lines continue to whirl, with thousands of packages moving along each minute.

"With this device, manufacturers can fix the problem immediately — on location and in real-time," Suslick says.

Christopher Musto, a doctoral student in Suslick's lab, says it will take more work to develop the technology into a complete electronic tongue. "To be considered a true electronic tongue, the device must detect not just sweet, but sour, salty, bitter, and umami — the five main human tastes," he says. Umami means meaty or savory.

(Photo: Kenneth Suslick, Ph.D., University of Illinois at Urbana-Champaign)
American Chemical Society



After searching for more than 50 years, scientists finally have discovered a number of new mosquito repellents that beat DEET, the gold standard for warding off those pesky, sometimes disease-carrying insects. The compounds seem like a dream come true: they keep mosquitoes at bay three times longer than DEET, the active ingredient in many of today's bug repellents, without DEET's unpleasant odor or sticky-skin sensation.

But there's a fly in the ointment: The odds may be stacked against any of the new repellents finding a place on store shelves this year or next — or ever.

Ulrich Bernier, Ph.D., lead researcher for the repellent study, said the costly, time-consuming pre-market testing and approval process is a hurdle that will delay availability of the repellents, which were discovered last year. The results of his team's work were presented at the 238th National Meeting of the American Chemical Society (ACS) by Maia Tsikoli, Ph.D., a post-doctoral researcher working with Bernier.

"Commercial availability of topical repellents can take years and a significant investment to achieve that end goal," Bernier said. "The cost will be several hundred thousand dollars. Once you determine that the repellent works through some screening process, we then have to go through a toxicological hazard evaluation involving numerous toxicological tests."

Provided the repellents continue to work well when tested in the laboratory on human skin, and if they pass the battery of toxicological tests, they would still face a series of tests to prove their effectiveness in making mosquitoes bug off, Bernier said.

"Clearly, the odds are stacked against new repellent products making it to market," he noted.

Bernier and his team discovered the repellents with what they say is the first successful application of a computer model to the molecular structures of more than 30,000 chemical compounds tested as repellents over the last 60 years. Starting from 11 known compounds, they synthesized 23 new ones. Of those, 10 gave about 40 days of protection, compared with 17.5 days for DEET, when a soaked cloth was worn by a human volunteer. When applied directly to the skin, however, DEET lasts about five hours.

Bernier routinely participates in repellency studies, which involve about 500 mosquitoes trying to land on his arm and bite through a repellent-soaked cloth. "If the mosquitoes don't even land, we know the repellent is surely working," he explained. "If they walk around on the cloth-covered arm, they are on the verge of being repelled. If they bite… on to the next repellent."

Overall, in addition to lasting longer than current products, including DEET, the new repellents don't have the stickiness or unpleasant smell common with today's insect sprays and liquids, said Bernier. He said that extended studies are now evaluating the effectiveness of the repellents against flies and ticks.

"This was quite an ambitious project," Bernier said. "The USDA historical archives and repellents database we used consisted of more than 30,000 chemical structures tested over the past six decades."

To search for the best repellents, the team devised software that recognized structural features of a chemical that would make it effective in keeping the bugs away. They trained it by feeding it the molecular structures of 150 known repellents. Based on this information, the program learned to identify the chemical traits of a good repellent without the chemists even having to know what those traits were. For example, the team checked out 2,000 variants of a compound found in black pepper that repels insects.
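Screens of this kind typically encode each molecule as a set of substructure features and compare candidates with a similarity measure such as the Tanimoto coefficient. The team's actual descriptors are not given in the article, so the feature sets below are hypothetical; the sketch only illustrates the comparison step:

```python
def tanimoto(a: set, b: set) -> float:
    """Tanimoto (Jaccard) similarity between two substructure-feature sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical substructure features of a known repellent vs. a candidate.
known_repellent = {"amide", "tertiary_N", "aromatic_ring", "ester"}
candidate       = {"amide", "tertiary_N", "aromatic_ring", "ether"}

score = tanimoto(known_repellent, candidate)
print(f"similarity: {score:.2f}")  # 3 shared / 5 total features = 0.60
```

A learned model like the one described would go beyond raw similarity, weighting whichever features best separate the 150 known repellents from inactive compounds, but the distance computation above is the usual starting point.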

(Photo: Greg Allen, U.S. Department of Agriculture, Agricultural Research Service)

American Chemical Society



Scientists have discovered the first gene involved in regulating the optimal length of human sleep, offering a window into a key aspect of slumber, an enigmatic phenomenon that is critical to human physical and mental health.

The team, reporting in the Aug. 14, 2009 issue of Science, identified a mutated gene that allows two members of an extended family to thrive on six hours of sleep a day rather than the eight to eight-and-a-half hours that studies have shown humans need over time to maintain optimal health. Working from this discovery, the scientists genetically engineered mice and fruit flies to express the mutated gene and study its impact.

While most Americans obtain less than eight hours of sleep a night (the average on non-work days is 7.4 hours), and some may feel they succeed with less when engaged in exhilarating work, domestic life or recreation, scientific evidence indicates that, over time, the body suffers from this regimen, the researchers say.

"Short term and chronic disruptions in the length of optimal sleep can have serious consequences on cognition, mood and physical health, including cancer and endocrine function," says the senior author of the study, Ying-Hui Fu, PhD, UCSF professor of neurology. However, teasing out this impact can be challenging, she says, given access to such stimuli as coffee and chocolate.

The finding, she says, offers an opportunity to unravel the regulatory mechanism of sleep. While the mutation may be rare, it could offer a probe more generally into the regulatory mechanisms of sleep quality and quantity. Understanding these mechanisms could lead to interventions to alleviate pathologies associated with sleep disturbance.

Sleep remains a relatively inscrutable biological phenomenon. Scientists know that it is regulated in large part by two processes: 1) circadian rhythms -- genetic, biochemical and physiological mechanisms that wax and wane during a 24-hour period to regulate the timing of sleep -- and 2) homeostasis -- as-yet-unknown mechanisms that ensure that the body acquires over time the necessary amount of sleep, nudging it toward sleep when it has been deprived and prompting it out of sleep when it has received enough. This homeostatic regulation of sleep intensity is measured in non-rapid eye movement (non-REM) sleep and REM sleep. Interactions between the circadian rhythms and homeostatic mechanisms influence the timing, duration and quality of sleep and wakefulness.
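This two-process picture is commonly formalized as Borbély's two-process model: homeostatic sleep pressure (Process S) builds exponentially toward saturation during wakefulness and decays during sleep, while a circadian oscillation (Process C) shifts the thresholds at which sleep begins and ends. The simulation below is an illustrative sketch with made-up time constants, not a fitted model:

```python
import math

def simulate(hours=48, dt=0.1, tau_rise=18.0, tau_decay=4.0):
    """Toy two-process sleep model: homeostatic pressure S builds during
    wakefulness and decays during sleep; a circadian oscillation (Process C)
    shifts the thresholds that trigger sleep onset and waking."""
    S, asleep, t, log = 0.5, False, 0.0, []
    while t < hours:
        circadian = 0.12 * math.sin(2 * math.pi * t / 24.0)  # Process C
        upper, lower = 0.8 + circadian, 0.2 + circadian      # thresholds
        if asleep:
            S *= math.exp(-dt / tau_decay)    # pressure decays while asleep
            if S < lower:
                asleep = False                # wake when pressure is spent
        else:
            S += (1.0 - S) * dt / tau_rise    # pressure builds while awake
            if S > upper:
                asleep = True                 # sleep onset at upper threshold
        log.append((t, S, asleep))
        t += dt
    return log

log = simulate()
sleep_fraction = sum(1 for _, _, asleep in log if asleep) / len(log)
print(f"fraction of time asleep: {sleep_fraction:.2f}")
```

A mutation that weakens the homeostatic build-up would correspond here to a smaller effective drive toward the sleep threshold, shrinking the fraction of time spent asleep, which is the kind of shift the mutant mice show.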

But "the details in the process are really completely unknown," says Fu.

In 2001, the team discovered a mutated gene that caused some members of several families to be "morning larks," waking around 3:30 a.m. and going to bed around 7:30 p.m. The condition, which the researchers named "familial advanced sleep phase syndrome," is attributed primarily to a variant, or mutated, form of a gene involved in regulating circadian rhythms. Total daily sleep time in people with this condition is normal.

In the current study, the team identified a small extended family in which a mother and her adult daughter had life-long shorter daily sleep requirements than most individuals. Fu's lab then studied blood samples from these women and their extended family. They identified a mutation in a gene known as hDEC2, which is a transcription factor that represses expression of certain other genes and is implicated in the regulation of circadian rhythms.

Next, the team genetically engineered mice and fruit flies to express the mutated human gene, and Ying He, PhD, a postdoctoral fellow in the Fu lab, studied its impact on their behavior and sleep patterns. The mice slept less, as seen in the extent of their scampering about in the dark (a mouse preference) over the course of 24 hours and in electroencephalography (EEG) and electromyography (EMG) measurements indicating reduced non-REM and REM sleep. Lacking a Lilliputian-sized EEG to monitor the fruit flies, He studied the minuscule creatures' activity and sleep patterns by tracking the frequency of their movements with infrared light.

Next, the team compared the response of the genetically engineered mice and normal mice to the consequence of six hours of sleep deprivation. The engineered mice needed to compensate for their lost sleep to a much lesser extent – as seen in nonREM and REM measures – than their normal counterparts.

"These changes in sleep homeostasis in the mutant mice could provide an explanation for why human subjects with the mutation are able to live unaffected by shorter amounts of sleep throughout their lives," says Fu.

The next step, she says, is determining DEC2's precise role. "We know the gene encodes a protein that is a transcriptional repressor, and we know the mutation makes the repressor's activity weaker. But we don't know if the weaker repressor is directly related to the shorter amount of sleep, because proteins can have many functions. It could be that the protein functions as part of a larger transcriptional machinery, not necessarily as a repressor."

DEC2 could be involved in modulating "sleep quantity" alone, or it could be mediating both "sleep quantity" and "wakefulness-behavioral drive," according to Fu. The latter drive, she says, is critical for the procurement of food, shelter, and mates and could be more potent in individuals with this mutation.

"The mouse model also provides an opportunity to investigate whether there are other behaviors or physiological conditions associated with a short sleep syndrome," says Fu. She suspects there will be.

(Photo: UCSF)



Walking outdoors in the fall, one delights in the splendidly colorful leaves adorning the trees. In Europe these autumn leaves are mostly yellow, while the United States and East Asia boast lustrous red foliage. But why are there such differences in autumnal hues around the world?

A new theory proposed by Prof. Simcha Lev-Yadun of the Department of Science Education - Biology at the University of Haifa-Oranim and Prof. Jarmo Holopainen of the University of Kuopio in Finland, published in the journal New Phytologist, takes a step 35 million years back to solve the color mystery.

The green of a tree's leaves comes from the large proportion of the chlorophyll pigment in the leaves. The change in color to red or yellow as autumn approaches is not the result of the leaves' dying, but of a series of processes – which differ between red and yellow autumn leaves. When the green chlorophyll in leaves diminishes, the yellow pigments that already exist become dominant and give their color to the leaves. Red autumn leaves result from a different process: as the chlorophyll diminishes, a red pigment, anthocyanin, which was not previously present, is produced in the leaf. These facts were only recently discovered and led to a surge of research studies attempting to explain why trees expend resources on creating red pigments just as they are about to shed their leaves.

Explanations that have been offered vary, and there is no agreement as of yet. One school of thought suggests that the red pigment serves a physiological function: by protecting against the potential damage of light and cold, it makes the re-translocation of amino acids to the woody parts of the tree more efficient. Other explanations suggest that the red pigment is produced as part of the tree's strategy for protecting itself against insects that thrive on the flow of amino acids. But whatever the answer, these explanations do not help us understand why the process of creating anthocyanin, the red pigment, does not occur in Europe.

An evolutionary ecology approach holds that strong autumn colors result from a long evolutionary war between trees and the insects that use them as hosts. In autumn, when the insects suck amino acids from the leaves and later lay their eggs, the tree colors its leaves red (aphids are attracted to yellow) to advertise its defensive quality, lowering the insects' tendency to occupy the leaves for nutrition and the bark for breeding. Here too the protective logic of red pigmentation may be sound, but yellow leaves cannot be reconciled with this approach. To settle this point, the new theory can be applied.

According to the theory provided by Prof. Lev-Yadun and Prof. Holopainen, until 35 million years ago, large areas of the globe were covered with evergreen jungles or forests composed of tropical trees. During this phase, a series of ice ages and dry spells transpired and many tree species evolved to become deciduous. Many of these trees also began an evolutionary process of producing red deciduous leaves in order to ward off insects. In North America, as in East Asia, north-to-south mountain chains enabled plant and animal 'migration' to the south or north with the advance and retreat of the ice according to the climatic fluctuations. And, of course, along with them migrated their insect 'enemies' too. Thus the war for survival continued there uninterrupted. In Europe, on the other hand, the mountains – the Alps and their lateral branches – reach from east to west, and therefore no protected areas were created. Many tree species that did not survive the severe cold died, and with them the insects that depended on them for survival. At the end of the repeated ice ages, most tree species that had survived in Europe had no need to cope with many of the insects that had become extinct, and therefore no longer had to expend efforts on producing red warning leaves.
According to the scientists, evidence supporting this theory can be found in the dwarf shrubs that grow in Scandinavia, which still color their leaves red in autumn. Unlike trees, dwarf shrubs have managed to survive the ice ages under a layer of snow that covered them and protected them from the extreme condition above. Under the blanket of snow, the insects that fed off the shrubs were also protected – so the battle with insects continued in these plants, making it necessary for them to color their leaves red.

University of Haifa-Oranim

Thursday, August 27, 2009



While the researchers can't promise delivery to a parallel universe or a school for wizards, books like Philip Pullman's His Dark Materials and J.K. Rowling's Harry Potter are steps closer to reality now that researchers in China have created the first tunable electromagnetic gateway.

The work, 'A simple route to a tunable electromagnetic gateway', is a further advance in the study of metamaterials and was published on Thursday, 14 August, in New Journal of Physics (co-owned by the Institute of Physics and the German Physical Society).

In the research paper, the researchers from the Hong Kong University of Science and Technology and Fudan University in Shanghai describe the concept of "a gateway that can block electromagnetic waves but that allows the passage of other entities," likening it to a "'hidden portal' as mentioned in fictions."

The gateway, which is now much closer to reality, uses transformation optics and an amplified scattering effect from an arrangement of a ferrite material, single-crystal yttrium iron garnet, that forces light and other forms of electromagnetic radiation along complicated paths to create a hidden portal.

Previous attempts at an electromagnetic gateway were hindered by narrow bandwidth, capturing only a small range of visible light or other forms of electromagnetic radiation. This new configuration of metamaterials, however, can be manipulated to have the optimum permittivity and permeability, insulating against an incident electromagnetic field with an appropriate magnetic response.

Because of the arrangement's response to magnetic fields, it also has the added advantage of being tunable and can therefore be switched on and off remotely.

Dr Huanyang Chen from the Physics Department at Hong Kong University of Science and Technology has commented, "In the frequency range in which the metamaterial possesses a negative refraction index, people standing outside the gateway would see something like a mirror. Whether it can block all visible light depends on whether one can make a metamaterial that has a negative refractive index from 300 to 800 nanometres."
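Chen's remark about a negative refraction index follows from the material parameters: for a lossless medium the index is n = ±√(εr·μr), and the negative branch applies when the relative permittivity and permeability are both negative. A minimal sketch with illustrative values (not taken from the paper):

```python
import math

def refractive_index(eps_r: float, mu_r: float) -> float:
    """Index of a lossless medium whose eps_r and mu_r share a sign.
    Both negative (a double-negative metamaterial) gives a negative index."""
    if eps_r * mu_r < 0:
        raise ValueError("single-negative media support no propagating waves")
    n = math.sqrt(eps_r * mu_r)
    return -n if eps_r < 0 else n

print(refractive_index(2.25, 1.0))   # ordinary glass-like medium: n = 1.5
print(refractive_index(-2.25, -1.0)) # double-negative metamaterial: n = -1.5
```

Making such a double-negative response hold across the whole 300-800 nanometre range is exactly the open problem Chen identifies for blocking all visible light.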

Metamaterials, the area of physics research behind the possible creation of a real Harry Potter-style invisibility cloak, are exotic composite materials structured at scales finer than the wavelengths they manipulate, giving them electromagnetic properties beyond those that appear in nature.

(Photo: IoP)

Institute of Physics


A detailed examination of the wrist bones of several primate species challenges the notion that humans evolved their two-legged upright walking style from a knuckle-walking ancestor.

The same lines of evidence also suggest that knuckle-walking evolved at least two different times, making gorillas distinct from chimpanzees and bonobos.

"We have the most robust data I've ever seen on this topic," said Daniel Schmitt, a Duke University associate professor of evolutionary anthropology. "This model should cause everyone to re-evaluate what they've said before."

The research, led by post-doctoral research associate Tracy Kivell, was supported by the Natural Sciences and Engineering Research Council in her native Canada, General Motors' Women in Science and Mathematics, and the University of Toronto, where Kivell did her Ph.D. work.

The debate over the origins of human bipedalism began during Charles Darwin's lifetime and continues vigorously to this day, commonly dividing into two competing models, the researchers explained.

One model "envisions the pre-human ancestor as a terrestrial knuckle-walker, a behavior frequently used by our closest living relatives, the African apes," they wrote in the PNAS report. The other model traces our two-legged walking to earlier tree-climbing, a mode of locomotion that is used by all living apes.

Supporters of the knuckle-walking origin think we and African apes evolved from a common knuckle walking ancestor. That connection, they contend, is still evident in wrist and hand bone features shared by African apes and by fossil and living humans.

But Kivell found otherwise when she began comparing juvenile and adult wrist bones of more than 100 chimps and bonobos, our closest living primate kin, with those of gorillas.

Significantly, two key features associated with knuckle walking were present in only 6 percent of the gorilla specimens she studied. But she found them in 96 percent of adult chimpanzees and 76 percent of bonobos. In all, she looked at specimens from 91 gorillas, 104 chimps and 43 bonobos.

Kivell and Schmitt suggested that one explanation for the absence of these features in gorillas is that they knuckle-walk in a fundamentally different way from chimps and bonobos. Gorillas stride with their arms and wrists extended straight down and locked in what Kivell called "columnar" stances that resemble how elephants walk. By contrast, chimps and bonobos walk more flexibly, "with their wrists in a bent position as opposed to being stacked-up," she said. "And with their wrists in bent positions there will be more stresses at those joints."

As a result, chimp and bonobo wrists have special features that gorillas lack -- little ridges and concavities that serve as "bony stops" to keep their wrists from over-bending. Gorillas don't need those, she added.

"When we first got together to work on this study that (difference) really jumped out in living color," Schmitt said.

"Then we sat down together and asked: 'What are the differences between them?' Schmitt said. "The answer is that chimps and bonobos spend a lot of time in the trees. And gorillas do not."

Chimpanzees and bonobos have a more extended-wrist way of knuckle-walking which gives them added stability on branches, the researchers concluded. In contrast, gorillas' "columnar" style of knuckle-walking is consistent with ground transport.

Indeed, "from what we know about knuckle-walking among wild populations, gorillas and adult chimpanzees will both knuckle-walk about 85 percent of the time that they're moving," Kivell said. "But chimpanzees and bonobos are more arboreal than gorillas. So they're doing a lot more of it in the trees."

Kivell and Schmitt think this suggests independent evolution of knuckle-walking behavior in the two African ape lineages.

Some scientists point to features in the human anatomy as our own vestiges of a knuckle-walking ancestry. One notable example is the fusion of two wrist bones that could provide extra stability, a feature we share with gorillas, chimps and bonobos.

But some lemurs have that feature too, and they do a variety of different movements in the trees but do not knuckle-walk, Kivell said.

Altogether, the evidence leans against the idea that our own bipedalism evolved from a knuckle-walking ancestor, the pair wrote. "Instead, our data support the opposite notion, that features of the hand and wrist found in the human fossil record that have traditionally been treated as indicators of knuckle-walking behavior in general are in fact evidence of arboreality."

In other words, a long-ago ancestor species that spent its time in the trees moved to the ground and began walking upright.

There are no fossils from the time of this transition, which likely occurred about seven million years ago, Kivell and Schmitt said. But none of the later fossils considered to be on the direct human line were knuckle-walkers.

Duke University



Mary had a little lamb, but only once a year. However, Cornell Sheep Program researchers have discovered an unusual form of a gene that prompts ewes to breed out of season as well as conceive at younger ages and more frequently.

They conducted a simple genetic test to identify, in their test flock, the presence of the unusual form of the gene (the so-called M allele, which other researchers had suspected might be correlated with out-of-season fertility), and then validated the gene's relationship with aseasonal breeding by observing that trait in the flock.

The finding, published in the August issue of the Journal of Animal Science (Vol. 87, No. 8), may be a boon for the sheep industry worldwide, especially when combined with the Sheep Program's STAR system -- a method to manage ewes to lamb five times in three years rather than once a year.

"The primary biological limit for sheep production worldwide is the seasonality of breeding, but the market for high-quality lamb is a 52-week thing," said Doug Hogue, professor emeritus of animal science in the College of Agriculture and Life Sciences. His Cornell colleague Mike Thonney and former Cornell postdoctoral researcher Raluca Mateescu, now at Oklahoma State University, co-authored the paper with Andrea Lunsford, a graduate student at OSU.

Although the presence of the M allele has been definitively correlated with the ability to breed out of season, the researchers caution that it may only be a marker for the gene actually responsible for the trait.

"Breeding out of season is a complex trait," Mateescu said, "so there are a lot of genes controlling it." Mateescu observed the phenotype -- the physical expression of the gene -- in the researchers' flock during a postdoctoral fellowship at Cornell.

"In this case, we're talking about a receptor gene for melatonin," Thonney explained. Melatonin is a naturally produced hormone commonly found in many animals. The change in the DNA sequence of the M allele does not change the amino acid sequence of the protein. This means that it may be an accurate indicator for the phenotype of breeding out of season, though it's uncertain whether the gene actually impacts how the sheep's body reacts to melatonin. And there may be a risk of losing the association over generations, the researchers said, as recombination could occur between the marker and the functional gene.

Thus, the researchers stress that it will be very important to validate the gene's ability to indicate for aseasonal breeding each time the allele is bred into a new sheep population.

"I think it's very exciting … we only have one gene, but it's definitely a tool that farmers can use," said Mateescu, who is now focusing on placing markers across the sheep's entire genome to more accurately determine which gene or genes directly affect the trait of aseasonal reproduction.

The allele is particularly useful for management under the STAR system, developed by Hogue and Cornell sheep farm manager Brian Magee in the early 1980s, which uses nutrition and conventional breeding techniques to reduce the time between heats. "If a ewe doesn't get pregnant when she is supposed to, instead of a year, it's only 73 days [using the STAR system] until she has another opportunity," Thonney said.
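The 73-day figure follows from the STAR calendar itself, which divides the 365-day year into five 73-day segments (the points of a star), so that five lambings fit into three years. A quick check of the arithmetic quoted above:

```python
DAYS_PER_YEAR = 365
SEGMENTS = 5                                # the five points of the STAR

segment_days = DAYS_PER_YEAR // SEGMENTS    # next re-breeding opportunity
print(f"next opportunity after a missed breeding: {segment_days} days")

# Five lambings in three years -> average interval between lambings:
interval = 3 * DAYS_PER_YEAR / 5            # three 73-day segments per cycle
print(f"average lambing interval: {interval:.0f} days (vs. 365 for annual lambing)")
```

So a ewe that misses a breeding waits one 73-day segment, not a full year, and the average lambing interval drops from 365 days to 219.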

While the STAR system requires better nutrition and more farm labor to manage the lambing, each lambing event involves fewer ewes than traditional yearly lambing.

The researchers hope that the discovery of the M allele may help the STAR system adapt to consistently high levels of production without any additional risk to flock health.

(Photo: Cornell U.)

Cornell University



People diagnosed with type 2 diabetes often resist taking insulin because they fear gaining weight, developing low blood sugar and seeing their quality of life decline.

A study recently completed at UT Southwestern Medical Center suggests that those fears are largely unfounded and that patients and physicians should consider insulin a front-line defense, rather than a treatment of last resort, for non-insulin-dependent diabetes.

“We found that those patients who received insulin initially did just as well, if not better, than those who didn’t receive insulin,” said Dr. Ildiko Lingvay, assistant professor of internal medicine at UT Southwestern and lead author of the study appearing online and in a future issue of Diabetes Care. “This reinforces the idea that insulin treatment is a viable and safe option for patients, even in the very initial stages of their diagnoses.

“There is a myth out in the community, especially among certain ethnicities, that insulin is the last resort, and that somebody started on insulin is going to die,” Dr. Lingvay added. “We as physicians are responsible for teaching the patient that that’s not the case.”

More than 20 million Americans have type 2 diabetes. Obesity, age and lack of exercise all increase the risk for the disease, which is characterized by a progressive loss of insulin-producing beta cells. Diabetes is the single greatest independent risk factor for heart disease, as well as a contributor to a number of other medical problems, including blindness and kidney disease.

The standard initial treatment for type 2 diabetes is a single drug, often metformin, followed by the addition of more oral hypoglycemic agents as needed.

For this study, researchers evaluated the effectiveness of offering insulin-based therapy as an initial treatment option to newly diagnosed type 2 diabetes patients. They compared rates of compliance, satisfaction, effectiveness, safety and quality of life among the patients, who were randomized to receive either the standard triple oral therapy or insulin plus metformin, an oral drug that helps regulate blood sugar levels.

The patients, ranging in age from 21 to 70 years old, had been diagnosed with type 2 diabetes within the past two months. Researchers recruited study participants from Parkland Memorial Hospital or by self-referral to the Clinical Diabetes Research Clinic at UT Southwestern between November 2003 and June 2005.

After enrollment, every participant followed an insulin and metformin regimen for three months. The patients were then randomized to continue taking insulin and metformin or begin the triple oral therapy regimen. All participants were checked monthly for the first four months, at six months after randomization, and every three months thereafter for three years. Of the 58 patients randomized, 24 of the insulin-treated group and 21 of the triple oral therapy group completed the study.

The researchers found that the patients taking insulin plus metformin had fewer low-blood-sugar, or hypoglycemic, events, gained less weight and reported high satisfaction with the insulin.

Dr. Lingvay said she hopes physicians use these findings as the rationale to offer insulin-metformin as the first, rather than last, line of defense.

“Modern medicine uses insulin as a very effective and safe treatment tool,” she said. “With the new devices that we’re using, giving yourself an insulin shot is not much harder than taking pills.”

The data represent the first three years of a six-year study still under way at UT Southwestern. The next step, Dr. Lingvay said, is to begin analyzing how the insulin plus metformin and oral triple therapy regimens affect insulin production in beta cells.

(Photo: UT Southwestern)

UT Southwestern Medical Center


In 1953, Stanley Miller filled two flasks with chemicals assumed to be present on the primitive Earth, connected the flasks with rubber tubes and introduced electrical sparks as a stand-in for lightning. The now famous experiment showed that amino acids, the building blocks of proteins, could easily be generated from this primordial stew. But despite that seminal experiment, neither he nor others were able to take the next step: showing how life’s code could come from such humble beginnings.

By working with the simplest amino acids and elementary RNAs, physicists led by Rockefeller University’s Albert J. Libchaber, head of the Laboratory of Experimental Condensed Matter Physics, have now generated the first theoretical model that shows how a coded genetic system can emerge from an ancestral broth of simple molecules. “All these molecules have different properties and these properties define their interactions,” says first author Jean Lehmann, whose work appears in the June issue of PLoS One. “What are the constraints that allow these molecules to self-organize into a code? We can play with that.”

The genetic code is a triplet code: every three-letter sequence on messenger RNA (mRNA) corresponds to one of the 20 amino acids that make up proteins. Molecular adaptors called transfer RNAs (tRNAs) then convert this information into proteins that carry out specific tasks in the organism. Think of each triplet sequence on mRNA, known as a codon, as an outlet that can only accept a tRNA with the complementary anticodon. Translation works because each codon-anticodon match corresponds to an amino acid. As each tRNA is plugged in, a chain of amino acids forms in the same order as the codons until translation is complete.
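As a toy illustration of this outlet-and-plug picture, the sketch below walks an mRNA string three bases at a time, matches each codon to the tRNA carrying the complementary anticodon, and appends that tRNA's amino acid to the growing chain. The three codon assignments are real entries from the standard genetic code, but the tiny tRNA pool and the message are invented for the example.

```python
# Toy sketch of template-directed translation: each mRNA codon pairs
# with the tRNA carrying the complementary anticodon, and the amino
# acid loaded on that tRNA is appended to the growing chain.

# Watson-Crick complements (RNA)
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def anticodon(codon):
    """Anticodon read against the codon, returned 5'->3'."""
    return "".join(COMPLEMENT[b] for b in reversed(codon))

# A tiny, invented pool of tRNAs: anticodon -> amino acid it carries
TRNA_POOL = {
    anticodon("GGU"): "Gly",   # glycine
    anticodon("GCU"): "Ala",   # alanine
    anticodon("UUU"): "Phe",   # phenylalanine
}

def translate(mrna):
    """Walk the mRNA three bases at a time, plugging in matching tRNAs."""
    chain = []
    for i in range(0, len(mrna) - len(mrna) % 3, 3):
        codon = mrna[i:i + 3]
        chain.append(TRNA_POOL[anticodon(codon)])
    return chain

print(translate("GGUGCUUUU"))  # ['Gly', 'Ala', 'Phe']
```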

However, primitive tRNAs were not as finicky as tRNAs are today and could load any amino acid known to exist during the time of prebiotic Earth. Without the ability of tRNA to discriminate between various amino acids, such a random system might not be able to self-assemble into a highly organized code capable of supporting life.

To find out if it could, Libchaber and Lehmann, together with Michel Cibils at Ecole Polytechnique Federale de Lausanne in Switzerland, worked with a simple theoretical system. They took two of the simplest amino acids thought to exist billions of years ago, two primitive tRNAs and an RNA template with two complementary codons, and then developed an algorithm to incrementally change the concentration of each molecule. Their goal was to see which conditions, if any, could coax the system to specifically translate codons in a non-random fashion. They found that the properties of the molecules set the concentrations at which the molecules needed to exist for a coded regime to emerge.

At these concentrations, the scientists found that a vetting process began to unfold whereby the tRNA and the amino acid began to seek each other out. All in all, an elementary translation process depended on two time scales: the time during which a tRNA remains bound to its codon (hybridization) and the time it takes for the amino acid on that tRNA to form a new chemical bond with the amino acid next to it (polymerization).

“It takes a lifetime for the tRNA to dissociate from its codon,” says Libchaber, who is also Detlev W. Bronk Professor at Rockefeller. “If it takes the amino acid loaded on the RNA longer than a lifetime to polymerize to an amino acid nearby, the selection of tRNA and amino acid doesn’t occur. But when the two lifetimes are comparable, even when there is nonspecific loading of an amino acid, a selection process begins to take hold because some amino acids would be more adaptive during that time span -- and start what would be the beginning of a code.”
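A back-of-the-envelope way to see the two-lifetime argument: if dissociation and polymerization are both treated as memoryless (exponential) processes, the chance that the peptide bond forms before the tRNA falls off depends only on the ratio of the two timescales. The numbers below are illustrative, not rates from the paper.

```python
# Race between two exponential processes: a bound tRNA dissociates
# with mean lifetime t_hyb, while the amino acid it carries forms a
# peptide bond with mean waiting time t_poly.  The chance the bond
# forms before the tRNA falls off is P = t_hyb / (t_hyb + t_poly).

def incorporation_probability(t_hyb, t_poly):
    return t_hyb / (t_hyb + t_poly)

# Polymerization far slower than hybridization: almost nothing is added.
print(incorporation_probability(t_hyb=1.0, t_poly=100.0))  # ~0.0099

# Comparable lifetimes: roughly every other encounter leaves a residue,
# enough for differences between amino acids to bias the outcome.
print(incorporation_probability(t_hyb=1.0, t_poly=1.0))    # 0.5
```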

Although Libchaber and Lehmann point out that the analysis certainly does not provide a full picture of the problem, the work nonetheless brings us one step closer to understanding how life first began. “The dream of physicists is to create elementary life,” Libchaber says. “Then we would know that we understand something.”

Rockefeller University



Like clockwork, brain regions in many songbird species expand and shrink seasonally in response to hormones. Now, for the first time, University of Washington neurobiologists have interrupted this natural "annual remodeling" of the brain and have shown that there is a direct link between the death of old neurons and their replacement by newly born ones in a living vertebrate.

The scientists introduced a chemical into one side of sparrow brains, in an area that helps control singing behavior, to halt apoptosis, a cell suicide program. Twenty days after introduction of the chemical, the researchers found 48 percent fewer new neurons on that side than in the side of the brain that did not receive the cell suicide inhibitor.

"This is the first demonstration that if you decrease apoptosis you also decrease the number of new brain cells in a live animal. The next step is to understand this process at the molecular level," said Eliot Brenowitz, a UW professor of psychology and biology and co-author of a new study. His co-author is Christopher Thompson, who earned his doctorate at the UW and is now at the Free University of Berlin.

"The seasonal hormonal drop in birds may mimic what is an age-related drop in human hormone levels. Here we have a bird model that is natural and maybe similar genes have a similar function in humans with degenerative diseases such as Alzheimer's and Parkinson's, as well as strokes, which are associated with neuron death."

The research involved Gambel's white-crowned sparrows, a songbird subspecies that winters in California and migrates to Alaska in the spring and summer to breed and raise its young. The sparrow's brain regions that control learned song behavior in males, including one called the HVC, expand and shrink seasonally. Thompson and Brenowitz previously found that neurons in the HVC begin dying within four days after the steroid hormone testosterone is withdrawn from the birds' brains; thousands of neurons died over this period.

In the new work, the UW researchers received federal and state permission to capture 10 of the sparrows in Eastern Washington at the end of the breeding season. After housing the birds for three months, they castrated the sparrows and then artificially brought them to breeding condition by implanting testosterone and housing them under the same long-day lighting conditions that they would naturally be exposed to in Alaska. This induced full growth of the song control system in the birds' brains.

Next the researchers transitioned the birds to a non-breeding condition by reducing the amount of light they were exposed to and removing the implanted testosterone. They infused the HVC on one side of the brain with chemicals, called caspase inhibitors, that block apoptosis, and two chemical markers that highlight mature and new neurons. Twenty days later the birds were euthanized and sections of their brains were examined under a microscope.

These procedures were done with the approval of the UW's Institutional Animal Care and Use Committee and the National Institute of Mental Health. The latter funded the research.

The HVC straddles both hemispheres of the brain, but the two sides are not directly connected. When Thompson counted the newly born neurons that had migrated to the HVC, he found only several hundred of them among the hundreds of thousands of mature neurons he examined. And the side of the HVC where brain cell death was inhibited held nearly half as many new neurons as the other, untreated side.

"This shows there is some direct link between the death of old neurons and the addition of new cells that were born elsewhere in the brain and have migrated," said Brenowitz. "What allows new cells to be incorporated into the brain is the big question. This is particularly true on a molecular level where we want to know what is the connection between cell death and neurogenesis and which genes are responsible."

(Photo: University of Washington)

University of Washington

Wednesday, August 26, 2009


The excavation team at Zhoukoudian, the legendary site that boasts the discovery of Sinanthropus (nicknamed "Peking Man"), has unearthed remains of ashes, burned bones and carbon dust from inside the "Peking Man Cave", providing further proof that Sinanthropus used fire.

Chinese scientists put forward the theory in the 1930s that "Peking Man" had learned to use fire, while overseas researchers doubted it, suggesting that the ashes found at the Peking Man site were remains of natural rather than man-made fire.

Remains of ashes, burned bones and carbon dust found in the excavation layer inside the "Peking Man Cave" provide more support for the Chinese scientists' theory, said Gao Xing, the team leader of the project and Vice Director of the Institute of Vertebrate Paleontology and Paleoanthropology (IVPP), CAS.

On June 24, China formally started a large-scale rescue excavation project at Zhoukoudian, 50 kilometres southwest of Beijing's city center, 72 years after the last excavation effort of comparable scale in 1937.

Anthropologists have unearthed nearly one thousand vertebrate fossils, most of which are of small rodents, insectivores and birds. The finds also include broken teeth and limb bone fossils of large and medium-sized animals.

During the excavation, researchers also found 5 steinkerns, 37 flake tools, 5 hammerstones, 5 scrapers, 2 choppers, 6 broken stone tools of evidently man-made character, and 118 worked stones that they reckon are artificial.

The Chinese Academy of Sciences


Stargazers were in for a treat several nights ago: the planet Earth passed through the debris trail of comet Swift-Tuttle, producing what astronomers call the Perseid meteor shower.

Dartmouth College geography professor emeritus Vincent H. Malmström theorized in 1973 that the shooting stars an ancient Native American tribe saw in the sky thousands of years ago were a sign that something important was about to happen.

"The shooting stars that will be observed this evening are part of a recurring celestial phenomenon that heralded the beginning of recorded time in America exactly 3,367 years ago tonight, on August 13, -1358 (1359 B.C.)," said Malmström on August 13.

The Swift-Tuttle comet passes the Earth once every 130 years; it last did so in 1992. The Zoque, a Native American tribe in what is now southern Mexico, first noted it and initiated the earliest calendar in the Americas. The following day at noon, the sun passed directly overhead at their principal site, now known to archaeologists as Izapa, giving rise to a 260-day calendar that became the time-count subsequently adopted by most of the early peoples of Mesoamerica, including the Mayas and the Aztecs.

Malmström's book on the Mesoamerican calendar, "Cycles of the Sun, Mysteries of the Moon", was published by the University of Texas Press in 1997 and in 2008, using NASA data, he demonstrated how the Mayan people learned to predict lunar eclipses.

Dartmouth College



A team of scientists has found a new planet which orbits the wrong way around its host star. The planet, named WASP-17, and orbiting a star 1000 light years away, was found by the UK's WASP project in collaboration with Geneva Observatory. The discovery, which casts new light on how planetary systems form and evolve, is being announced in a paper submitted to Astrophysical Journal.

Since planets form out of the same swirling gas cloud that creates a star, they are expected to orbit in the same direction that the star spins. Graduate students David Anderson, of Keele University, and Amaury Triaud, of Geneva Observatory, were surprised to find that WASP-17 is orbiting the wrong way, making it the first planet known to have a “retrograde” orbit. The likely explanation is that WASP-17 was involved in a near collision with another planet early in its history.

WASP-17 appears to have been the victim of a game of planetary billiards, flung into its unusual orbit by a close encounter with a “big brother” planet. Professor Coel Hellier, of Keele University, remarks: "Shakespeare said that two planets could no more occupy the same orbit than two kings could rule England; WASP-17 shows that he was right.”

David Anderson added “Newly formed solar systems can be violent places. Our own moon is thought to have been created when a Mars-sized planet collided with the recently formed Earth and threw up a cloud of debris that turned into the moon. A near collision during the early, violent stage of this planetary system could well have caused a gravitational slingshot, flinging WASP-17 into its backwards orbit.”

The first sign that WASP-17 was unusual was its large size. Though it is only half the mass of Jupiter it is bloated to nearly twice Jupiter's size, making it the largest planet known.

Astronomers have long wondered why some extra-solar planets are far bigger than expected, and WASP-17 points to the explanation. Scattered into a highly elliptical, retrograde orbit, it would have been subjected to intense tides. Tidal compression and stretching would have heated the gas-giant planet to its current, hugely bloated extent. "This planet is only as dense as expanded polystyrene, seventy times less dense than the planet we're standing on", notes Prof. Hellier.
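The quoted density follows from the round numbers in the article: since mean density scales as mass over radius cubed, half of Jupiter's mass spread over twice Jupiter's radius gives one sixteenth of Jupiter's density. The short check below uses textbook values for Jupiter and Earth, not the measured WASP-17 parameters.

```python
# Back-of-the-envelope density check, assuming the article's round
# figures: half Jupiter's mass, twice Jupiter's radius.
# Density scales as mass / radius**3, so the planet should come out
# at 0.5 / 2**3 = 1/16 of Jupiter's mean density.

RHO_JUPITER = 1.33  # g/cm^3, Jupiter's mean density (textbook value)
RHO_EARTH = 5.51    # g/cm^3, Earth's mean density

rho_wasp17 = RHO_JUPITER * 0.5 / 2**3
print(round(rho_wasp17, 3))            # 0.083 g/cm^3, about like polystyrene foam
print(round(RHO_EARTH / rho_wasp17))   # ~66, close to the quoted "seventy times"
```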

Professor Keith Mason, Chief Executive of the Science and Technology Facilities Council, which funded the research, said, “This is a fascinating new find and another triumph for the WASP team. Not only are they locating these far flung and mysterious planets but revealing more about how planetary systems, such as our own Solar System, formed and evolved. The WASP team has proved once again why this project is currently the World's most successful project searching for transiting exoplanets.”

WASP-17 is the 17th new exoplanet (planet outside our solar system) found by the Wide Angle Search for Planets (WASP) consortium of UK universities. The WASP team detected the planet using an array of cameras that monitor hundreds of thousands of stars, searching for small dips in their light when a planet transits in front of them. Geneva Observatory then measured the mass of WASP-17, showing that it was the right mass to be a planet. The WASP-South camera array that led to the discovery of WASP-17 is hosted by the South African Astronomical Observatory.
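The size of the dips those cameras hunt for can be estimated from geometry alone: during a transit, the fraction of starlight blocked is roughly the square of the planet-to-star radius ratio. The radii in this sketch are illustrative round values, not the measured WASP-17 parameters.

```python
# Transit-depth geometry: when a planet crosses its star, the
# fraction of starlight blocked is roughly (R_planet / R_star)**2.

R_SUN = 696_000.0      # km, solar radius
R_JUPITER = 71_492.0   # km, Jupiter's equatorial radius

def transit_depth(r_planet, r_star):
    """Fractional drop in stellar flux during a central transit."""
    return (r_planet / r_star) ** 2

# A Jupiter-sized planet in front of a Sun-like star dims it by ~1 percent.
print(round(transit_depth(R_JUPITER, R_SUN) * 100, 2))      # ~1.06 (percent)

# A planet bloated to twice Jupiter's radius blocks four times as much light.
print(round(transit_depth(2 * R_JUPITER, R_SUN) * 100, 1))  # ~4.2 (percent)
```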

(Photo: NASA/Hubble)

Science and Technology Facilities Council



When bees sting, they pump poison into their victims. Now the toxin in bee venom has been harnessed to kill tumor cells by researchers at Washington University School of Medicine in St. Louis. The researchers attached the major component of bee venom to nano-sized spheres that they call nanobees.

In mice, nanobees delivered the bee toxin melittin to tumors while protecting other tissues from the toxin's destructive power. The mice's tumors stopped growing or shrank. The nanobees' effectiveness against cancer in the mice is reported in advance online publication Aug. 10 in the Journal of Clinical Investigation.

"The nanobees fly in, land on the surface of cells and deposit their cargo of melittin which rapidly merges with the target cells," says co-author Samuel Wickline, M.D., who heads the Siteman Center of Cancer Nanotechnology Excellence at Washington University. "We've shown that the bee toxin gets taken into the cells where it pokes holes in their internal structures."

Melittin is a small protein, or peptide, that is strongly attracted to cell membranes, where it can form pores that break up cells and kill them.

"Melittin has been of interest to researchers because in high enough concentration it can destroy any cell it comes into contact with, making it an effective antibacterial and antifungal agent and potentially an anticancer agent," says co-author Paul Schlesinger, M.D., Ph.D., associate professor of cell biology and physiology. "Cancer cells can adapt and develop resistance to many anticancer agents that alter gene function or target a cell's DNA, but it's hard for cells to find a way around the mechanism that melittin uses to kill."

The scientists tested nanobees in two kinds of mice with cancerous tumors. One mouse breed was implanted with human breast cancer cells and the other with melanoma tumors. After four to five injections of the melittin-carrying nanoparticles over several days, growth of the mice's breast cancer tumors slowed by nearly 25 percent, and the size of the mice's melanoma tumors decreased by 88 percent compared to untreated tumors.

The researchers indicate that the nanobees gathered in these solid tumors because tumors often have leaky blood vessels and tend to retain material. Scientists call this the enhanced permeability and retention effect of tumors, and it explains how certain drugs concentrate in tumor tissue much more than they do in normal tissues.

But the researchers also developed a more specific method for making sure nanobees go to tumors and not healthy tissue by loading the nanobees with additional components. When they added a targeting agent that was attracted to growing blood vessels around tumors, the nanobees were guided to precancerous skin lesions that were rapidly increasing their blood supply. Injections of targeted nanobees reduced the extent of proliferation of precancerous skin cells in the mice by 80 percent.

Overall, the results suggest that nanobees could not only lessen the growth and size of established cancerous tumors but also act at early stages to prevent cancer from developing.

"Nanobees are an effective way to package the useful, but potentially deadly, melittin, sequestering it so that it neither harms normal cells nor gets degraded before it reaches its target," Schlesinger says.

If a significant amount of melittin were injected directly into the bloodstream, widespread destruction of red blood cells would result. The researchers showed that nanoparticles protected the mice's red cells and other tissues from the toxic effects of melittin. Nanobees injected into the bloodstream did not harm the mice. They had normal blood counts, and tests for the presence of blood-borne enzymes indicative of organ damage were negative.

When secured to the nanobees, melittin is safe from protein-destroying enzymes that the body produces. Although unattached melittin was cleared from the mice's circulation within minutes, half of the melittin on nanobees was still circulating 200 minutes later. Schlesinger indicates that is long enough for the nanobees to circulate through the mice's bloodstream 200 times, giving them ample time to locate tumors.

"Melittin is a workhorse," says Wickline, also professor of medicine in the Cardiovascular Division and professor of physics, of biomedical engineering and of cell biology and physiology. "It's very stable on the nanoparticles, and it's easily and cheaply produced. We are now using a nontoxic part of the melittin molecule to hook other drugs, targeting agents or imaging compounds onto nanoparticles."

The core of the nanobees is composed of perfluorocarbon, an inert compound used in artificial blood. The research group developed perfluorocarbon nanoparticles several years ago and have been studying their use in various medical applications, including diagnosis and treatment of atherosclerosis and cancer. About six millionths of an inch in diameter, the nanoparticles are large enough to carry thousands of active compounds, yet small enough to pass readily through the bloodstream and to attach to cell membranes.

"We can add melittin to our nanoparticles after they are built," Wickline says. "If we've already developed nanoparticles as carriers and given them a targeting agent, we can then add a variety of components using native melittin or melittin-like proteins without needing to rebuild the carrier. Melittin fortunately goes onto the nanoparticles very quickly and completely and remains on the nanobee until cell contact is made."

The flexibility of nanobees and other nanoparticles made by the group suggests they could be readily adapted to fit medical situations as needed. The ability to attach imaging agents to nanoparticles means that the nanoparticles can give a visible indication of how much medication gets to tumors and how tumors respond.

"Potentially, these could be formulated for a particular patient," Schlesinger says. "We are learning more and more about tumor biology, and that knowledge could soon allow us to create nanoparticles targeted for specific tumors using the nanobee approach."

(Photo: WUSTL)

Washington University School of Medicine



Beneath northern India’s irrigated fields of wheat, rice, and barley ... beneath its densely populated cities of Jaipur and New Delhi, the groundwater has been disappearing. Halfway around the world, hydrologists, including Matt Rodell of NASA, have been hunting for it.

Where is northern India’s underground water supply going? According to Rodell and colleagues, it is being pumped and consumed by human activities -- principally to irrigate cropland -- faster than the aquifers can be replenished by natural processes. They based their conclusions -- published in the August 20 issue of Nature -- on observations from NASA’s Gravity Recovery and Climate Experiment (GRACE).

"If measures are not taken to ensure sustainable groundwater usage, consequences for the 114 million residents of the region may include a collapse of agricultural output and severe shortages of potable water," said Rodell, who is based at NASA’s Goddard Space Flight Center in Greenbelt, Md.

Groundwater comes from the natural percolation of precipitation and other surface waters down through Earth’s soil and rock, accumulating in aquifers -- cavities and layers of porous rock, gravel, sand, or clay. In some of these subterranean reservoirs, the water may be thousands to millions of years old; in others, water levels decline and rise again naturally each year.

Groundwater levels do not respond to changes in weather as rapidly as lakes, streams, and rivers do. So when groundwater is pumped for irrigation or other uses, recharge to the original levels can take months or years.

Changes in underground water masses affect gravity enough to provide a signal, such that changes in gravity can be translated into a measurement of an equivalent change in water.
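As a minimal sketch of that translation, a mass change inferred from gravity can be expressed as an "equivalent water thickness": the depth of a uniform water layer holding the same mass spread over the region. The numbers below are invented for illustration, not GRACE results.

```python
# Convert a gravity-derived mass change into an equivalent water
# thickness: the depth of a uniform water layer with that mass
# spread over the study region.

RHO_WATER = 1000.0  # kg/m^3, density of water

def water_equivalent_thickness_m(mass_change_kg, area_m2):
    """Depth (m) of a uniform water layer with the given mass over the area."""
    return mass_change_kg / (RHO_WATER * area_m2)

# Hypothetical example: 100 km^3 of water lost over a 440,000 km^2
# region works out to roughly a 0.23 m drop in water-equivalent height.
mass_kg = 100e9 * RHO_WATER   # 100 km^3 = 1e11 m^3 of water
area_m2 = 440_000 * 1e6       # km^2 -> m^2
print(round(water_equivalent_thickness_m(mass_kg, area_m2), 2))  # 0.23
```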

"Water below the surface can hide from the naked eye, but not from GRACE," said Rodell. The twin satellites of GRACE can sense tiny changes in Earth’s gravity field and associated mass distribution, including water masses stored above or below Earth’s surface. As the satellites orbit 300 miles above Earth's surface, their positions change -- relative to each other -- in response to variations in the pull of gravity. The satellites fly roughly 137 miles apart, and microwave ranging systems measure every microscopic change in the distance between the two.

With previous research in the United States having proven the accuracy of GRACE in detecting groundwater, Rodell and colleagues Isabella Velicogna, of NASA’s Jet Propulsion Laboratory and the University of California-Irvine, and James Famiglietti, of UC-Irvine, were looking for a region where they could apply the new technique.

"Using GRACE satellite observations, we can observe and monitor water changes in critical areas of the world, from one month to the next, without leaving our desks," said Velicogna. "These satellites provide a window to underground water storage changes."

The northern Indian states of Rajasthan, Punjab and Haryana have all of the ingredients for groundwater depletion: staggering population growth, rapid economic development and water-hungry farms, which account for about 95 percent of groundwater use in the region.

Data provided by India's Ministry of Water Resources suggested groundwater use was exceeding natural replenishment, but the regional rate of depletion was unknown. Rodell and colleagues had their case study. The team analyzed six years of monthly GRACE gravity data for northern India to produce a time series of water storage changes beneath the region’s land surface.

They found that groundwater levels have been declining by an average of one meter every three years (one foot per year). More than 109 cubic km (26 cubic miles) of groundwater disappeared between 2002 and 2008 -- double the capacity of India's largest surface water reservoir, the Upper Wainganga, and triple that of Lake Mead, the largest man-made reservoir in the United States.
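A quick unit check on these figures, using the standard mile-to-kilometre and foot-to-metre conversions:

```python
# Sanity-check the article's unit conversions.

KM_PER_MILE = 1.609344   # exact international mile
M_PER_FOOT = 0.3048      # exact international foot

def km3_to_mi3(v_km3):
    """Convert a volume in cubic kilometres to cubic miles."""
    return v_km3 / KM_PER_MILE ** 3

# 109 cubic km is indeed about 26 cubic miles.
print(round(km3_to_mi3(109)))       # 26

# And one metre over three years is close to the quoted foot per year.
print(round(1.0 / 3 / M_PER_FOOT, 2))  # 1.09 feet per year
```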

"We don’t know the absolute volume of water in the Northern Indian aquifers, but GRACE provides strong evidence that current rates of water extraction are not sustainable," said Rodell. "The region has become dependent on irrigation to maximize agricultural productivity, so we could be looking at more than a water crisis."

The loss is particularly alarming because it occurred when there were no unusual trends in rainfall. In fact, rainfall was slightly above normal for the period.

The researchers examined data and models of soil moisture, lake and reservoir storage, vegetation and glaciers in the nearby Himalayas, in order to confirm that the apparent groundwater trend was real. Nothing unusual showed up in the natural environment. The only influence they couldn’t rule out was human.

"At its core, this dilemma is an age-old cycle of human need and activity -- particularly the need for irrigation to produce food," said Bridget Scanlon, a hydrologist at the Jackson School of Geosciences at the University of Texas in Austin. "That cycle is now overwhelming fresh water reserves all over the world. Even one region’s water problem has implications beyond its borders."

"For the first time, we can observe water use on land with no additional ground-based data collection," Famiglietti said. "This is critical because in many developing countries, where hydrological data are both sparse and hard to access, space-based methods provide perhaps the only opportunity to assess changes in fresh water availability across large regions."

(Photo: NASA/Matt Rodell)




