Friday, August 20, 2010



Unfortunately, when ancient kings sent letters to each other, their post offices didn't record the sender's return address. It takes quite a bit of super-sleuthing by today's archaeologists to determine the geographical origin of this correspondence — which can reveal a great deal about ancient rulers and civilizations.

Now, by adapting an off-the-shelf portable X-ray lab tool that analyzes chemical composition, Prof. Yuval Goren of Tel Aviv University's Department of Archaeology and Ancient Near Eastern Civilizations can reveal hidden information about a tablet's composition without damaging the precious ancient find itself. These X-rays reveal the soil and clay composition of a tablet or artefact, helping to determine its precise origin.

But Prof. Goren's process, based on X-ray fluorescence (XRF) spectrometry, can go much further. Over the years, he has collected extensive data through physical "destructive" sampling of artefacts. By comparing this data to readouts produced by the XRF device, he has built a table of results so that he can now scan a tablet — gently touching its surface with the machine — and immediately assess its clay type and the geographical origin of its minerals.
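As an illustration of the kind of comparison involved, the sketch below matches a hypothetical XRF readout against a small reference table of clay types. All names, elements and values here are invented for the example — they are not Prof. Goren's actual data — and a real analysis would involve many more elements and careful statistical calibration.

```python
# Illustrative only: match a non-destructive XRF readout against a reference
# table built from earlier destructive sampling. Clay types, element choices
# and compositions are hypothetical.
import math

# Reference table: clay type -> mean elemental composition (weight %)
REFERENCE = {
    "Terra Rossa (Central Hill Country)": {"Fe": 6.1, "Ca": 3.2, "K": 1.8, "Ti": 0.9},
    "Nile alluvium (Egypt)":              {"Fe": 8.4, "Ca": 2.1, "K": 1.2, "Ti": 1.6},
    "Coastal rendzina":                   {"Fe": 3.0, "Ca": 9.5, "K": 1.1, "Ti": 0.4},
}

def classify(readout):
    """Return the reference clay type whose composition is closest to the
    XRF readout (smallest Euclidean distance over the shared elements)."""
    def distance(ref):
        return math.sqrt(sum((readout[e] - ref[e]) ** 2 for e in ref))
    return min(REFERENCE, key=lambda name: distance(REFERENCE[name]))

# A scan of a tablet fragment, expressed in the same units as the table.
tablet_scan = {"Fe": 5.9, "Ca": 3.5, "K": 1.7, "Ti": 0.8}
print(classify(tablet_scan))  # nearest entry in the reference table
```

The point of the sketch is the workflow, not the numbers: once the destructive-sampling results are tabulated, each new non-destructive scan reduces to a nearest-match lookup.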

The tool, he says, can also be applied to coins, ancient plasters, and glass, and can be used on site or in a lab. He plans to make this information widely available to other archaeological researchers.

Prof. Goren's field intersects the worlds of geology, mineralogy and ancient technology as he tries to understand where ancient tablets and pots were made, based on the crystals and minerals found in the materials of these artefacts.

Traditionally, archaeological scientists have had to take small samples of an artefact — a chip or a slice — in order to analyze its soil and clay composition. But as more and more museums and archaeology sites ban these destructive means of investigating archaeological finds, Prof. Goren's new tool may help preserve artefacts while solving some of archaeology's deepest mysteries.

"It's become a big ethical question," says Prof. Goren. "Many museums will not allow any more physical sampling of artefacts, and it's especially problematic for small tablet fragments and stamps which cannot be broken in the process. I had to find another way to know what these artefacts were made of."

In his recent study published in the Israel Exploration Journal, Prof. Goren and his colleagues investigated a Late Bronze Age letter written in the Akkadian language and found among the Ophel excavations in Jerusalem.

Its style suggests that it is a rough contemporary of the Amarna letters — letters written from officials throughout the Middle East to the Pharaohs in Egypt around 3,500 years ago, in pre-biblical times. Using his device, Prof. Goren was able to determine that the letter is made from raw material typical of the Terra Rossa soils of the Central Hill Country around Jerusalem. This determination helped to confirm both the origin of the letter and possibly its sender.

"We believe this is a local product written by Jerusalem scribes, made of locally available soil. Found close to an acropolis, it is also likely that the letter fragment does in fact come from a king of Jerusalem," the researchers reported, adding that it may well be an archival copy of a letter from King Abdi-Heba, a Jebusite king in Jerusalem, to the Pharaoh in nearby Egypt.

Prof. Goren is also an expert at uncovering archaeological forgeries and has worked on the alleged ossuary, or bone box, of Jesus' brother James.

(Photo: TAU)

Tel Aviv University


New research published Friday, 6 August, in IOP Publishing's Bioinspiration & Biomimetics, describes how researchers from the Center of Excellence 'Cognitive Interaction Technology' at Bielefeld University, Germany, have built an artificial bee eye, complete with a fully functional camera, to shed light on the insects' complex sensing, processing and navigational skills.

Consisting of a light-weight mirror-lens combination attached to a USB video camera, the artificial eye manages to achieve a field of vision comparable to that of a bee. By combining a curved reflective surface built into acrylic glass with lenses covering the frontal field, the bee eye camera has allowed the researchers to take unique images showing the world from an insect's viewpoint.

In the future, the researchers hope to include UV to fully reflect a bee's colour vision, which is important to honeybees for flower recognition and discrimination and also polarisation vision, which bees use for orientation. They also hope to incorporate models of the subsequent neural processing stages.

As the researchers write, "Despite the discussed limitations of our model of the spatial resolution of the honeybees compound eyes, we are confident that it is useful for many purposes, e.g. for the simulation of bee-like agents in virtual environments and, in combination with presented imaging system, for testing bee-inspired visual navigation strategies on mobile robots."

IOP Science


The way that humanity reacts to climate change may do more damage to many areas of the planet than climate change itself unless we plan properly, concludes an important new study published in Conservation Letters by Conservation International's Will Turner and a group of other leading scientists.

The paper, 'Climate change: helping nature survive the human response', looks at both efforts to reduce emissions of greenhouse gases and potential actions people could take to adapt to a changed climate, and assesses the impact these could have on global ecosystems.

In particular it notes that one fifth of the world's remaining tropical forests lie within 50km of human populations that could be inundated if sea levels rise by 1m. These forests would make attractive sources of fuel-wood, building materials, food and other key resources and would be likely to attract a population forced to migrate by rising sea levels. About half of all Alliance for Zero Extinction sites – which contain the last surviving members of certain species – are also in these zones.

Dr Turner said: "There are numerous studies looking at the impacts of climate change on biodiversity, but very little time has been taken to consider what our responses to climate change might do to the planet."

The paper notes that efforts to reduce greenhouse gas emissions by constructing dams for hydropower generation can cause substantial damage to key freshwater ecosystems as well as to the flora and fauna in the flooded valleys. It also notes that the generally bogus concept that biofuels reduce carbon emissions is still being used as a justification for the felling of large swathes of biodiverse tropical forests.

The report also reviews studies examining the complex series of outcomes in historical examples of climate change and environmental degradation, and humanity's efforts to adapt to changing circumstances. Migration caused in part by climatic instability in Burkina Faso in the late 20th century, for example, led to a 13 per cent decline in forest cover as areas were cleared for agriculture, and a decline in fish supplies in Ghana may have led to a significant increase in bushmeat hunting.

Dr Turner added: "If we don't take a look at the whole picture, but instead choose to look only at small parts of it we stand to make poor decisions about how to respond that could do more damage than climate change itself to the planet's biodiversity and the ecosystem services that help to keep us all alive.

"While the tsunami in 2004 was not a climate event, many of the responses that it stimulated are comparable with how people will react to extreme weather events – and the damage that the response to the tsunami did to many of Aceh province's important ecosystems, through extraction of timber and other building materials and poor choices of building locations, should be a lesson to us all."

Although the challenge of sustaining biodiversity in the face of climate change seems daunting, the paper notes that we must – and can – rise to the challenge.

Turner adds: "Climate change mitigation and adaptation are essential. We have to ensure that these responses do not compromise the biodiversity and ecosystem services upon which societies ultimately depend. We have to reduce emissions, we have to ensure the stability of food supplies jeopardized by climate change, we have to help people survive severe weather events – but we must plan these things so that we don't destroy life-sustaining forests, wetlands, and oceans in the process."

The paper concludes that there are many ways of ensuring that the human response to climate change delivers the best possible outcomes for both society and the environment, and notes that maintaining and restoring natural habitats are among the cheapest, safest and easiest solutions at our disposal to reduce greenhouse-gas emissions and help people adapt to unavoidable changes.

Dr Turner said: "Providing a positive environmental outcome is often the best way to ensure the best outcome for people. If we are sensible, we can help people and nature cope with climate change together; if we are not, it will cause suffering for people and serious problems for the environment."



A research study by scientists at the Babraham Institute, an institute of BBSRC, into the effects of electromagnetic radiation on cells has today been published in the online journal PLoS One, revealing that acute (30-minute) exposure to GSM radiation has no effect on calcium transport inside biological cells.

The research project, part of the independently managed Mobile Telecommunications and Health Research (MTHR) programme funded by the Department of Health and industry, aimed to establish whether the digital pulsed radiofrequency from modern GSM mobile phones influenced calcium levels in cells. Calcium signals are at the heart of many cell communication processes throughout our lives, controlling how we grow and develop into healthy adults. Due to this central role of calcium, interference with its regulation by electromagnetic radiation could have serious consequences on cell physiology and human health.

The extensive, high-sensitivity analyses carried out in this study revealed that GSM mobile phone-type radiation had no effect on calcium signalling processes in three important types of cells that had previously been suggested to be sensitive to radiofrequency fields: two neuronal cell types and a cell type that surrounds blood vessels.

Although the way in which electromagnetic fields could affect human biology is not established, it has been suggested that the types of radiofrequencies emitted by mobile phones may interfere with calcium signalling processes in cells; as calcium ions have a 'charge' they would be a natural candidate to interact with electromagnetic radiation. Apart from its role in forming teeth and bones, calcium is a vital messenger inside cells, affecting many cell processes including muscle contraction, fertilisation and neural activity as well as switching genes on and off at critical times during development. Sometimes these calcium signals can be altered or activated at the wrong time, causing harmful changes in cell behaviour. Problems with calcium levels in the body underlie numerous health problems including high blood pressure, heart failure, cancer and bipolar disorders.

The research, carried out at the Babraham Institute, an institute of the BBSRC, investigated whether acute radiofrequency radiation emitted by mobile phones caused changes in calcium ions inside cells from the brain and blood vessels. A technique known as 'fluorescence imaging' was used to assess whether radiation akin to that emitted by mobile phones influenced cell signalling in mammalian cells growing in the laboratory. The behaviour of thousands of cells exposed to mobile phone-type radiation for periods of 30 minutes, a realistic time for a phone call, was individually analysed. Different types of cells were used to see if there was any difference in sensitivity. In addition, the team looked at the effect of radiation levels on resting levels of calcium inside the cells and also on calcium signals that were deliberately triggered by the addition of a hormone or other stimulus.
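The logic of such a per-cell comparison can be sketched in miniature: many individually measured cells in an exposed group and a sham group, compared on resting calcium level. The data, units and group sizes below are simulated purely for illustration and have no connection to the study's actual measurements.

```python
# Illustrative only: compare simulated resting calcium fluorescence between
# an "exposed" and a "sham" group of cells. Both groups are drawn from the
# same distribution, modelling the study's finding of no effect.
import math
import random
import statistics

random.seed(0)

# Simulated resting calcium fluorescence (arbitrary units), 1,000 cells per group.
sham = [random.gauss(100, 10) for _ in range(1000)]
exposed = [random.gauss(100, 10) for _ in range(1000)]

diff = statistics.mean(exposed) - statistics.mean(sham)
# Standard error of the difference between the two group means.
sem = math.sqrt(statistics.variance(sham) / 1000
                + statistics.variance(exposed) / 1000)

# With no true effect, the observed difference stays within a few standard errors.
print(f"difference = {diff:.3f} +/- {sem:.3f}")
```

With a thousand cells per group, the standard error of the mean difference shrinks well below one fluorescence unit, which is why large per-cell samples make even small effects detectable — and make a null result informative.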

The Babraham Institute is a centre for studying the basic biology of signalling inside and between cells, supporting the BBSRC's mission to drive advances in fundamental bioscience for better health and improved quality of life across the life course, reducing the need for medical and social intervention. Dr Martin Bootman, Group Leader in Molecular Signalling at the Babraham Institute explained, "Our very sensitive equipment is able to detect even tiny changes in cell behaviour. However, we did not find any significant effect of radiofrequency exposure on cellular calcium, even with the highest radiation power which exceeds typical mobile phone emissions. This study indicates that calcium within cells is not acutely affected by mobile phone-type emissions. If individuals are sensitive to electromagnetic radiation it would have to be through a mechanism that does not involve changes in calcium transport."

Rod O'Connor, lead author of the paper commented, "The automated imaging system we constructed for these experiments allowed us to detect even the slightest change in calcium. We collaborated with some of the best physicists and engineers in the world to design a system to expose cells to mobile phone radiation. We worked very hard to make sure we had the dosimetry right and were very open-minded to what we might find. If short-term exposure to GSM radiation caused a change in calcium in cells, it would have been seen in these experiments. However, we were unable to find any evidence that GSM exposure had any influence on cellular calcium signals in these studies."




A new discovery of fossilised footprints reveals when reptiles first conquered dry land.

The 318-million-year-old reptile footprints were found in sea-cliffs on the Bay of Fundy, New Brunswick, Canada. They show that reptiles were the first vertebrates (animals with a backbone) to conquer dry continental interiors. These pioneers paved the way for the diverse ecosystems that exist on land today.

The footprints were discovered by Dr Howard Falcon-Lang of Royal Holloway, University of London. The results of his study, undertaken with Professor Mike Benton of the University of Bristol and Canadian colleagues, are published today in the journal Palaeogeography, Palaeoclimatology, Palaeoecology.

It has long been suspected that reptiles were the first to make the continental interiors their home, because, unlike their amphibian cousins, reptiles do not need to return to water to breed. The new discovery of footprints proves this theory. The rocks in which they occur show that the reptiles lived on dry river plains hundreds of miles from the sea.

Professor Benton said: “The footprints date from the Carboniferous Period when a single supercontinent (Pangaea) dominated the world. At first life was restricted to coastal swamps where lush rainforest existed, full of giant ferns and dragonflies. However, when reptiles came on the scene they pushed back the frontiers, conquering the dry continental interiors.”

The same team reported the oldest known reptile footprints from a different site in New Brunswick in 2007. The new discovery is of similar age, and may be even older.

Dr Falcon-Lang added: “The Bay of Fundy is such an amazing place to hunt for fossils. The sea-cliffs are rapidly eroding and each rock-fall reveals exciting new fossils. You just never know what will turn up next.”

(Photo: Bristol U.)

University of Bristol



Urban design is making us fat and needs to work harder to be healthier, warns a leading academic.

Tim Townshend, director of planning and urban design at Newcastle University, says decades of creating car-focused urban environments is beginning to show in our waistlines.

“Our urban landscape is full of shopping malls and fast food restaurants, escalators and huge car parks with people battling to get the space closest to the doors so they don’t have to walk very far,” he said. “These environments are simply not designed for people to walk around in.

“We need to think seriously about what kind of environment we are creating for ourselves and have a sensible debate about what’s acceptable and what’s not in our towns and cities. Health needs to be back on the town planning agenda before it’s too late.”

With UK rates of obesity predicted to rise to half the population by 2050, there’s not much time left to reverse the trend.

There are many well-documented factors that influence obesity. At its simplest level, it is caused by eating too much and not getting enough physical activity, but obesity is actually an extremely complex issue.

Our built environment and how it allows, or prevents, us from taking healthy and unhealthy lifestyle choices is now recognised as an area we know too little about.

Many of the examples in Mr Townshend's co-edited book Obesogenic Environments: complexities, perceptions and objective measures come from the USA and Australia, where there are more low-density, car-orientated suburbs - often referred to as ‘urban sprawl’ - which have become a focus of concern in obesity research.

“Although we’re not as extreme as these countries we’re still making some of the same mistakes and some different ones too,” said Mr Townshend. “We don’t tend to build very low density suburbs but we do go for lots of houses without any local services and poor transport links which force people into their cars.”

However, it is possible to ‘build’ more active spaces into our towns and cities and avoid creating these environments, allowing people to take exercise almost without noticing.

Recent attempts to make everything level to improve accessibility have also meant people get less exercise, but Mr Townshend suggests you can easily cater for everyone’s needs if you design well.

Green spaces and street trees encourage more walking, whereas graffiti, litter and poorly maintained areas deter pedestrians. “We need to provide more green spaces – how many new parks do we build? Obesity is the biggest social and health problem we face and it will take a holistic approach to create new, healthier neighbourhoods, with health professionals working alongside planners, designers and policy makers,” he said.

Co-editor Dr Amelia Lake, Northumbria University, supports this approach. “Our research shows that it is as much the responsibility of an urban designer as it is a nutritionist to reverse the obesity trend.

“It’s not just down to individual choices - society has to create an environment where people have healthier alternatives. The current situation is that the unhealthy option is the easy one. We need to reverse that and create environments where healthy food is the easier, affordable and most accessible option.”

The researchers suggest that this important issue reflects many of the things already being talked about in terms of sustainability, such as reducing car use, creating local food networks and being more active through cycling and walking.

“In many ways it’s a win-win situation as we’re not talking about costly new ways of going about how we develop towns and cities in the future,” said Mr Townshend. “Planners already have the power to make a difference. For example, there is a special planning category for fast food restaurants so you can avoid having a street full of them, or resist placing them near schools or leisure centres. It just takes a bit of thought at the beginning of the planning process.”

One of the concerns is that when local authorities are already stretched and having to make cutbacks, they may be under more pressure to approve unsuitable housing developments.

“Developers don’t like providing services such as shops or schools because they’re complicated and the returns aren’t as high as they can get from housing alone,” he said. “Local developers need to work with planners to provide sustainable, healthier local solutions rather than simply rolling out housing developments which look like they’re off the same production line wherever you are in the country.”

(Photo: Newcastle U.)

Newcastle University


A team of Harvard physicists led by Mikhail D. Lukin has achieved the first-ever quantum entanglement of photons and solid-state materials. The work marks a key advance toward practical quantum networks, as the first experimental demonstration of a means by which solid-state quantum bits, or "qubits," can communicate with one another over long distances.

Quantum networking applications such as long-distance communication and distributed computing would require the nodes that process and store quantum data in qubits to be connected to one another by entanglement, a state where two different atoms become indelibly linked such that one inherits the properties of the other.

"In quantum computing and quantum communication, a big question has been whether or how it would be possible to actually connect qubits, separated by long distances, to one another," says Lukin, professor of physics at Harvard and co-author of a paper describing the work in this week's issue of the journal Nature. "Demonstration of quantum entanglement between a solid-state material and photons is an important advance toward linking qubits together into a quantum network."

Quantum entanglement has previously been demonstrated only with photons and individual ions or atoms.

"Our work takes this one step further, showing how one can engineer and control the interaction between individual photons and matter in a solid-state material," says first author Emre Togan, a graduate student in physics at Harvard. "What's more, we show that the photons can be imprinted with the information stored in a qubit."

Quantum entanglement, famously termed "spooky action at a distance" by a skeptical Albert Einstein, is a fundamental property of quantum mechanics. It allows one to distribute quantum information over tens of thousands of kilometers, limited only by how fast and how far members of the entangled pair can propagate in space.

The new result builds upon earlier work by Lukin's group to use single atom impurities in diamonds as qubits. Lukin and colleagues have previously shown that these impurities can be controlled by focusing laser light on a diamond lattice flaw where nitrogen replaces an atom of carbon. That previous work showed that the so-called spin degrees of freedom of these impurities make excellent quantum memory.

Lukin and his co-authors now say that these impurities are also remarkable because, when excited with a sequence of finely tuned microwave and laser pulses, they can emit photons one at a time, such that photons are entangled with quantum memory. Such a stream of single photons can be used for secure transmission of information.

"Since photons are the fastest carriers of quantum information, and spin memory can robustly store quantum information for relatively long periods of time, entangled spin-photon pairs are ideal for the realization of quantum networks," Lukin says. "Such a network, a quantum analog to the conventional internet, could allow for absolutely secure communication over long distances."

Harvard University


Many neuroscientists believe the loss of the brain region known as the amygdala would result in the brain's inability to form new memories with emotional content. New UCLA research indicates this is not so and suggests that when one brain region is damaged, other regions can compensate.

The research appears in the early online edition of the journal Proceedings of the National Academy of Sciences (PNAS).

"Our findings show that when the amygdala is not available, another brain region called the bed nuclei can compensate for the loss of the amygdala," said the study's senior author, Michael Fanselow, a UCLA professor of psychology and a member of the UCLA Brain Research Institute.

"The bed nuclei are much slower at learning, and form memories only when the amygdala is not learning," he said. "However, when you do not have an amygdala, if you have an emotional experience, the bed nuclei show neural plasticity (the memory-forming ability of brain cells) and spring into action. Normally, it is as if the amygdala says, 'I'm doing my job, so you shouldn't learn.' With the amygdala gone, the bed nuclei do not receive that signal and are freed to learn."

The amygdala is believed to be critical for learning about and storing the emotional aspects of experience, Fanselow said, and it also serves as an alarm to activate a cascade of biological systems to protect the body in times of danger. The bed nuclei are a set of forebrain gray matter surrounding the stria terminalis; neurons here receive information from the prefrontal cortex and hippocampus and communicate with several lower brain regions that control stress responses and defensive behaviors.

"Our results suggest some optimism that when a particular brain region that is thought to be essential for a function is lost, other brain regions suddenly are freed to take on the task," Fanselow said. "If we can find ways of promoting this compensation, then we may be in a better position to help patients who have lost memory function due to brain damage, such as those who have had a stroke or have Alzheimer's disease.

"Perhaps this research can eventually lead to new drugs and teaching regimens that facilitate plasticity in the regions that have the potential to compensate for the damaged areas," he said.

While the current study shows this relationship for emotional learning, additional research in Fanselow's laboratory is beginning to suggest this is a general property of memory.

University of California



Imagine a material that is tougher than Kevlar or steel, yet remarkably flexible. It's something you can easily find in your attic or a lingerie store. It's as instantly recognizable today as it was to our early ancestors, yet we still aren't sure exactly how it's made.

The miracle thread in question is natural silk, the ubiquitous fiber made by spiders and silkworms, which has been used throughout history for items ranging from stockings and parachutes to surgical sutures. Today scientists and engineers are creating a number of useful materials based on silk research. But many researchers believe these applications may just be the start of a whole web of useful new products and devices, if only we had a better understanding of just how these small creatures spin their precious thread.

In recent years, researchers have worked to gain a better understanding of what silk is and how it's made, with the goal of being able to consistently replicate and enhance its production synthetically. In the July 30 edition of the journal Science, two Tufts University researchers, Fiorenzo G. Omenetto and David L. Kaplan, review the state of silk research, the challenges that remain, and why synthetic silk production is so appealing.

According to Omenetto and Kaplan, scientists understand that silk is "a relatively simple protein processed from water." Research has established what those proteins are, and they have determined that the properties of silk can vary a great deal depending on factors such as the outside temperature, how fast the silk is spun, and the exact type of silk created.

But no one knows how exactly the spiders and silkworms actually make silk. Scientists have determined they don't secrete the stuff, but instead pull it out of special glands in very specific ways. Spiders, for example, pull it with their legs, while silkworms perform a 'figure eight' dance with their heads to create the silk threads. Despite this knowledge, Omenetto and Kaplan write, "there are still significant knowledge gaps in understanding how to reverse-engineer silk protein fibers."

The spiders and silkworms have also figured out another neat trick that, according to Omenetto and Kaplan, still evades the capabilities of their would-be mechanical copycats. When scientists try to store silk proteins in the lab, they find they must do so under exacting conditions, or the material will quickly begin to crystallize. Nature's silk makers, on the other hand, don't seem to have this problem. They can store the raw silk materials internally at a variety of temperatures for days and even weeks without encountering the crystallization problem, and at this point in time, the authors write, no one is sure how they do it.

One goal of silk research, Omenetto and Kaplan write, is to find a way to genetically engineer other organisms to produce custom-designed silk proteins that could then be used to produce synthetic silk for specific purposes on a large scale. This has led to genetically modified mushrooms, bacteria and even goats that are able to produce silk protein, yet none of the actual silk produced from these modified organisms matches the qualities of the stuff produced by spiders and silkworms. Once these issues are overcome, however, Omenetto and Kaplan believe that someday plants could be modified to produce silk as a crop, like cotton is harvested today.

So why all of this focus on silk? Omenetto and Kaplan say that figuring out how to replicate and modify silk could lead to new breakthroughs in medicine, among other fields. Although silk is used in sutures today, the authors explain, it has to be coated in wax, which prevents the sutures from being gradually absorbed into the body. Modified silks could be wax free, Omenetto and Kaplan write, and could be used to safely administer drugs within the body or even create "degradable and flexible electronic displays for improved physiological recording" of a person's body. These and other intriguing possibilities await, Omenetto and Kaplan say, if we can just figure out how exactly the spider spins that web.

(Photo: David Kaplan, Tufts University)

The National Science Foundation



Meteorologists have determined exactly how much carbon dioxide humans can emit into the atmosphere while ensuring that the earth does not heat up by more than two degrees.

The Intergovernmental Panel on Climate Change (IPCC) calculated projected temperature changes for various scenarios in 2007 and researchers at the Max Planck Institute for Meteorology in Hamburg have now gone one step further: they have developed a new model that specifies the maximum volumes of carbon dioxide that humans may emit to remain below the critical threshold for climate warming of two degrees Celsius. To do this, the scientists incorporated into their calculations data relating to the carbon cycle, namely the volume of carbon dioxide absorbed and released by the oceans and forests. The aim of the international ENSEMBLES project is to simulate future changes in the global climate and carbon dioxide emissions and thereby to obtain more reliable threshold values on this basis.

The concentration of carbon dioxide in the atmosphere caused by the combustion of fossil fuels (gas, oil) has increased by around 35 percent since the beginning of the Industrial Revolution. If carbon dioxide emissions and, as a result, atmospheric carbon dioxide concentrations continue to increase unchecked, a drastic increase in the global temperature can be expected before the end of this century. With the help of new models for a prescribed atmospheric carbon dioxide concentration, scientists from all over Europe have now calculated for the first time the extent to which the global carbon dioxide emissions must be reduced to halt global warming.

"What’s new about this research is that we have integrated the carbon cycle into our model to obtain the emissions data," says Erich Roeckner. According to the model, admissible carbon dioxide emissions will increase from approximately seven billion tonnes of carbon in the year 2000 to a maximum value of around ten billion tonnes in 2015. In order to achieve the long-term stabilisation of the atmospheric carbon dioxide concentration, the emissions will then have to be reduced by 56 percent by the year 2050 and approach zero towards the end of this century. Although, based on these calculations, global warming would remain under the two-degree threshold until 2100, further warming may be expected in the long term: "It will take centuries for the global climate system to stabilise," says Erich Roeckner.
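To make these figures concrete, the pathway can be sketched as a simple interpolation between the stated anchor points. This is an illustrative reading of the article's numbers, not the Max Planck model itself; the linear interpolation, and the assumption that "reduced by 56 percent" is measured against the 2015 peak, are both assumptions:

```python
# Illustrative sketch of the admissible-emissions pathway described above.
# Anchor values come from the article; the interpolation between them is
# an assumption, not actual ENSEMBLES model output.
anchors = {
    2000: 7.0,                 # ~7 billion tonnes of carbon per year
    2015: 10.0,                # assumed peak of ~10 billion tonnes
    2050: 10.0 * (1 - 0.56),   # 56% below the peak (assumed baseline)
    2100: 0.0,                 # emissions approach zero
}

def admissible_emissions(year):
    """Linearly interpolate admissible emissions (Gt C/yr) between anchors."""
    years = sorted(anchors)
    if year <= years[0]:
        return anchors[years[0]]
    if year >= years[-1]:
        return anchors[years[-1]]
    for y0, y1 in zip(years, years[1:]):
        if y0 <= year <= y1:
            f = (year - y0) / (y1 - y0)
            return anchors[y0] + f * (anchors[y1] - anchors[y0])

print(admissible_emissions(2015))  # 10.0
```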

The scientists used a new method with which they reconstructed historical emission pathways on the basis of already-calculated carbon dioxide concentrations. To do this, Erich Roeckner and his team adopted the methodology proposed by the Intergovernmental Panel on Climate Change (IPCC) for simulations being carried out for the forthcoming Fifth IPCC Assessment Report: earth system models that incorporate the carbon cycle were used to estimate the anthropogenic carbon dioxide emissions that are compatible with a prescribed concentration pathway. In this case, the emissions depend solely on the proportion of the anthropogenic carbon in the model that is absorbed by the land surface and the oceans. Repeating the experiments with different pre-industrial starting dates enabled the scientists to distinguish between anthropogenic climate change and internal climate variability.
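The core of this "compatible emissions" diagnostic can be sketched in a few lines: the emissions consistent with a prescribed concentration pathway equal the atmospheric increase plus whatever the land and ocean take up. The pathway and uptake numbers below are illustrative placeholders, not ENSEMBLES output:

```python
# Minimal sketch of the compatible-emissions diagnostic described above.
# Roughly 2.12 Gt of carbon raises atmospheric CO2 by 1 ppm.
PPM_TO_GTC = 2.12

def compatible_emissions(co2_ppm, land_uptake, ocean_uptake):
    """Return year-by-year emissions (Gt C/yr) compatible with a prescribed
    concentration pathway (ppm) and given land/ocean carbon uptake."""
    emissions = []
    for i in range(1, len(co2_ppm)):
        delta_atm = (co2_ppm[i] - co2_ppm[i - 1]) * PPM_TO_GTC
        emissions.append(delta_atm + land_uptake[i] + ocean_uptake[i])
    return emissions

# Placeholder pathway: concentration rising 2 ppm/yr with fixed uptake.
ppm = [388, 390, 392]
land = [0, 1.0, 1.0]
ocean = [0, 2.0, 2.0]
print(compatible_emissions(ppm, land, ocean))
```

The design choice mirrors the article: concentrations are prescribed, so emissions fall out as a residual of the carbon budget rather than being an input.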

The model used for this study is based on a coarse spatial grid, with a spacing of around 400 kilometres, and takes into account the atmosphere, the land surface, the ocean (including sea ice), and the marine and terrestrial carbon cycles.

The overall aim of the study is to simulate future changes in the climate and carbon dioxide emissions in a single scenario in which the carbon dioxide equivalent concentrations in the atmosphere are stabilised in the long term at 450 parts per million (ppm), so that global warming is limited to a maximum of two degrees above the pre-industrial level. The data are currently being evaluated by other European climate centres. "As soon as all of the results are available, we can evaluate the spread between the models," says Erich Roeckner. "The more solid the data we have, the more accurate our forecast will be."

(Photo: Max Planck Institute for Meteorology)

Max Planck Institute



Like an ice cube on a warm day, most materials melt — that is, change from a solid to a liquid state — as they get warmer. But a few oddball materials do the reverse: They melt as they get cooler. Now a team of researchers at MIT has found that silicon, the most widely used material for computer chips and solar cells, can exhibit this strange property of “retrograde melting” when it contains high concentrations of certain metals dissolved in it.

The material, a compound of silicon, copper, nickel and iron, “melts” (actually turning from a solid to a slush-like mix of solid and liquid material) as it cools below 900 degrees Celsius, whereas silicon ordinarily melts at 1414 degrees C. The much lower temperatures make it possible to observe the behavior of the material during melting, based on specialized X-ray fluorescence microprobe technology using a synchrotron — a type of particle accelerator — as a source.

The material and its properties are described in a paper just published online in the journal Advanced Materials. Team leader Tonio Buonassisi, the SMA Assistant Professor of Mechanical Engineering and Manufacturing, is the senior author, and the lead authors are Steve Hudelson MS ’09, and postdoctoral fellow Bonna Newman PhD ’08.

The findings could be useful in lowering the cost of manufacturing some silicon-based devices, especially those in which tiny amounts of impurities can significantly reduce performance. In the material that Buonassisi and his researchers studied, impurities tend to migrate to the liquid portion, leaving regions of purer silicon behind. This could make it possible to produce some silicon-based devices, such as solar cells, using a less pure, and therefore less expensive, grade of silicon that would be purified during the manufacturing process.

“If you can create little liquid droplets inside a block of silicon, they serve like little vacuum cleaners to suck up impurities,” Buonassisi says. This research could also lead to new methods for making arrays of silicon nanowires — tiny wires that are highly conductive to heat and electricity.

Buonassisi predicted in a 2007 paper that it should be possible to induce retrograde melting in silicon, but the conditions needed to produce such a state, and to study it at a microscopic level, are highly specialized and have only recently become available. To create the right conditions, Buonassisi and his team had to adapt a microscope “hot-stage” device that allowed the researchers to precisely control the rate of heating and cooling. And to actually observe what was happening as the material was heated and cooled, they drew upon high-power synchrotron-based X-ray sources at Lawrence Berkeley National Laboratory in California and at Argonne National Laboratory in Illinois (researchers from both national labs are co-authors of the paper).

The research was supported by the U.S. Department of Energy, the National Science Foundation, the Clare Booth Luce Foundation, Doug Spreng and the Chesonis Family Foundation, and some equipment was provided by McCrone Scientific.

The material for the tests consisted of a kind of sandwich made from two thin layers of silicon, with a filling of copper, nickel and iron between them. This was first heated enough to cause the metals to dissolve into the silicon, but below silicon’s melting point. The amount of metal was such that the silicon became supersaturated — that is, more of the metal was dissolved in the silicon than would normally be possible under stable conditions. For example, when a liquid is heated, it can dissolve more of another material, but then when cooled down it can become supersaturated, until the excess material precipitates out.
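The supersaturation logic above can be illustrated with a toy model: if the equilibrium solubility of metal in solid silicon falls as the material cools, a concentration that was stable at high temperature exceeds the limit at a lower temperature and must precipitate out (here, into the liquid phase that drives retrograde melting). The Arrhenius parameters below are arbitrary placeholders, not measured values for copper, nickel or iron in silicon:

```python
import math

# Toy Arrhenius model of metal solubility in solid silicon.
# All parameter values are hypothetical, for illustration only.
S0 = 1e22        # prefactor, atoms/cm^3 (placeholder)
E_A = 1.5        # activation energy in eV (placeholder)
K_B = 8.617e-5   # Boltzmann constant, eV/K

def solubility(temp_k):
    """Equilibrium metal solubility at a given temperature (toy model)."""
    return S0 * math.exp(-E_A / (K_B * temp_k))

def is_supersaturated(dissolved, temp_k):
    """True when more metal is dissolved than equilibrium allows,
    i.e. when precipitation must occur."""
    return dissolved > solubility(temp_k)

# Saturate the silicon at high temperature, then cool it: the same
# concentration now exceeds the (lower) solubility limit.
dissolved = solubility(1300)
print(is_supersaturated(dissolved, 1300))  # False
print(is_supersaturated(dissolved, 1100))  # True
```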

In this case, where the metals were dissolved into the solid silicon, “if you begin cooling it down, you hit a point where you induce precipitation, and it has no choice but to precipitate out in a liquid phase,” Buonassisi says. It is at that point that the material melts.

Matthias Heuer, a senior research scientist at Calisolar, a solar-energy startup company, says this work is “unique and new to our field,” and it “allows some very good insight into how transition metals and structural defects interact.” But he adds that there are a number of questions still to be answered in follow-up research: “Now that we know liquid inclusions can form, the question is, how efficient as sinks for impurities are they? How stable are they? Can they keep the impurities localized during other process steps — for example, during the final firing process of a solar cell?”

(Photo: Patrick Gillooly)

Massachusetts Institute of Technology



Resveratrol, a popular plant extract shown to prolong life in yeast and lower animals due to its anti-inflammatory and antioxidant properties, appears also to suppress inflammation in humans, based on results from the first prospective human trial of the extract conducted by University at Buffalo endocrinologists.

Results of the study appear as a rapid electronic publication on the Journal of Clinical Endocrinology & Metabolism website and will be published in an upcoming print issue of the journal.

The paper also has been selected for inclusion in Translational Research in Endocrinology & Metabolism, a new online anthology that highlights the latest clinical applications of cutting-edge research from the journals of the Endocrine Society.

Resveratrol is a compound produced naturally by several plants when under attack by pathogens such as bacteria or fungi, and is found in the skin of red grapes and in red wine. It also is sold as a nutritional supplement, produced by chemical synthesis or extracted primarily from Japanese knotweed.

Husam Ghanim, PhD, UB research assistant professor of medicine and first author on the study, notes that resveratrol has been shown to prolong life and to reduce the rate of aging in yeast, roundworms and fruit flies, actions thought to be mediated by increased expression of a particular gene associated with longevity.

The compound also is thought to play a role in insulin resistance, a condition related to oxidative stress that has a significant detrimental effect on overall health.

"Since there are no data demonstrating the effect of resveratrol on oxidative and inflammatory stress in humans," says Paresh Dandona, MD, PhD, UB distinguished professor of medicine and senior author on the study, "we decided to determine if the compound reduces the level of oxidative and inflammatory stress in humans.

"Several of the key mediators of insulin resistance also are pro-inflammatory, so we investigated the effect of resveratrol on their expression as well."

The study was conducted at Kaleida Health's Diabetes-Endocrinology Center of Western New York, which Dandona directs.

A nutritional supplement containing 40 milligrams of resveratrol was used as the active product. Twenty participants were randomized into two groups of 10: one group received the supplement, while the other received an identical pill containing no active ingredient. Participants took the pill once a day for six weeks. Fasting blood samples were collected at the start of the trial and at weeks one, three and six.
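The allocation scheme described above, twenty participants split evenly between active and placebo arms, can be sketched as follows. This is purely illustrative; the seed and participant IDs are placeholders, not details from the trial:

```python
import random

# Illustrative sketch of randomizing 20 participants into two equal arms.
random.seed(0)  # fixed seed so this sketch is reproducible
participants = list(range(1, 21))  # placeholder participant IDs
random.shuffle(participants)

resveratrol_arm = sorted(participants[:10])  # receive the 40 mg supplement
placebo_arm = sorted(participants[10:])      # receive the identical inert pill

print(resveratrol_arm)
print(placebo_arm)
```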

Results showed that resveratrol suppressed the generation of free radicals, or reactive oxygen species, unstable molecules known to cause oxidative stress and release proinflammatory factors into the blood stream, resulting in damage to the blood vessel lining.

Blood samples from persons taking resveratrol also showed suppression of the inflammatory protein tumor necrosis factor (TNF) and other similar compounds that increase inflammation in blood vessels and interfere with insulin action, causing insulin resistance and the risk of developing diabetes.

These inflammatory factors, in the long term, have an impact on the development of type 2 diabetes, aging, heart disease and stroke, noted Dandona.

Blood samples from the participants who received the placebo showed no change in these pro-inflammatory markers.

While these results are promising, Dandona added a caveat: The study didn't eliminate the possibility that something in the extract other than resveratrol was responsible for the anti-inflammatory effects.

"The product we used has only 20 percent resveratrol, so it is possible that something else in the preparation is responsible for the positive effects. These agents could be even more potent than resveratrol. Purer preparations now are available and we intend to test those."

(Photo: U. Buffalo)

University at Buffalo




Selected Science News. Copyright 2008 All Rights Reserved