Friday, July 3, 2009


With the caloric needs of the planet expected to soar by 50 percent in the next 40 years, planning and investment in global agriculture will become critically important, according to a new report released June 25.

The report, produced by Deutsche Bank, one of the world's leading global investment banks, in collaboration with the University of Wisconsin-Madison's Nelson Institute for Environmental Studies, provides a framework for investing in sustainable agriculture against a backdrop of massive population growth and escalating demands for food, fiber and fuel.

"We are at a crossroads in terms of our investments in agriculture and what we will need to do to feed the world population by 2050," says David Zaks, a co-author of the report and a researcher at the Nelson Institute's Center for Sustainability and the Global Environment.

By 2050, world population is expected to exceed 9 billion people, up from 6.5 billion today. Already, according to the report, a gap is emerging between agricultural production and demand, and the disconnect is expected to be amplified by climate change, increasing demand for biofuels, and a growing scarcity of water.

"There will come a point in time when we will have difficulties feeding world population," says Zaks, a graduate student whose research focuses on the patterns, trends and processes of global agriculture.

Although unchecked population growth will put severe strains on global agriculture, demand can be met by a combination of expanding agriculture onto now-marginal or unused land, substituting new types of crops, and adopting new technology, the report's authors conclude. "The solution is only going to come about by changing the way we use land, changing the things that we grow and changing the way that we grow them," Zaks explains.

The report notes that agricultural research and technological development in the United States and Europe have increased notably in the last decade, but those advances have not translated into increased production on a global scale. Subsistence farmers in developing nations, in particular, have benefited little from such developments, and investments in those agricultural sectors have been marginal at best.

The Deutsche Bank report, however, identifies a number of strategies to increase global agricultural production in sustainable ways, including:

• Improving irrigation, fertilization and agricultural equipment, using technologies ranging from geographic information systems and global analytical maps to the development of precision, high-performance equipment.
• Applying sophisticated management and technologies on a global scale, essentially extending research and investment into developing regions of the world.
• Investing in "farmer competence" so that farmers can take full advantage of new technologies, through education and extension services, including investing private capital in better farmer training.
• Intensifying yield using new technologies, including genetically modified crops.
• Increasing the amount of land under cultivation without expanding into forested lands, through multiple cropping, the improvement of degraded crop and pasture lands, and the conversion of productive pastures to biofuel production.

"First, we have to improve yield," notes Zaks. "Next, we have to bring more land into agriculture while considering the environmental implications, and then we have to look at technology."

Bruce Kahn, Deutsche Bank senior investment analyst, echoed Zaks' observations: "What is required to meet the challenge of feeding a growing population in a warming world is to boost yield through highly sophisticated land management with precision irrigation and fertilization methods," said Kahn, a graduate of the Nelson Institute. "Farmers, markets and governments will have to look at a host of options including increased irrigation, mechanization, fertilization and the potential benefits of biotech crops."

The Deutsche Bank report depended in part on an array of global agricultural analytical tools, maps, models and databases developed by researchers at UW-Madison's Center for Sustainability and the Global Environment. Those tools, including global maps of land supply for crops and pasture, were developed primarily for academic research, says Zaks. The Deutsche Bank report, he continues, is evidence that such tools will have increasing applications in plotting a course for sustainable global agriculture.

University of Wisconsin


Dinosaurs have long been considered the largest creatures ever to walk on land. While they still hold that status, a new study suggests that some dinosaurs may actually have weighed as little as half as much as previously thought.

In the study, published in the Journal of Zoology, Geoffrey Birchard, associate professor of environmental science and policy at George Mason University, was part of a team that uncovered a problem with the statistical model used by some scientists to estimate the mass of dinosaurs.

"The original equation used by scientists produces fairly accurate results when determining the mass of smaller animals, but when used on larger animals our research shows that many errors have occurred," says Birchard. "The new equation shows that dinosaurs are much smaller than we thought, but there is no mistaking that they were indeed huge animals."

The original equation, developed in 1985, has been used by scientists to estimate a variety of parameters, including brain size and egg size. The problem arises from transforming the data, which changes the properties of the original data and creates biases that can affect the predictions obtained from the equation.

Birchard and his colleagues realized there was an error when they used the equation to determine the weights of living animals such as the hippopotamus and the elephant, and discovered that the equation greatly overestimated them.

The researchers developed a new equation for calculating dinosaur mass based on bone dimensions. This equation doesn't require the transformation of data that the original equation uses.
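
The kind of bias described here is a well-known hazard of fitting scaling laws to log-transformed data: a line fitted in log space and naively back-transformed predicts the median rather than the mean, and the discrepancy grows with the scatter in the data. The sketch below illustrates the general effect on synthetic data; the constants, noise level and bone measure are invented for illustration and are not the 1985 equation.

```python
import math
import random

random.seed(0)

# Synthetic allometric data: mass = a * circumference**b with
# multiplicative (lognormal) noise. All constants are hypothetical.
a, b = 0.08, 2.7
circs = [random.uniform(5, 60) for _ in range(5000)]
masses = [a * c**b * math.exp(random.gauss(0, 0.4)) for c in circs]

# Ordinary least squares on the log-log data.
xs = [math.log(c) for c in circs]
ys = [math.log(m) for m in masses]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

def naive_predict(c):
    """Back-transform the fitted line without any bias correction."""
    return math.exp(intercept) * c**slope

# For lognormal noise, exp(mean of logs) is the *median*, not the mean:
# the naive back-transform systematically misses the true mean mass,
# and the gap scales with the residual variance.
big = 60.0
mean_true = a * big**b * math.exp(0.4**2 / 2)
print(naive_predict(big), mean_true)
```

The direction and size of the bias depend on the noise structure; the point is simply that transforming data and back-transforming predictions changes their statistical properties, which is the flaw the team identified in the original mass equation.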

"The best way to understand the new equation is to think about a building that is built on pillars," says Birchard. "The bigger the building, the larger the pillars must be to support the weight of the building. In the same way, the legs of an animal are the pillars supporting its body."

According to Birchard, this new research suggests that some dinosaurs were much more slender than had been thought. It also changes many of the estimates scientists have already made about dinosaurs, such as the amount of muscle required to move their bodies and how much they ate and breathed.

George Mason University



The "coming of age" of galaxies and black holes has been pinpointed, thanks to new data from NASA's Chandra X-ray Observatory and other telescopes. This discovery helps resolve the true nature of gigantic blobs of gas observed around very young galaxies.

About a decade ago, astronomers discovered immense reservoirs of hydrogen gas -- which they named "blobs" -- while conducting surveys of young distant galaxies. The blobs glow brightly in optical light, but the source of the immense energy required to power this glow, and the nature of these objects, were unclear.

A long observation from Chandra has identified the source of this energy for the first time. The X-ray data show that a significant source of power within these colossal structures is from growing supermassive black holes partially obscured by dense layers of dust and gas. The fireworks of star formation in galaxies are also seen to play an important role, thanks to Spitzer Space Telescope and ground-based observations.

"For ten years the secrets of the blobs had been buried from view, but now we've uncovered their power source," said James Geach of Durham University in the United Kingdom, who led the study. "Now we can settle some important arguments about what role they played in the original construction of galaxies and black holes."

Galaxies are believed to form when gas flows inward under the pull of gravity and cools by emitting radiation. This process should stop when the gas is heated by radiation and outflows from galaxies and their black holes. Blobs could be a sign of the first stage (gas cooling and flowing in) or of the second (gas being heated and driven out).

Based on the new data and theoretical arguments, Geach and his colleagues show that heating of gas by growing supermassive black holes and bursts of star formation, rather than cooling of gas, most likely powers the blobs. The implication is that blobs represent a stage when the galaxies and black holes are just starting to switch off their rapid growth because of these heating processes. This is a crucial stage of the evolution of galaxies and black holes - known as "feedback" - and one that astronomers have long been trying to understand.

"We're seeing signs that the galaxies and black holes inside these blobs are coming of age and are now pushing back on the infalling gas to prevent further growth," said coauthor Bret Lehmer, also of Durham. "Massive galaxies must go through a stage like this or they would form too many stars and so end up ridiculously large by the present day."

Chandra and a collection of other telescopes including Spitzer have observed 29 blobs in one large field in the sky dubbed "SSA22." These blobs, which are several hundred thousand light years across, are seen when the Universe is only about two billion years old, or roughly 15% of its current age.

In five of these blobs, the Chandra data revealed the telltale signature of growing supermassive black holes - a point-like source with luminous X-ray emission. These giant black holes are thought to reside at the centers of most galaxies today, including our own. Another three of the blobs in this field show possible evidence for such black holes. Based on further observations, including Spitzer data, the research team was able to determine that several of these galaxies are also dominated by remarkable levels of star formation.

The radiation and powerful outflows from these black holes and bursts of star formation are, according to calculations, powerful enough to light up the hydrogen gas in the blobs they inhabit. In the cases where the signatures of these black holes were not detected, the blobs are generally fainter. The authors show that black holes bright enough to power these blobs would be too dim to be detected given the length of the Chandra observations.

Besides explaining the power source of the blobs, these results help explain their future. Under the heating scenario, the gas in the blobs will not cool down to form stars but will add to the hot gas found between galaxies. SSA22 itself could evolve into a massive galaxy cluster.

"In the beginning the blobs would have fed their galaxies, but what we see now are more like leftovers," said Geach. "This means we'll have to look even further back in time to catch galaxies and black holes in the act of forming from blobs."

(Photo: Left panel: X-ray (NASA/CXC/Durham Univ./D.Alexander et al.); Optical (NASA/ESA/STScI/IoA/S.Chapman et al.); Lyman-alpha Optical (NAOJ/Subaru/Tohoku Univ./T.Hayashino et al.); Infrared (NASA/JPL-Caltech/Durham Univ./J.Geach et al.); Right, Illustration: NASA/CXC/M.Weiss)

Harvard-Smithsonian Center for Astrophysics



Gene regulatory networks in cell nuclei are similar to cloud computing networks, such as Google or Yahoo!, researchers report today in the online journal Molecular Systems Biology. The similarity is that each system keeps working despite the failure of individual components, whether they are master genes or computer processors.

This finding by an international team led by Carnegie Mellon University computational biologist Ziv Bar-Joseph helps explain not only the robustness of cells, but also some seemingly incongruent experimental results that have puzzled biologists.

“Similarities in the sequences of certain master genes allow them to back up each other to a degree we hadn’t appreciated,” said Bar-Joseph, an assistant professor of computer science and machine learning and a member of Carnegie Mellon’s Ray and Stephanie Lane Center for Computational Biology.

Between 5 and 10 percent of the genes in all living species are master genes that produce proteins called transcription factors that turn all other genes on or off. Many diseases are associated with mutations in one or several of these transcription factors. However, as the new study shows, if one of these genes is lost, other “parallel” master genes with similar sequences, called paralogs, often can replace it by turning on the same set of genes.

That would explain the curious results of some experiments in organisms ranging from yeast to humans, in which researchers have recently identified the genes controlled by several master genes. Researchers have been surprised to find that when they remove one master gene at a time, almost none of the genes controlled by that master gene are de-activated.

In the current work, the Carnegie Mellon researchers and their colleagues in Israel and Spain identified the most probable backup for each master gene. They found that removing the master genes that had very similar backups had almost no noticeable effect, but when they removed master genes with less similar backups, the effect was significant. Additional experiments showed that when both the master gene and its immediate backup were removed, the effects became very noticeable, even for those genes with a similar backup gene. In one example, when the gene Pdr1 was removed, researchers found almost no decrease in activation among the genes it controls; when Pdr1 and its paralog were removed, however, 19 percent of the genes Pdr1 controls failed to activate.
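
The backup behavior can be pictured with a toy model in which a target gene stays active as long as at least one member of a master-gene pair is present. The gene names below are hypothetical placeholders, not the actual regulatory network from the study:

```python
# Toy model of transcription-factor redundancy: each target gene is
# activated if at least one of its regulators (a master gene or its
# paralog) is still present. Names are hypothetical placeholders.
regulators = {
    "target1": {"masterA", "masterA_paralog"},
    "target2": {"masterA", "masterA_paralog"},
    "target3": {"masterB"},  # a master gene with no close backup
}

def active_targets(present):
    """Targets whose regulator set still overlaps the present genes."""
    return {t for t, regs in regulators.items() if regs & present}

all_masters = {"masterA", "masterA_paralog", "masterB"}

# Removing masterA alone: its paralog keeps target1 and target2 active.
print(active_targets(all_masters - {"masterA"}))
# Removing masterA AND its paralog: the backed-up targets go dark.
print(active_targets(all_masters - {"masterA", "masterA_paralog"}))
```

Removing one master gene alone leaves all targets active, mirroring the near-null single-knockout results; removing the gene together with its paralog deactivates the backed-up targets, as in the Pdr1 experiment described above.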

“It’s extremely rare in nature that a cell would lose both a master gene and its backup, so for the most part cells are very robust machines,” said Anthony Gitter, a graduate student in Carnegie Mellon’s Computer Science Department and lead author of the article. “We now have reason to think of cells as robust computational devices, employing redundancy in the same way that enables large computing systems, such as Amazon, to keep operating despite the fact that servers routinely fail.”

(Photo: Carnegie Mellon)

Carnegie Mellon



The ability to learn and to establish new memories is essential to our daily existence and identity, enabling us to navigate through the world. A new study by researchers at the Montreal Neurological Institute and Hospital (The Neuro), McGill University and the University of California, Los Angeles has captured an image for the first time of a mechanism, specifically protein translation, that underlies long-term memory formation.

The finding provides the first visual evidence that when a new memory is formed, new proteins are made locally at the synapse -- the connection between nerve cells -- increasing the strength of the synaptic connection and reinforcing the memory. The study, published in Science, is important for understanding how memory traces are created, and the ability to monitor the process in real time will allow a detailed understanding of how memories are formed.

When considering what might be going on in the brain at a molecular level, two essential properties of memory need to be taken into account. First, because a lot of information needs to be maintained over a long time, there has to be some degree of stability. Second, to allow for learning and adaptation, the system also needs to be highly flexible.

For this reason, research has focused on synapses, which are the main site of information exchange and storage in the brain. They form a vast but also constantly fluctuating network of connections whose ability to change and adapt, called synaptic plasticity, may be the fundamental basis of learning and memory.

"But, if this network is constantly changing, the question is how do memories stay put, how are they formed? It has been known for some time that an important step in long-term memory formation is "translation," or the production of new proteins locally at the synapse, strengthening the synaptic connection in the reinforcement of a memory, which until now has never been imaged," says Dr. Wayne Sossin, neuroscientist at The Neuro and co-investigator in the study. "Using a translational reporter, a fluorescent protein that can be easily detected and tracked, we directly visualized the increased local translation, or protein synthesis, during memory formation. Importantly, this translation was synapse-specific and it required activation of the post-synaptic cell, showing that this step required cooperation between the pre- and post-synaptic compartments, the parts of the two neurons that meet at the synapse. Thus, highly regulated local translation occurs at synapses during long-term plasticity and requires trans-synaptic signals."

Long-term memory and synaptic plasticity require changes in gene expression and yet can occur in a synapse-specific manner. This study provides evidence that a mechanism that mediates this gene expression during neuronal plasticity involves regulated translation of localized mRNA at stimulated synapses. These findings are instrumental in establishing the molecular processes involved in long-term memory formation and provide insight into diseases involving memory impairment.

(Photo: Science)

McGill University



From balloons to rubber bands, things always break faster when stretched. Or do they? University of Illinois scientists studying chemical bonds now have shown this isn't always the case, and their results may have profound implications for the stability of proteins to mechanical stress and the design of new high-tech polymers.

"Our findings contradict the intuitive notion that molecules are like rubber bands in that when we pull on a chemical bond, it should always break faster," said chemistry professor Roman Boulatov, who led the study. "When we stretch a sulfur-sulfur bond, for example, how fast it breaks depends on how the nearby atoms move."

The findings also contradict the conventional interpretation of experimental results obtained by other researchers studying the fragmentation rate of certain proteins containing sulfur-sulfur bonds when stretched with a microscopic force probe. In those experiments, as the force increased, the proteins fragmented faster, leading the researchers to conclude that as the sulfur-sulfur bond was stretched, it reacted faster and broke faster.

"Our experiments suggest a different conclusion," Boulatov said. "We believe the acceleration of the fragmentation was caused by a change in the protein's structure as it was stretched, and had little or nothing to do with increased reactivity of a stretched sulfur-sulfur bond."

In their experiments, the researchers used stiff stilbene as a molecular force probe to generate well-defined forces on molecules atom by atom.

The probe allows reaction rates to be measured as a function of the restoring force. Similar to the force that develops when a rubber band is stretched, the molecular restoring force contains information about how much the molecule was distorted, and in what direction.

In previous work, when Boulatov's team pulled on carbon-carbon bonds with the same force they would later apply to sulfur-sulfur bonds, they found the carbon-carbon bonds broke a million times faster than when no force was applied.

"Because the sulfur-sulfur bond is much weaker than the carbon-carbon bond, you might think it would be much more sensitive to being pulled on," Boulatov said. "We found, however, that the sulfur-sulfur bond does not break any faster when pulled."
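
The intuition the Illinois team set out to test is often formalized with the standard Bell model, in which an applied force F lowers the dissociation barrier by F·Δx, so the reaction rate grows as exp(F·Δx/kBT). The sketch below uses that textbook single-coordinate picture with illustrative numbers (not values from this paper); the new results show precisely where such a picture can fail when surrounding atoms constrain the molecule's relaxation.

```python
import math

kB_T = 4.114  # thermal energy at 298 K, in pN*nm

def bell_speedup(force_pN, dx_nm):
    """Rate enhancement k(F)/k(0) under the Bell model:
    k(F) = k0 * exp(F * dx / (kB*T))."""
    return math.exp(force_pN * dx_nm / kB_T)

def force_for_speedup(speedup, dx_nm):
    """Pulling force needed for a given rate enhancement
    (the Bell model inverted)."""
    return kB_T * math.log(speedup) / dx_nm

# Illustrative numbers only: with a 0.05 nm distance to the
# transition state, a millionfold acceleration of bond scission
# calls for a pull of roughly 1.1 nN.
f = force_for_speedup(1e6, dx_nm=0.05)
print(f)
```

The carbon-carbon result (a millionfold speed-up under force) is what this simple model would lead one to expect; the sulfur-sulfur result is the surprise, because the model ignores how the rest of the molecule redistributes the strain.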

Boulatov and his team report their findings in a paper accepted for publication in Angewandte Chemie, and posted on the journal's Web site.

"When we pulled on the sulfur-sulfur bond, the nearby methylene groups prevented the rest of the molecule from relaxing," Boulatov said, "thus eliminating the driving force for the sulfur-sulfur bond to break any faster."

Chemists must bear in mind that even in simple chemical reactions, such as a single bond dissociation, "we must take into account other structural changes in the molecule," Boulatov said. "The elongation alone, which occurs when a bond is stretched, does not represent the full picture of what happens when the reaction occurs."

The good news, Boulatov said, is that not every polymer that is stretched will break faster. "We might be able to design polymers, for example, that would resist fragmentation under modest mechanical stresses," he said, "or will not break along the stretched direction, but in some other desired direction."

(Photo: L. Brian Stauffer)

University of Illinois



The saying “Sleep on it” is turning out to be true. Researchers at the Laboratoire de physiologie de la perception et de l'action (LPPA, a CNRS/Collège de France research unit) and at the University of Amsterdam have shown that the brain replays the day's events while asleep. They have discovered that the information consolidated during sleep leads to the best decision-making while awake.

These results are published online on the website of the journal Nature Neuroscience.

While asleep, the neurons in our brain are constantly active at levels comparable to those observed while awake. This activity is of vital importance: during sleep, our brain unconsciously rearranges memories in order to stabilize them and ensure long-term memory storage.

Researchers at the LPPA (CNRS/Collège de France) and the University of Amsterdam taught rats to find rewards in a maze and then studied the brains of the animals while they were asleep. The scientific team observed that the activity patterns of neurons during sleep resembled those of the previous awake periods, while the rats were learning the task. But not all parts of the experience are replayed in the same manner in the brain. The neuron assemblies that were active at the moment of making a decision, and especially those active from the moment of understanding and learning the task, were preferentially reactivated during sleep.

This discovery suggests that not all memorized information is consolidated in the same way and that only the most relevant information for behavior is stabilized. This work represents a significant advance in our understanding of memory in the brain and the manner in which it encodes information. This approach is needed to improve our knowledge in cases of age-related neuronal degeneration such as Alzheimer's disease, in which memory is considerably disrupted.

(Photo: © Peyrache et al)

The Centre National de la Recherche Scientifique



Columbia University researchers have shown for the first time that two brain systems are primarily responsible for allowing humans to accurately predict the emotions of others. Psychology professors Kevin Ochsner and Niall Bolger, graduate student Jamil Zaki and research assistant Jochen Weber used functional MRI scans to zero in on the parts of the brain that people use when correctly discerning how others are feeling.

“Prior work has only shown us what goes on in the brain when you’re reacting to or thinking about another person’s emotions,” said Ochsner. “Until now, we haven’t known whether and how these parts of the brain really make you accurate.”

The researchers videotaped 11 volunteers discussing emotional events in their lives, such as the birth of a child or the loss of a parent or grandparent. The volunteers then watched their videotapes and rated, moment-to-moment, how positively or negatively they had felt while talking.

Later, a new group of 16 volunteers, dubbed “perceivers,” watched each video. They rated the emotions experienced by each speaker while lying in a functional MRI scanner, which measures blood flow in the brain. Researchers then compared the two sets of ratings to judge the perceivers’ “empathic accuracy.”

The researchers found a correlation between the perceivers’ level of accuracy and their reliance on two kinds of brain systems: regions of the parietal and premotor cortex that help people understand the simple intentions behind simple gestures, and the medial prefrontal cortex, responsible for interpreting the meaning of those gestures and putting them into context.

Interestingly, in cases where perceivers were inaccurate, they engaged a third region: one of those responsible for controlling and responding to one’s own emotions.

“It may be the case that when you’re focusing in on your personal experience while watching someone else, you may be missing the cues that they’re giving off,” said Zaki, adding that this finding would need further investigation.

The paper’s authors now plan to apply their findings to the study of autism and, more broadly, to the understanding of social dysfunction. Their methodology may help researchers predict which autism patients will fare better or worse in social settings and track patients’ progress through treatment.

“There’s the potential of using this to study social function in everyday life,” said Ochsner. “This paradigm could help us figure out why some people are good at interacting with others and have lots of strong, healthy relationships and why others don’t.”

(Photo: Kevin Ochsner and Niall Bolger)

Columbia University in the City of New York



MIT civil engineers have for the first time identified what causes the most frequently used building material on Earth -- concrete -- to gradually deform, decreasing its durability and shortening the lifespan of infrastructure such as bridges and nuclear waste containment vessels.

In a paper published in the Proceedings of the National Academy of Sciences (PNAS) online Early Edition the week of June 15, researchers say that concrete creep (the technical term for the time-dependent deformation that occurs in concrete when it is subjected to load) is caused by the rearrangement of particles at the nano-scale.

"Finally, we can explain how creep occurs," said Professor Franz-Josef Ulm, co-author of the PNAS paper. "We can't prevent creep from happening, but if we slow the rate at which it occurs, this will increase concrete's durability and prolong the life of the structures. Our research lays the foundation for rethinking concrete engineering from a nanoscopic perspective."

This research comes at a time when the American Society of Civil Engineers has assigned an aggregate grade of D to U.S. infrastructure, much of which is made of concrete. It likely will lead to concrete infrastructure capable of lasting hundreds of years rather than tens, which will bring enormous cost-savings and decreased concrete-related CO2 emissions. An estimated 5 to 8 percent of all human-generated atmospheric CO2 worldwide comes from the concrete industry.

Ulm, who has spent nearly two decades studying the mechanical behavior of concrete and its primary component, cement paste, has in the past several years focused on its nano-structure. This led to his publication of a paper in 2007 that said the basic building block of cement paste at the nano-scale -- calcium-silicate-hydrates, or C-S-H -- is granular in nature. The paper explained that C-S-H naturally self-assembles into two structurally distinct but chemically similar phases when mixed with water, each with a fixed packing density close to one of the two maximum densities allowed by nature for spherical objects (64 percent for the lower-density phase and 74 percent for the higher).
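
The two limits quoted here are standard sphere-packing results: about 64 percent is the empirical density of random close packing, while 74 percent is the densest packing possible for equal spheres (face-centered cubic or hexagonal close packing, with fraction π/√18, the Kepler bound). A quick check of the latter:

```python
import math

# Densest possible packing of equal spheres: face-centered cubic or
# hexagonal close packing, with fraction pi/sqrt(18) (the Kepler bound).
fcc_fraction = math.pi / math.sqrt(18)

# Random close packing has no closed form; ~0.64 is the accepted
# empirical value for jammed, equal-sized spheres.
rcp_fraction = 0.64

print(f"{fcc_fraction:.4f}")  # 0.7405
```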

In the new research revealed in the PNAS paper, Ulm and co-author Matthieu Vandamme explain that concrete creep comes about when these nanometer-sized C-S-H particles rearrange into altered densities: some looser and others more tightly packed.

They also explain that a third, denser phase of C-S-H can be induced by carefully manipulating the cement mix with other minerals such as silica fume, a byproduct of silicon and ferrosilicon alloy production. The reacting silica forms additional smaller particles that fit into the spaces between the nano-granules of C-S-H, spaces that were formerly filled with water. This has the effect of increasing the density of C-S-H to up to 87 percent, which in turn greatly hinders the movement of the C-S-H granules over time.

"There is a search by industry to find an optimal method for creating such ultra-high-density materials based on packing considerations in confined spaces, a method that is also environmentally sustainable," said Ulm. "The addition of silica fumes is one known method in use for changing the density of concrete; we now know from the nanoscale packing why the addition of fumes reduces the creep of concrete. From a nanoscale perspective, other means now exist to achieve such highly packed, slow-creeping materials."

"The insight gained into the nanostructure puts concrete on equal footing with high-tech materials, whose microstructure can be nanoengineered to meet specific performance criteria: strength, durability and a reduced environmental footprint," said Vandamme, who earned a PhD from MIT's Department of Civil and Environmental Engineering in 2008 and is now on the faculty of the Ecole des Ponts ParisTech, Université Paris-Est.

In their PNAS paper, the researchers show experimentally that the rate of creep is logarithmic, which means slowing creep increases durability exponentially. They demonstrate mathematically that the creep rate can be slowed by a factor of 2.6. That would have a truly remarkable effect on durability: a containment vessel for nuclear waste built to last 100 years with today's concrete could last up to 16,000 years if made with an ultra-high-density (UHD) concrete.
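
Why a logarithmic creep law makes a modest slowdown pay off exponentially can be seen with a toy calculation. The functional form, threshold and rate constants below are assumptions for illustration only, not the constitutive law or values from the PNAS paper:

```python
import math

def creep(t, rate, tau=1.0):
    """Toy logarithmic creep law: deformation = rate * log(1 + t/tau).
    The form and all constants are hypothetical, for illustration."""
    return rate * math.log(1.0 + t / tau)

def lifetime(threshold, rate, tau=1.0):
    """Time for creep to reach the allowed threshold (inverse of creep)."""
    return tau * (math.exp(threshold / rate) - 1.0)

threshold = 1.0                              # allowed deformation (arbitrary units)
t_std = lifetime(threshold, rate=0.2)        # baseline creep rate
t_uhd = lifetime(threshold, rate=0.2 / 2.6)  # creep rate slowed by 2.6x

# Lifetime is exponential in 1/rate, so dividing the rate by a factor k
# raises the (tau-shifted) lifetime to the power k:
#   t_uhd + tau == (t_std + tau) ** 2.6
print(t_std, t_uhd)
```

Because the time to reach a fixed creep threshold is exponential in the inverse of the creep rate, dividing the rate by a constant factor raises the lifetime to that power rather than merely multiplying it, which is the qualitative origin of the dramatic durability gains described above.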

Ulm stressed that UHD concrete could alter structural designs, as well as have enormous environmental implications, because concrete is the most widely produced man-made material on Earth: 20 billion tons per year worldwide, with a 5 percent annual increase. More durable concrete means that less building material and less frequent renovations will be required.

"The thinner the structure, the more sensitive it is to creep, so up until now, we have been unable to build large-scale lightweight, durable concrete structures," said Ulm. "With this new understanding of concrete, we could produce filigree: light, elegant, strong structures that will require far less material."

Ulm and Vandamme achieved their research findings using a nano-indentation device, which allows them to poke and prod the C-S-H (or to use the terminology of civil engineering, apply load) and measure in minutes creep properties that are usually measured in year-long creep experiments at the macroscopic scale.

(Photo: L. Barry Hetherington)

Massachusetts Institute of Technology


Although there is no doubt that hypnosis can impact the mind and behavior, the underlying brain mechanisms are not well understood. Now, new research provides fascinating insight into the specific neural effect of the power of suggestion. The study, published by Cell Press in the June 25 issue of the journal Neuron, uncovers the influence of hypnotic paralysis on brain networks involved in internal representations and self imagery.

Previous research has revealed suggestion-induced changes in brain activity underlying memory, pain perception, and voluntary movement and led to the suggestion that the effects of hypnosis may involve engagement of brain processes that mediate executive control and attention. However, none of these studies directly tested whether an inhibition or disconnection of executive control systems actually caused the observed changes in neural activity.

A group of researchers from the Neuroscience Center and Medical School at the University of Geneva designed an experiment to assess motor and inhibitory brain circuits during hypnosis-induced paralysis. "We used functional magnetic resonance imaging to directly test whether a hypnotic suggestion of paralysis activates specific inhibitory processes and whether these may or may not correspond to those responsible for inhibition in nonhypnotic conditions," explains lead study author Dr. Yann Cojan.

Specifically, subjects performed a task where they prepared to make a hand movement in response to a cue and then, depending on the signal, did or did not execute the prepared movement. Some subjects were hypnotized with the suggestion that their left hand was paralyzed while others were instructed to simulate left hand paralysis. Dr. Cojan and colleagues found that hypnosis produced distributed changes in prefrontal and parietal areas involved in attention along with striking modifications in the functional connectivity of the motor cortex with other brain areas.

Importantly, despite the suggestion of paralysis, the motor cortex was normally activated during the preparation phase of the task. This suggests that hypnosis did not suppress activity in motor pathways or eliminate representation of motor intentions. Hypnosis was also associated with an enhanced activation of the precuneus, a brain region involved in memory and self imagery, and with a reconfiguration of executive control mediated by the frontal lobes.

The researchers conclude that hypnosis induces a disconnection of motor commands from normal voluntary processes under the influence of brain circuits involved in executive control and self imagery. "These results suggest that hypnosis may enhance self-monitoring processes to allow internal representations generated by the suggestion to guide behavior but does not act through direct motor inhibition," says Dr. Cojan. "These findings make an important new step towards establishing neurobiological foundations for the striking impact of hypnosis on the brain and behavior."

Cell Press



Anthropologists consider food storage to be a vital component in the economic and social package that comprises the Neolithic period, contributing to plant domestication, increasingly sedentary lifestyles and new social organizations. It has traditionally been assumed that people only started to store significant amounts of food when plants were domesticated.

However, in a paper appearing in the June 23 edition of the Proceedings of the National Academy of Sciences, University of Notre Dame anthropologist Ian Kuijt and Bill Finlayson, director of the Council for British Research in the Levant, describe recent excavations at Dhra' near the Dead Sea in Jordan that provide evidence of granaries preceding the emergence of fully domesticated plants and large-scale sedentary communities by at least 1,000 years.

"These granaries reflect new forms of risk reduction, intensification and low-level food production," Kuijt said. "People in the Pre-Pottery Neolithic A period (11,500 to 10,550 years ago) were not using new food sources, but rather, by developing new storage methods, they altered their relationship with traditionally utilized food resources and created the technological context for the later development of domesticated plants and an agro-pastoralist economy.

"Building granaries may, at the same time, have been the single most important feature in increasing sedentism that required active community participation in new life-ways."

Designed with suspended floors for air circulation and protection from rodents, the granaries are located between residential structures that contain plant-processing installations.

The new studies are a continuation of earlier research by Kuijt. As a graduate student from 1987-1995, he worked on and directed several field projects in Jordan that focused on the world's first villages during the Neolithic Period. As part of this research, he did several days of excavation at Dhra' with a Jordanian researcher. This was followed by several other field projects and by research from 2000 to 2005 with Finlayson.

"These granaries are a critical first step, if not the very evolutionary and technological foundation, for the development of the large agricultural villages that appear by 9,500 to 9,000 years ago across the Near East," Kuijt said. "In many ways food storage is the missing link that helps us understand how so many people were able to live together. And much to our surprise, it appears that they developed this technology at least 1,000 years before anyone thought they did."

(Photo: U. Notre Dame)

University of Notre Dame



Astronomy & Astrophysics is publishing the first detection of a magnetic field on the star Vega, one of the brightest stars in the sky. Using the high-sensitivity NARVAL spectropolarimeter installed at the Bernard-Lyot telescope (Pic du Midi Observatory, France), a team of astronomers detected the effect of a magnetic field (known as the Zeeman effect) in the light emitted by Vega.

Vega is a famous star among amateur and professional astronomers. Located only 25 light-years from Earth in the constellation Lyra, it is the fifth brightest star in the sky and has long been used as a reference star for brightness comparisons. Vega is twice as massive as the Sun and only about one tenth its age. Because it is both bright and nearby, Vega has been studied often, yet it still reveals new aspects when observed with more powerful instruments. Vega rotates in less than a day, while the Sun's rotation period is 27 days. The intense centrifugal force induced by this rapid rotation flattens its poles and generates temperature differences of more than 1,000 degrees Celsius between the (warmer) polar regions and the equatorial regions of its surface. Vega is also surrounded by a disk of dust whose inhomogeneities suggest the presence of planets.

This time, astronomers analyzed the polarization of the light emitted by Vega and detected a weak magnetic field at its surface. In itself this is not a big surprise: the motion of charged particles inside stars can generate magnetic fields, and this is how the solar and terrestrial magnetic fields are produced. However, for stars more massive than the Sun, such as Vega, theoretical models cannot predict the intensity or the structure of the magnetic field, so astronomers had no clue how strong a signal they were looking for. After many unsuccessful attempts in past decades, both the high sensitivity of NARVAL and the full dedication of an observing campaign to Vega have made this first detection possible.

Vega's magnetic field has a strength of about 50 microtesla, close to that of the mean fields of the Earth and the Sun. This first observational constraint opens the way to in-depth theoretical studies of the origin of magnetic fields in massive stars. The detection also suggests that many stars like Vega, farther away and more difficult to observe, may harbor magnetic fields that simply have not been detected yet. Astronomers believe this discovery will be a key step in understanding stellar magnetic fields and their influence on stellar evolution. As for Vega, it is now the prototype of a new class of magnetic stars and will certainly continue to fascinate astronomers for years.
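A back-of-envelope estimate shows why such a field is so hard to detect and why a dedicated spectropolarimetric campaign was needed. The classical Zeeman formula gives the wavelength splitting produced by a field B; the sketch below assumes a visible line at 500 nm and a Landé factor g of 1 for illustration (these are not values from the study).

```python
# Classical Zeeman wavelength splitting between sigma components:
#   delta_lambda = e * g * lambda^2 * B / (4 * pi * m_e * c)
import math

# Physical constants (SI)
E_CHARGE = 1.602176634e-19   # elementary charge, C
M_E = 9.1093837015e-31       # electron mass, kg
C_LIGHT = 2.99792458e8       # speed of light, m/s

def zeeman_shift(wavelength_m: float, b_tesla: float, lande_g: float = 1.0) -> float:
    """Wavelength shift (in meters) induced by a magnetic field of b_tesla."""
    return E_CHARGE * lande_g * wavelength_m ** 2 * b_tesla / (4 * math.pi * M_E * C_LIGHT)

# A ~50 microtesla field acting on a visible spectral line at 500 nm:
shift = zeeman_shift(500e-9, 50e-6)
print(f"{shift:.2e} m")  # ~5.8e-16 m: tens of thousands of times smaller than a
                         # typical thermally broadened stellar line width
```

A splitting that small is invisible in an ordinary spectrum, which is why the detection relied on measuring the polarization signature of the Zeeman effect across many spectral lines rather than the splitting of any single line.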

(Photo: © Pascal Petit)

Astronomy and Astrophysics




Selected Science News. Copyright 2008 All Rights Reserved.