Tuesday, July 21, 2009



CSIRO astronomers have revealed the hidden face of an enormous galaxy called Centaurus A, which emits a radio glow covering an area 200 times bigger than the full Moon.

The galaxy’s radio waves have been painstakingly transformed into a highly detailed image, which is being unveiled to the public for the first time.

Centaurus A lies 14 million light-years away, in the southern constellation Centaurus, and houses a monster black hole 50 million times the mass of the Sun.

The galaxy’s black hole generates jets of radio-emitting particles that billow millions of light-years out into space.

The spectacular sight is invisible to the naked eye.

“If your eyes could see radio waves you would look up in the sky and see the radio glow from this galaxy covering an area 200 times bigger than the full Moon,” said the lead scientist for the project, Dr Ilana Feain of CSIRO’s Australia Telescope National Facility (ATNF).

“Only a small percentage of galaxies are of this kind. They’re like the blue whales of space – huge and rare.”

Seen at radio wavelengths, Centaurus A is so big and bright that no one else has ever tried making such an image.

“This is the most detailed radio image ever made of Centaurus A, and indeed of any galaxy that produces radio jets,” said Dr Lewis Ball, Acting Director of the ATNF.

“Few other groups in the world have the skills and the facilities to make such an image, and we were the first to try.”

Dr Feain and her team used CSIRO’s Australia Telescope Compact Array telescope near Narrabri, NSW, to observe the galaxy for more than 1200 hours, over several years.

This produced 406 individual images, which were ‘mosaiced’ together to make one large image.
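
The combining step can be illustrated with a toy sketch of linear mosaicking: overlapping pointings are merged as a per-pixel weighted average, with weights standing in for the telescope's primary-beam response. This is only a schematic of the general idea, not CSIRO's actual pipeline, and all names and numbers below are invented.

```python
# Hypothetical sketch only: a linear mosaic combines overlapping
# pointings as a per-pixel weighted average. Real radio mosaicking
# also handles calibration, deconvolution and noise weighting.

def mosaic(images, weights):
    """Combine same-sized images as a per-pixel weighted average."""
    height, width = len(images[0]), len(images[0][0])
    out = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            wsum = sum(w[y][x] for w in weights)
            if wsum > 0:
                out[y][x] = sum(im[y][x] * w[y][x]
                                for im, w in zip(images, weights)) / wsum
    return out

# Two 1x3 "pointings" that overlap in the middle pixel
a, wa = [[1.0, 2.0, 0.0]], [[1.0, 0.5, 0.0]]
b, wb = [[0.0, 2.0, 3.0]], [[0.0, 0.5, 1.0]]
print(mosaic([a, b], [wa, wb]))  # → [[1.0, 2.0, 3.0]]
```

Where pointings overlap, the weighted average suppresses the noisy edges of each individual image.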

Dr Feain then combined the Compact Array data with data from CSIRO’s Parkes radio telescope.

Processing the image – combining the data, taking out the effects of radio interference, and adjusting the dynamic range – took a further 10,000 hours.

Astronomers will use the image to help them understand how black holes and radio jets interact with a galaxy’s stars and dust, and how the galaxy has evolved over time.

Centaurus A is the closest of the galaxies with a supermassive black hole producing radio jets, which makes it the easiest to study.

Astronomers are interested in studying more of these rare, massive galaxies to determine the role black holes play in galaxy formation and growth.

Dr Feain said the sample of galaxies we have today is just the tip of the iceberg, because current telescopes don’t combine the sensitivity needed to detect these sources and the ability to survey large areas of sky.

Enter ASKAP, the Australian SKA Pathfinder, a new telescope being developed in Western Australia by CSIRO and its partners.

ASKAP will be a survey telescope, designed for projects such as hunting for galaxies like Centaurus A in the distant universe. It is a precursor facility for the planned Square Kilometre Array (SKA), the world’s largest radio telescope.

“ASKAP will be incredibly fast,” said Professor Brian Boyle, CSIRO SKA Director. “Gathering the Centaurus A data with the Compact Array took 1200 hours. With ASKAP, it would take five minutes.”

ASKAP is on schedule for completion in 2012. In its first six hours of operation it will generate more information than all previous radio telescopes combined.

Centaurus A was one of the first cosmic radio sources known outside our own Galaxy and it has a special connection with Australia.

The (visible) galaxy was discovered and recorded at Parramatta Observatory near Sydney in 1826. It was later catalogued under the name NGC 5128.

As a radio source, Centaurus A was discovered from Dover Heights in Sydney by CSIRO scientists in 1947.

(Photo: Ilana Feain, Tim Cornwell & Ron Ekers (CSIRO/ATNF), R. Morganti (ASTRON), N. Junkes (MPIfR), Shaun Amy, CSIRO)



An assistive technology that enables individuals to maneuver a powered wheelchair or control a mouse cursor using simple tongue movements can be operated by individuals with high-level spinal cord injuries, according to the results of a recently completed clinical trial.

"This clinical trial has validated that the Tongue Drive system is intuitive and quite simple for individuals with high-level spinal cord injuries to use,” said Maysam Ghovanloo, an assistant professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology. “Trial participants were able to easily remember and correctly issue tongue commands to play computer games and drive a powered wheelchair around an obstacle course with very little prior training.”

At the annual conference of the Rehabilitation Engineering and Assistive Technology Society of North America (RESNA) on June 26, the researchers reported the results of the first five clinical trial subjects to use the Tongue Drive system. The trial was conducted at the Shepherd Center, an Atlanta-based catastrophic care hospital, and funded by the National Science Foundation and the Christopher and Dana Reeve Foundation.

The clinical trial tested the ability of individuals with tetraplegia resulting from high-level spinal cord injuries (cervical vertebrae C3-C5) to perform tasks related to computer access and wheelchair navigation—using only their tongue movements.

At the beginning of each trial, Ghovanloo and graduate students Xueliang Huo and Chih-wen Cheng attached a small magnet—the size of a grain of rice—to the participant’s tongue with tissue adhesive. Movement of this magnetic tracer was detected by an array of magnetic field sensors mounted on wireless headphones worn by the subject. The sensor output signals were wirelessly transmitted to a portable computer, which was carried on the wheelchair.

The signals were processed to determine the relative motion of the magnet with respect to the array of sensors in real-time. This information was then used to control the movements of the cursor on a computer screen or to substitute for the joystick function in a powered wheelchair. Details on use of the Tongue Drive for wheeled mobility were published in the June 2009 issue of the journal IEEE Transactions on Biomedical Engineering.
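
As a rough illustration of how per-user calibration followed by real-time classification could work, the sketch below matches a new sensor reading against stored command templates by nearest centroid. This is a hypothetical simplification, not the Tongue Drive's actual signal-processing algorithm, and the sensor values are invented.

```python
import math

# Hypothetical illustration of command recognition: average each
# user's calibration samples into one template per command, then
# assign a new reading to the nearest template.

def centroid(vectors):
    """Average a list of equal-length sensor readings."""
    return [sum(axis) / len(vectors) for axis in zip(*vectors)]

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def train(calibration):
    """Build one template per command from calibration samples."""
    return {cmd: centroid(samples) for cmd, samples in calibration.items()}

def classify(reading, templates):
    """Pick the command whose template lies closest to the reading."""
    return min(templates, key=lambda cmd: distance(reading, templates[cmd]))

# Invented calibration data: three magnetic sensors, two commands
calibration = {
    "left":  [[1.0, 0.2, 0.1], [0.9, 0.3, 0.0]],
    "right": [[0.1, 0.2, 1.0], [0.0, 0.3, 0.9]],
}
templates = train(calibration)
print(classify([0.95, 0.25, 0.05], templates))  # → left
```

Tailoring the templates to each user mirrors the trial's approach of letting each participant choose tongue movements suited to their own anatomy and preferences.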

Ghovanloo chose the tongue to operate the system because unlike hands and feet, which are controlled by the brain through the spinal cord, the tongue is directly connected to the brain by a cranial nerve that generally escapes damage in severe spinal cord injuries or neuromuscular diseases.

Before using the Tongue Drive system, the subjects trained the computer to understand how they would like to move their tongues to indicate different commands. A unique set of specific tongue movements was tailored for each individual based on the user’s abilities, oral anatomy and personal preferences. For the first computer test, the user issued commands to move the computer mouse left and right. Using these commands, each subject played a computer game that required moving a paddle horizontally to prevent a ball from hitting the bottom of the screen.

After adding two more commands to their repertoire—up and down—the subjects were asked to move the mouse cursor through an on-screen maze as quickly and accurately as possible.

Then the researchers added two more commands—single and double mouse clicks—to provide the subject with complete mouse functionality. When a randomly selected symbol representing one of the six commands appeared on the computer screen, the subject was instructed to issue that command within a specified time period. Each subject completed 40 trials for each time period.

After the computer sessions, the subjects were ready for the wheelchair driving exercise. Using forward, backward, right, left and stop/neutral tongue commands, the subjects maneuvered a powered wheelchair through an obstacle course.

The obstacle course contained 10 turns and was longer than a professional basketball court. Throughout the course, the users had to perform navigation tasks such as making a U-turn, backing up and fine-tuning the direction of the wheelchair in a limited space. Subjects were asked to navigate through the course as fast as they could, while avoiding collisions.

Each subject operated the powered wheelchair using two different control strategies: discrete mode, which was designed for novice users, and continuous mode for more experienced users. In discrete mode, if the user issued the command to move forward and then wanted to turn right, the user would have to stop the wheelchair before issuing the command to turn right. The stop command was selected automatically when the tongue returned to its resting position, bringing the wheelchair to a standstill.

"Discrete mode is a safety feature particularly for novice users, but it reduces the agility of the wheelchair movement,” explained Ghovanloo. “In continuous mode, however, the user is allowed to steer the powered wheelchair to the left or right as it is moving forward and backward, thus making it possible to follow a curve.”

Each subject completed the course at least twice using each strategy while the researchers recorded the navigation time and number of collisions. Using discrete control, the average speed for the five subjects was 5.2 meters per minute and the average number of collisions was 1.8. Using continuous control, the average speed was 7.7 meters per minute and the average number of collisions was 2.5.

While this initial performance trial only required six tongue commands, the Tongue Drive system can potentially capture a large number of tongue movements, each of which can represent a different user command. The ability to train the system with as many commands as an individual can comfortably remember and having all of the commands available to the user at the same time are significant advantages over the common sip-n-puff device that acts as a simple switch controlled by sucking or blowing through a straw.

Some sip-n-puff users also consider the straw to be a symbol of their disability. Since Tongue Drive users simply wear headphones that are commonly worn to listen to music, the system is more acceptable to potential users.

John Anschutz, manager of the assistive technology program at the Shepherd Center, identified advantages the Tongue Drive system has over the tongue-touch keypad.

"The Tongue Drive system seems to be much more supportable if there were a failure of some component within the system. With the old tongue-touch keypad, if the system went down then the user lost all of the functions of the wheelchair, phone, computer and environmental control,” explained Anschutz. “Ghovanloo’s approach should be much more repairable should a fault arise, which is critical for systems on which so much function depends.”

A future upgrade will move the sensors inside the user’s mouth, according to Ghovanloo. This will be an important step for severely impaired users who cannot reposition the system themselves for best results, according to Anschutz.

"All of the subjects successfully completed the computer and powered wheelchair navigation tasks with their tongues without difficulty, which demonstrates that the Tongue Drive system can potentially provide individuals unable to move their arms and hands with effective control over a wide variety of devices they use in their daily lives,” said Ghovanloo.

(Photo: Georgia Tech/Gary Meek)

The Georgia Institute of Technology



Milder winters are causing Scotland's wild breed of Soay sheep to get smaller, despite the evolutionary benefits of possessing a large body, according to new research published in Science Express.

The new study provides evidence for climate change as the cause of the mysterious decrease in the size of wild sheep on the Scottish island of Hirta, first reported by scientists in 2007. The researchers believe that, due to climate change, survival conditions on Hirta are becoming less challenging, which means slower-growing, smaller sheep are more likely to survive the winters than they once were. This, together with the newly discovered 'young mum effect', whereby young ewes produce smaller offspring, explains why the average size of sheep on the island is decreasing.

Classical evolutionary theory suggests that over time the average size of wild sheep increases, because larger animals tend to be more likely to survive and reproduce than smaller ones, and offspring tend to resemble their parents. However, among the Soay sheep of Hirta, a remote Scottish island in the St Kilda archipelago, average body size has decreased by approximately 5 per cent over the last 24 years.

The research team analysed body size and life history data, which records the timing of key milestones throughout an individual sheep's life, for Soays on Hirta over this 24-year period. They found that sheep on the island are not growing as quickly as they once did, and that smaller sheep are more likely to survive into adulthood. This is bringing down the average size of sheep in the population overall.

Lead author Professor Tim Coulson, from Imperial College London's Department of Life Sciences, suggests that this is because shorter, milder winters, caused by global climate change, mean that lambs do not need to put on as much weight in the first months of life to survive to their first birthday as they did when winters were colder.

He explains: "In the past, only the big, healthy sheep and large lambs that had piled on weight in their first summer could survive the harsh winters on Hirta. But now, due to climate change, grass for food is available for more months of the year, and survival conditions are not so challenging – even the slower growing sheep have a chance of making it, and this means smaller individuals are becoming increasingly prevalent in the population."

Their results suggest that the decrease in average body size seen in Hirta's sheep is primarily an ecological response to environmental changes over the last 25 years; evolutionary change has contributed relatively little.

In addition, the research team discovered that the age at which a female sheep gives birth affects the size of her offspring. They realised that young Soay ewes are physically unable to produce offspring that are as big as they themselves were at birth. This 'young mum' effect had not been incorporated into previous analyses of natural selection, which explains in part why the sheep of Hirta are defying biologists' expectations.

"The young mum effect explains why Soay sheep have not been getting bigger, as we expected them to," concludes Professor Coulson, "But it is not enough to explain why they're shrinking. We believe that this is down to climate change. These two factors are combining to override what we would expect through natural selection."

(Photo: Imperial College London)



Researchers at the University of Illinois report that a toxic molecule known to damage cells and cause disease may also play a pivotal role in bird migration. The molecule, superoxide, is proposed as a key player in the mysterious process that allows birds to “see” Earth’s magnetic field.

Changes in the magnetic field, such as those experienced by a bird changing direction in flight, appear to alter a biochemical compass in the eye, allowing the bird to see how its direction corresponds to north or south. The discovery, reported this month in Biophysical Journal, occurred as a result of a “mistake” made by a collaborator, said principal investigator Klaus Schulten, who holds the Swanlund Chair in Physics at Illinois and directs the theoretical and computational biophysics group at the Beckman Institute for Advanced Science and Technology. His postdoctoral collaborator, Ilia Solov’yov, of the Frankfurt Institute for Advanced Studies, did not know that superoxide was toxic, seeing it instead as an ideal reaction partner in a biochemical process involving the protein cryptochrome in a bird’s eye.

Cryptochrome is a blue-light photoreceptor found in plants and in the eyes of birds and other animals. Schulten was the first to propose (in 2000) that this protein was a key component of birds’ geomagnetic sense, a proposal that was later corroborated by experimental evidence. He made this prediction after he and his colleagues discovered that magnetic fields can influence chemical reactions if the reactions occur quickly enough to be governed by pure quantum mechanics.

“Prior to our work, it was thought that this was impossible because magnetic fields interact so weakly with molecules,” he said. Such chemical reactions involve electron transfers, Schulten said, “which result in freely tumbling spins of electrons. These spins behave like an axial compass.”

“Other researchers had found that cryptochrome, acting through its own molecular spins, recruits a reaction partner that operates at so-called zero spin. They suggested that molecular oxygen is that partner,” Schulten said. “We propose that the reaction partner is not the benign oxygen molecule that we all breathe, but its close cousin, superoxide, a negatively charged oxygen molecule.”

When Solov’yov showed that superoxide would work well as a reaction partner, Schulten was at first dismissive.

“But then I realized that the toxicity of superoxide was actually crucial to its role,” he said. The body has many mechanisms for reducing concentrations of superoxide to prevent its damaging effects, Schulten said. For the compass, though, this scavenging is an advantage, since the molecule must be present at low – but not too low – concentrations “to make the biochemical compass work effectively,” he said.

Although known primarily as an agent of aging and cellular damage, superoxide recently has been recognized for its role in cellular signaling.

However, its toxicity may also explain why humans do not have the same ability to see Earth’s magnetic field, Schulten said.

“Our bodies try to play it safe,” he said. “It might be that human evolution chose longevity over orientational ability.”

(Photo: L. Brian Stauffer)

University of Illinois



The earliest stars in the universe formed not only as individuals, but sometimes also as twins, according to a paper published in Science Express. By creating robust simulations of the early universe, astrophysicists Matthew Turk and Tom Abel of the Kavli Institute for Particle Astrophysics and Cosmology, located at the Department of Energy’s SLAC National Accelerator Laboratory, and Brian O'Shea of Michigan State University have gained the most detailed understanding to date of the formation of the first stars.

"We used to think that these stars formed by themselves, but now we see from our computer simulations that sometimes they have siblings," said Turk. "These stars provide the seeds of next generation star formation, so by understanding them we can better understand how other stars and galaxies formed."

To make this discovery, the researchers created an extremely detailed computer simulation of early star formation. Into this virtual universe they sprinkled primordial gas and dark matter as it existed soon after the Big Bang, data they obtained from observations of the cosmic microwave background. This mostly uniform radiation—a faint glow of radio waves spread across the entire sky—contains subtle variations that reflect the beginning of all structure in the universe.

Turk, Abel and O'Shea ran five data-intensive simulations, each of which covered a 400 quadrillion cubic mile volume of the universe and took about three weeks to run on 64 processors. The simulations focused on the first Population III stars: massive, hot stars thought to have formed a mere several hundred million years after the Big Bang.

As the researchers watched their simulated universe evolve, waves of gas and dark matter swirled through the hot, dense universe. As the universe cooled, gravity began to draw the matter together into clumps. In areas rich with matter, stars began to form. And, in one out of the researchers' five simulations, a single cloud of dust and dark matter formed into "twin" stars: one with a mass equivalent to about 10 suns, and one with a mass equivalent to about 6.3 suns. Both of them were still growing at the end of the calculation and will likely grow to many times that mass.

"We ran five of these calculations starting from the beginning of the universe, and to our surprise one of them was special," said Abel. "This opens a whole new realm of research possibilities. These stars could evolve into two black holes, which could have created gravitational waves we could detect with an instrument like the Laser Interferometer Gravitational Wave Observatory and, if they fall into bigger black holes, for the Laser Interferometer Space Antenna. Or one of the stars could evolve into a black hole that could create gamma-ray bursts that we could detect with the Swift mission and the Fermi Gamma-ray Space Telescope."

Turk added: "This will help us fine-tune our models for how structure in the universe formed and evolved. Understanding the very early stars helps us understand what we see today. It even helps explain how and when some of the atoms now on earth and in our bodies were first formed."

(Photo: Ralf Kaehler, Matthew Turk and Tom Abel)

SLAC National Accelerator Laboratory
Stanford University



Through a recent modeling experiment, a team of NASA-funded researchers has found that future concentrations of carbon dioxide and ozone in the atmosphere, and of nitrogen in the soil, are likely to have an important but overlooked effect on the cycling of water from sky to land to waterways.

The researchers concluded that models of climate change may be underestimating how much water is likely to run off the land and back into the sea as atmospheric chemistry changes. Runoff may be as much as 17 percent higher in forests of the eastern United States when models account for changes in soil nitrogen levels and atmospheric ozone exposure.

"Failure to consider the effects of nitrogen limitation and ozone on photosynthesis can lead us to underestimate regional runoff," said Benjamin Felzer, an ecosystem modeler at Lehigh University in Bethlehem, Pa. "More runoff could mean more contamination and flooding of our waterways. It could also mean fewer droughts than predicted for some areas and more water available for human consumption and farming. Either way, water resource managers need more accurate runoff estimates to plan better for the changes."

Felzer and colleagues from the Massachusetts Institute of Technology (MIT) in Cambridge and the Marine Biology Laboratory in Woods Hole, Mass., published their findings recently in the Journal of Geophysical Research – Biogeosciences.

Plants play a significant role in Earth's water cycle, regulating the amount of water cycling through land ecosystems and how long it stays there. Plants draw in water from the atmosphere and soil, and they discharge it naturally through transpiration, the tail end of photosynthesis when water vapor and oxygen are released into the air.

The amount of water that plants give up depends on how much carbon dioxide is present in the atmosphere. Studies have shown that despite a global drop in rainfall over land in the past 50 years, runoff has actually increased.

Other studies have shown that increasing CO2 is changing how plant "pores," or stomata, discharge water. With elevated CO2 levels, leaf pores contract and sometimes close to conserve internal water reserves. This "stomatal conductance" response increases water use efficiency and reduces the rate of transpiration.

Plants that release less water also take up less of it from the environment. With less water being taken up by plants, more water is available for groundwater or runs off the land surface into lakes, streams, and rivers. Along the way, it accumulates excess nutrients and pollutants before emptying into waterways, where it affects the health of fish, algae, and shellfish and contaminates drinking water and beaches. Excess runoff can also contribute to flooding.

Sometimes rising CO2 has the opposite effect, Felzer noted, promoting vegetation growth by increasing the rate of photosynthesis. More plant growth can lead to a thicker canopy of leaves with increased transpiration and less runoff. However, this effect has been shown to be smaller than the effect of reduced stomatal conductance.

Aware of these cycles, Felzer and colleagues used theoretical models to project various future scenarios for the amount of carbon dioxide in the atmosphere and what it would mean to the changing water cycle in forests east of the Mississippi River. They found that runoff would increase anywhere from 3 to 6 percent depending on location and the amount of the increase in CO2.

Felzer and colleagues also examined the role of two other variables -- atmospheric ozone and soil-based nitrogen -- in the changing water cycle. Excess ground-level ozone harms the cells responsible for photosynthesis. Reductions in photosynthesis lead to less transpiration and cycling of water through leaves, and more water added to runoff.

In most boreal and temperate forests, the rate of photosynthesis is also limited by the availability of nutrients such as nitrogen in the soil. The less nitrogen in the soil, the slower the rates of photosynthesis and transpiration.

"The increase in runoff is even larger when nitrogen is limited and environments are exposed to high ozone levels," said Felzer. In fact, the team found an additional 7 to 10 percent rise in runoff when nitrogen was limited and ozone exposure increased.

"Though this study focuses on Eastern U.S. forests, we know nitrogen and ozone effects are also important in South America and Europe. One region has seen a net increase and the other a net runoff reduction," said co-author Adam Schlosser of the Center for Global Change Science at MIT. "Our environment and quality of life depend on less uncertainty on this front."

(Photo: NASA)




Imagine a soldier's uniform made of a special fabric that allows him to look in all directions and identify threats to his side or even behind him. In work that could turn such science fiction into reality, MIT researchers have developed light-detecting fibers that, when woven into a web, act as a flexible camera. Fabric composed of these fibers could be connected to a computer that provides information on a small display screen attached to a visor, giving the soldier greater awareness of his surroundings.

The researchers, led by Associate Professor Yoel Fink of the Department of Materials Science and Engineering (DMSE), emphasize that while such an application and others like it are still only dreams, work is rapidly progressing on developing fabrics capable of capturing images. In a recent issue of the journal Nano Letters, the team reported what it called a "significant" advance: using such a fiber web to take a rudimentary picture of a smiley face.

"This is the first time that anybody has demonstrated that a single plane of fibers, or 'fabric,' can collect images just like a camera but without a lens," said Fink, corresponding author of the Nano Letters paper. "This work constitutes a new approach to vision and imaging."

Our eyes are a great example of Nature's approach to imaging: they involve a highly sophisticated and localized organ made in part of a delicate lens. Technologists have mimicked this approach in cameras, telescopes and even microscopes.

But lenses, whether natural or man-made, have a limited field of view and are susceptible to damage, which can destroy the imaging or seeing capacity altogether. Optical fiber webs, in contrast, distribute the imaging capability across the entire surface of a fabric, an arrangement that is in principle much more robust to damage and "blindness." If one area is damaged, the remaining fibers can still function and extract the image.

"We are saying, 'instead of a tiny, sensitive object [for capturing images], let's construct a large, distributed system,'" said Fink, who is also affiliated with MIT's Research Laboratory of Electronics (RLE), the Center for Materials Science and Engineering (CMSE) and Institute for Soldier Nanotechnologies (ISN).

The new fibers, less than a millimeter in diameter, are composed of layers of light-detecting materials nested one within another.

Those layers include two rings of a semiconductor material that are light sensitive, each ring only 100 billionths of a meter across. Four metal electrodes contact each of the rings, extending along the length of the fiber, for a total of eight. Each semiconductor ring with its attached electrodes is in turn encased in rings of a polymer insulator that separate it from its neighbor.

The team starts with a macroscopic cylinder, or preform, of these elements. The preform is placed in a special furnace that melts the components so they can be carefully drawn into minuscule fibers that retain the original orientation of the various layers. The process can produce many meters of fiber.

Fink's team demonstrated the power of their approach by placing an object - a smiley face - between a light source and a small swatch of fabric composed of the fibers that was in turn connected to an external amplifying electrical circuit and computer.

The individual fibers measure the intensity of the light illuminating them and convert it to an electrical signal. Importantly, they are also designed to differentiate between light at different wavelengths or colors. A mesh of fibers is then deployed to measure light intensity distribution at different wavelengths across a large area.

In the current work, the smiley face was illuminated with light at two separate wavelengths. This generated a distinct pattern on the fabric mesh that was then fed into a computer. From there, an algorithm described earlier by the Fink team in Nature Materials assimilates the data to create a black-and-white image of the object on a computer screen.
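
As a loose analogy for lensless imaging with a fiber mesh, the sketch below "measures" only the total light along each row and column of a scene and back-projects the two profiles into an image estimate. This is a hypothetical toy, not the reconstruction algorithm the Fink team published, and the scene values are invented.

```python
# Hypothetical toy: pretend each horizontal fiber reports the total
# light along its row and each vertical fiber along its column, then
# back-project the two profiles (an outer product) into a coarse
# image estimate.

def backproject(row_sums, col_sums):
    """Estimate an image from its row and column light totals."""
    total = sum(row_sums)
    if total == 0:
        return [[0.0] * len(col_sums) for _ in row_sums]
    return [[r * c / total for c in col_sums] for r in row_sums]

# Toy scene: a bright 2x2 patch in a 4x4 field
scene = [
    [0, 0, 0, 0],
    [0, 9, 9, 0],
    [0, 9, 9, 0],
    [0, 0, 0, 0],
]
rows = [sum(row) for row in scene]        # light per horizontal fiber
cols = [sum(col) for col in zip(*scene)]  # light per vertical fiber
estimate = backproject(rows, cols)
print(estimate[1][1], estimate[0][0])  # → 9.0 0.0
```

Simple scenes reconstruct exactly this way; more complex ones need richer measurements, which is one motivation for illuminating at multiple wavelengths.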

"This paper furthers our vision of designing fiber materials and fabrics with ever-increasing sophistication and complexity," Fink said. He and colleagues note that additional optoelectronic layers in the fibers will lead to crisper images that could even be displayed in color.

"While the current version of these fabrics can only image nearby objects, it can still see much farther than most shirts can," he said.

(Photo: Fink Lab, MIT)



Freshwater fish was a staple diet for humans as early as 40,000 years ago, according to a new study led by UBC anthropologist Michael Richards.

Richards and a team of researchers from China and the U.S. analyzed stable isotope ratios found in the collagen of human and animal bones discovered in 2001 at the Tianyuan Cave near Beijing, China. Their study, to be published in this week’s Proceedings of the National Academy of Sciences, presents the first direct evidence of fish being a substantial part of early human diet.

The 34 human bones – likely from a single individual, whom scientists call Tianyuan 1 – discovered in the Tianyuan Cave are among the oldest modern human remains found in Eurasia. Bones of deer, monkey, porcupine and wild cat were also discovered at the site.

Stable isotopes such as those of carbon, nitrogen and sulfur are often used by scientists to reconstruct a food web and determine the long-term source of dietary proteins as they are deposited in the bone collagen over time.

“We found similar carbon isotope values in the remains of the Tianyuan 1 human and those of a wild cat, indicating a similar diet of land animals and plants as a source of protein,” says Richards, who led the study while a researcher at the Max Planck Institute in Germany.

“But significantly higher values of nitrogen isotopes in the human suggest an additional source of protein – probably from freshwater fish since aquatic animals generally have higher nitrogen isotope values due to their longer food chains.”
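The nitrogen-isotope reasoning Richards describes can be sketched numerically. Everything below is a hedged illustration: the enrichment factor is a commonly cited average, and the delta-15N values are hypothetical numbers chosen only to show the logic, not measurements from the study.

```python
# Illustrative sketch of trophic-level reasoning from nitrogen isotopes.
# Each step up a food chain enriches nitrogen-15 by roughly 3-5 per mil,
# so a consumer's delta-15N sits a few per mil above its protein sources.

TROPHIC_ENRICHMENT = 3.4  # per mil per trophic step (commonly cited average)

def estimated_trophic_steps(consumer_d15n, baseline_d15n):
    """Trophic steps separating a consumer from a dietary baseline."""
    return (consumer_d15n - baseline_d15n) / TROPHIC_ENRICHMENT

# Hypothetical delta-15N values (per mil), chosen only to illustrate:
herbivore_d15n = 5.0   # e.g. deer eating plants
wild_cat_d15n = 8.5    # terrestrial carnivore, ~1 step above the herbivore
human_d15n = 11.0      # noticeably above even the terrestrial carnivore

# A diet of land animals alone would leave the human near the wild cat's
# value; the extra enrichment points to longer (aquatic) food chains.
excess = human_d15n - wild_cat_d15n
print(f"{excess:.1f} per mil above a purely terrestrial carnivore")  # 2.5
```

The logic, not the particular numbers, is the point: nitrogen values higher than any plausible terrestrial diet can explain are the signature of aquatic protein.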

Since no freshwater fish bones were found in Tianyuan Cave, the researchers studied fish remains discovered at the nearby Donghulin site and found sulfur isotope ratios consistent with those in the human bones.

“The combination of these findings provides direct evidence that early modern humans in China consumed freshwater fish regularly,” says Richards.

Anthropologists have previously found indirect evidence of humans eating fish in South Africa as early as 50,000–60,000 years ago. This new study presents the first direct evidence of fish being an important part of the early modern human diet, and may help scientists understand how early modern humans adapted to their environments.

“Since this timeframe predates consistent evidence of effective fishing gear, we think the shift to more fish in the diet may reflect greater pressure from an expanding population and the resulting difficulty in accessing small land animals.”



Tiny flying machines can be used for everything from indoor surveillance to exploring collapsed buildings, but simply making smaller versions of planes and helicopters doesn't work very well. Instead, researchers at North Carolina State University are mimicking nature's small flyers – and developing robotic bats that offer increased maneuverability and performance.

Small flyers, or micro-aerial vehicles (MAVs), have garnered a great deal of interest due to their potential applications where maneuverability in tight spaces is necessary, says researcher Gheorghe Bunget. For example, Bunget says, "due to the availability of small sensors, MAVs can be used for detection missions of biological, chemical and nuclear agents." But, due to their size, devices using a traditional fixed-wing or rotary-wing design have low maneuverability and aerodynamic efficiency.

So Bunget, a doctoral student in mechanical engineering at NC State, and his advisor Dr. Stefan Seelecke looked to nature. "We are trying to mimic nature as closely as possible," Seelecke says, "because it is very efficient. And, at the MAV scale, nature tells us that flapping flight – like that of the bat – is the most effective."

The researchers did extensive analysis of bats' skeletal and muscular systems before developing a "robo-bat" skeleton using rapid prototyping technologies. The fully assembled skeleton rests easily in the palm of your hand and, at less than 6 grams, feels as light as a feather. The researchers are currently completing fabrication and assembly of the joints, muscular system and wing membrane for the robo-bat, which should allow it to fly with the same efficient flapping motion used by real bats.

"The key concept here is the use of smart materials," Seelecke says. "We are using a shape-memory metal alloy that is super-elastic for the joints. The material provides a full range of motion, but will always return to its original position – a function performed by many tiny bones, cartilage and tendons in real bats."

Seelecke explains that the research team is also using smart materials for the muscular system. "We're using an alloy that responds to the heat from an electric current. That heat actuates micro-scale wires the size of a human hair, making them contract like 'metal muscles.' During the contraction, the powerful muscle wires also change their electric resistance, which can be easily measured, thus providing simultaneous action and sensory input. This dual functionality will help cut down on the robo-bat's weight, and allow the robot to respond quickly to changing conditions – such as a gust of wind – as perfectly as a real bat."
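The "simultaneous action and sensory input" Seelecke describes – the wire's resistance changing as it contracts – can be sketched as a feedback calculation. The linear resistance model and all of the numbers below are illustrative assumptions, not measured properties of the NC State robo-bat.

```python
# Sketch of the sensing half of a shape-memory-alloy muscle wire:
# resistance changes during contraction, so measuring resistance gives
# position feedback with no separate sensor.  All values here are
# illustrative assumptions, not data from the NC State robo-bat.

R_EXTENDED = 10.0    # ohms, wire fully extended (cool, martensite phase)
R_CONTRACTED = 8.0   # ohms, wire fully contracted (heated, austenite phase)
MAX_STRAIN = 0.04    # ~4% recoverable strain, typical for NiTi wires

def strain_from_resistance(r_measured):
    """Estimate contraction strain from measured wire resistance
    (simple linear interpolation between the two phase endpoints)."""
    fraction = (R_EXTENDED - r_measured) / (R_EXTENDED - R_CONTRACTED)
    fraction = min(1.0, max(0.0, fraction))  # clamp to the physical range
    return fraction * MAX_STRAIN

# Halfway between the resistance endpoints -> half the maximum strain:
print(strain_from_resistance(9.0))  # 0.02
```

A controller could run this estimate in a loop: drive current to heat and contract the wire, read back resistance to infer how far it has actually moved, and correct for disturbances such as a gust of wind.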

In addition to creating a surveillance tool with very real practical applications, Seelecke says the robo-bat could also help expand our understanding of aerodynamics. "It will allow us to do tests where we can control all of the variables – and finally give us the opportunity to fully understand the aerodynamics of flapping flight," Seelecke says.

North Carolina State University




Selected Science News. Copyright 2008 All Rights Reserved