Tuesday, November 9, 2010


Scientists have discovered that bees learn to fly the shortest possible route between flowers even if they discover the flowers in a different order. Bees are effectively solving the 'Travelling Salesman problem', and these are the first animals found to do this. This research could inspire improvements to networks such as traffic on the roads, information flow on the web and business supply chains. By understanding how bees can solve their problem with such a tiny brain we can improve our management of these everyday networks without needing lots of computer time.

This work brings together researchers from both life and physical sciences to study how nature's computers - brains - process complex tasks. Studying an insect with a tiny brain and comparatively great cognitive abilities gives researchers insight into the minimum circuitry required for solving difficult mathematical problems.

The teams from Royal Holloway, University of London and Queen Mary, University of London are funded by BBSRC, the Wellcome Trust and the Engineering and Physical Sciences Research Council following a call for multidisciplinary research proposals aimed at taking forward research in cognitive systems.

The research, due to be published in 'The American Naturalist', also gives an insight into bumblebee behaviour. Bumblebees play a vital role in pollinating certain food crops and so an understanding of how they forage is important for future food security.

The Travelling Salesman must find the shortest route that allows him to visit every location exactly once.

Dr Nigel Raine, from the School of Biological Sciences at Royal Holloway explains: "Foraging bees solve travelling salesman problems every day. They visit flowers at multiple locations and, because bees use lots of energy to fly, they find a route which keeps flying to a minimum."

Computers solve this problem by comparing the length of all possible routes and choosing the shortest. However, bees solve it without computer assistance using a relatively tiny number of brain cells.
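
The exhaustive approach computers take can be sketched in a few lines of Python; the flower coordinates below are invented purely for illustration.

```python
from itertools import permutations
from math import dist

# Hypothetical flower positions (x, y) in metres; the nest is start and end.
nest = (0.0, 0.0)
flowers = [(3.0, 4.0), (6.0, 1.0), (2.0, 7.0), (8.0, 5.0)]

def route_length(order):
    """Total distance of the round trip nest -> flowers in 'order' -> nest."""
    stops = [nest, *order, nest]
    return sum(dist(a, b) for a, b in zip(stops, stops[1:]))

# Exhaustive comparison: check every possible visiting order and keep the
# shortest. This is O(n!) work, which is why the problem rapidly becomes
# intractable as the number of flowers grows.
best = min(permutations(flowers), key=route_length)
```

With four flowers there are only 24 orderings to compare, but a bee linking hundreds of flowers faces an astronomically larger search space, which is what makes its behaviour so striking.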

Professor Lars Chittka from Queen Mary's School of Biological and Chemical Sciences said: "In nature, bees have to link hundreds of flowers in a way that minimises travel distance, and then reliably find their way home - not a trivial feat if you have a brain the size of a pinhead! Indeed, such travelling salesman problems keep supercomputers busy for days. Studying how bee brains solve such challenging tasks might allow us to identify the minimal neural circuitry required for complex problem solving."

Co-author, Dr Mathieu Lihoreau adds: "There is a common perception that smaller brains constrain animals to be simple reflex machines. But our work with bees shows advanced cognitive capacities with very limited neuron numbers. There is an urgent need to understand the neuronal hardware underpinning animal intelligence, and relatively simple nervous systems such as those of insects make this mystery more tractable."

The team used computer controlled artificial flowers to test whether bees would follow a route defined by the order in which they discovered the flowers or if they would find the shortest route. After exploring the location of the flowers, bees quickly learned to fly the shortest route.

Dr Raine adds: "Despite their tiny brains bees are capable of extraordinary feats of behaviour. We need to understand how they can solve the Travelling Salesman Problem without a computer. What short-cuts do they use?"

Biotechnology and Biological Sciences Research Council



Researchers at the University of Bristol reveal in the journal Nature that they have developed a seismological ‘speed gun’ for the inside of the Earth.

Using this technique they will be able to measure the way the Earth's deep interior slowly moves around. This mantle motion is what controls the location of our continents and oceans, and where the tectonic plates collide to shake the surface we live on.

For 2,900 km (1,800 miles) beneath our feet, the Earth is made of the rocky mantle. Although solid, it is so hot that it can flow like putty over millions of years. It is heated from below, so that it circulates like water on a stove. While geophysicists know something about how the material moves by the time it reaches the top of the mantle, what goes on at the bottom is still a puzzle. However, researchers need to know both to predict how the Earth's surface—our home—will behave.

Andy Nowacki, at the School of Earth Sciences at Bristol University, explained: “The only way to measure the inside of the Earth at such huge depths is with seismic waves. When a large earthquake occurs and waves travel through the Earth, they are affected in different ways, and we can examine their properties to work out what is happening thousands of miles beneath our feet, a region where we can never go. This study focusses on a mysterious layer where the mantle meets the core, a sphere of iron at the centre of the Earth 7,000 km (4,400 miles) across. This part just above the core has curious properties which we can measure using seismic waves that pass through it.”

This enigmatic part of the Earth is known as D″ (pronounced ‘dee-double-prime’). Dr James Wookey said: “We believe that D″ is made from crystals which line up in a certain orientation when the mantle flows. We can measure how they line up, and in this study we do this for one part of the world – North and Central America. In the future our method can then be used to see which direction the mantle is moving everywhere.”

Professor Mike Kendall added: “This part of the Earth is incredibly important. The lowermost mantle is where two colossal, churning engines—the mantle and the core—meet and interact. The core is moving very quickly and creates our magnetic field which protects us from the Sun’s rays. The mantle above is sluggish, but drives the motion of the plates on the Earth’s surface, which build mountains, feed volcanoes and cause earthquakes. Measuring the flow in the lowermost mantle is vital to understanding the long term evolution of the Earth.”

(Photo: Bristol U.)

University of Bristol



Cornell archaeologists are helping to rewrite the early prehistory of human civilization on Cyprus, with evidence that hunter-gatherers began to form agricultural settlements on the island half a millennium earlier than previously believed.

Beginning with pedestrian surveys of promising sites in 2005, students have assisted with fieldwork on Cyprus led by professor of classics Sturt Manning, director of Cornell's archaeology program. The project, Elaborating the Early Neolithic on Cyprus (EENC), has involved undergraduate and graduate students from Cornell, the University of Toronto and the University of Cyprus.

Their findings were published recently in the leading archaeological journal Antiquity, after being reported to Cyprus' Department of Antiquities and presented at an annual archaeological conference there.

"Up until two decades ago, nobody thought anybody had gone to Cyprus before about 8,000 years ago, and the island was treated as irrelevant to the development of the Neolithic in the Near East," Manning said. "Then Alan Simmons (now at the University of Nevada, Las Vegas) discovered a couple of sites that seemed to suggest Epipaleolithic peoples went there maybe about 12,000 or 13,000 years ago, much earlier than anyone had thought possible. The big question started to become in the field, well, what happened in between?"

Subsequent finds pushed the Neolithic evidence on Cyprus back to around 10,000 years ago, but "no one has been able to fill in a 2,000-year gap between this possible first evidence of humans ever going near the island and apparent evidence of proper settlement and farming and agriculture," Manning said.

Based on their survey work since 2006, Manning and colleagues focused efforts on a potentially very early Neolithic site in central Cyprus at Ayia Varvara Asprokremnos (AVA).

"We found this site by doing the opposite of the normal strategy -- people had been looking around the coast," Manning said. "The coast around 11,000 years ago basically is now 50 to a couple hundred meters offshore from the present coastline, because sea level has risen. We [said we] should go inland, and look at the type of place that a hunter-gatherer on the island might try to be a hunter-gatherer or an incipient agriculturalist."

The AVA site "had early Holocene soils, was near the key resources for a human population about 11,000 years ago, and [our surveys] produced lots of evidence of stone tool production," he said. "It was right in the bend of the only permanent river in this whole area of Cyprus, so it seemed to be a perfect strategic spot for an early hunter-gatherer."

There was chert nearby to make stone tools, and hand auger tests found intact soil samples and a single small lithic flake "we thought to be of the right technology to be very early in date," Manning said.

During seasons of fieldwork in 2007, 2008 and 2009, the team excavated several hundred square meters of the site, and intensively surveyed the surrounding area. Six different charcoal samples from the excavations were carbon-dated and securely estimated to be from the Pre-Pottery Neolithic A period, the initial phase of the Near Eastern Neolithic -- "the very origins of the agricultural revolution," Manning said.
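
The calendar ages reported below come from calibrated radiocarbon dating; the uncalibrated step, converting the fraction of carbon-14 surviving in a sample into a conventional radiocarbon age, is a one-line formula built on the Libby mean life. The measured fraction in this sketch is illustrative only, not a value from the study.

```python
from math import log

LIBBY_MEAN_LIFE = 8033  # years; conventional ages use the Libby half-life of 5,568 years

def radiocarbon_age(fraction_modern):
    """Conventional radiocarbon age (years BP) from the measured 14C/12C
    ratio of a sample relative to the modern standard."""
    return -LIBBY_MEAN_LIFE * log(fraction_modern)

# A charcoal sample retaining roughly a third of its original 14C dates to
# about 8,900 radiocarbon years BP; calibration against tree-ring records
# then converts such ages into calendar years.
age = radiocarbon_age(0.33)
```

Calibration is the harder part in practice, since atmospheric carbon-14 levels have varied over time, which is why securely dated samples from several contexts matter so much.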

"The dates came out to be almost 11,000 years old from today, so we're talking the earlier ninth millennium B.C. … which puts them around half a millennium earlier than any other Neolithic that's ever been recognized or claimed and dated on the island of Cyprus," he said. "More dramatically, these dates mean that Cyprus, an island tens of miles off the Levantine coast, was involved in the very early Neolithic world, and thus long-distance sea travel and maritime communication must now be actively factored into discussions of how the Neolithic developed and spread."

Manning terms the results "part of a field reassessment -- these findings, Cyprus and the maritime component to the development of the Neolithic will now all have to be part of the conversation. These and other findings may change how prehistory is taught at universities and colleges."

(Photo: Cornell U.)

Cornell University



Move over, touch screens: New research funded in part by the National Institutes of Health shows that it is possible to manipulate complex visual images on a computer screen using only the mind.

The study, published in Nature, found that when research subjects had their brains connected to a computer displaying two merged images, they could force the computer to display one of the images and discard the other. The signals transmitted from each subject's brain to the computer were derived from just a handful of brain cells.

"The subjects were able to use their thoughts to override the images they saw on the computer screen," said the study's lead author, Itzhak Fried, M.D., Ph.D., a professor of neurosurgery at the University of California, Los Angeles. The study was funded in part by the National Institute of Neurological Disorders and Stroke (NINDS), and the National Institute of Mental Health (NIMH), both part of NIH.

The study reflects progress in the development of brain-computer interfaces (BCIs), devices that allow people to control computers or other devices with their thoughts. BCIs hold promise for helping paralyzed individuals to communicate or control prosthetic limbs. But in this study, BCI technology was used mostly as a tool to understand how the brain processes information, and especially to understand how thoughts and decisions are shaped by the collective activity of single brain cells.

"This is a novel and elegant use of a brain-computer interface to explore how the brain directs attention and makes choices," said Debra Babcock, M.D., Ph.D., a program director at NINDS.

The study involved 12 people with epilepsy who had fine wires implanted in their brains to record seizure activity. Recordings like these are routinely used to locate areas of the brain that are responsible for seizures. In this study, the wires were inserted in the medial temporal lobe, a brain region important for memory and the ability to recognize complex images, including faces.

While the recordings from their brains were transmitted to a computer, the research subjects viewed two pictures superimposed on a computer screen, each picture showing a familiar object, place, animal or person. They were told to select one image as a target and to focus their thoughts on it until that image was fully visible and the other image faded away. The monitor was updated every tenth of a second based on the input from the brain recordings.

As a group, the subjects attempted this game nearly 900 times in total, and were able to force the monitor to display the target image in 70 percent of these attempts. Subjects tended to learn the task very quickly, and often were successful on the first try.

The brain recordings and the input to the computer were based on the activity of just four cells in the temporal lobe. Prior research has shown that individual cells in this part of the brain respond preferentially — firing impulses at a higher rate — to specific images. For instance, one cell in the temporal lobe might respond to seeing a picture of Marilyn Monroe, while another might respond to Michael Jackson. Both were among the celebrity faces used in the study.

Dr. Fried's team first identified four brain cells with preferences for celebrities or familiar objects, animals or landmarks, and then targeted the recording electrodes to those cells. The team found that when subjects played the image-switching game, their success appeared to depend on their ability to power up cells that preferred the target image and suppress cells that preferred the non-target image.
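
The paper's actual decoding algorithm is not described here, but the feedback loop can be caricatured in a few lines of Python: at every update, compare the firing rates of the cells preferring each image and nudge the display toward the image whose cells are more active. All names, rates and step sizes below are hypothetical.

```python
def update_visibility(visibility, target_rates, distractor_rates, step=0.05):
    """Nudge the target image's visibility (0..1) toward whichever image's
    preferred cells are firing faster right now. Rates are spikes/s from the
    cells preferring the target and the non-target image, respectively."""
    drive = sum(target_rates) - sum(distractor_rates)
    if drive > 0:
        visibility = min(1.0, visibility + step)
    elif drive < 0:
        visibility = max(0.0, visibility - step)
    return visibility

# Simulated trial: the subject keeps target-preferring cells firing faster,
# so the target image gradually wins the display.
v = 0.5
for _ in range(20):
    v = update_visibility(v, target_rates=[12.0, 9.0], distractor_rates=[4.0, 3.0])
```

The real closed-loop system would smooth noisy spike counts over short windows, but the core idea, a handful of cells steering a continuous display variable, is captured by this sketch.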

"The remarkable aspects of this study are that we can concentrate our attention to make a choice by modulating so few brain cells and that we can learn to control those cells very quickly," said Dr. Babcock.

Prior studies on BCIs have shown that it is possible to perform other tasks, such as controlling a computer cursor, with just a few brain cells. However, the task here was more complex and might have been expected to involve legions of cells in diverse brain areas needed for vision, attention, memory and decision-making.

(Photo: NIH)

National Institutes of Health



Astronomers using the National Science Foundation's Green Bank Telescope (GBT) have discovered the most massive neutron star yet found, a discovery with strong and wide-ranging impacts across several fields of physics and astrophysics.

"This neutron star is twice as massive as our Sun. This is surprising, and that much mass means that several theoretical models for the internal composition of neutron stars now are ruled out," said Paul Demorest, of the National Radio Astronomy Observatory (NRAO). "This mass measurement also has implications for our understanding of all matter at extremely high densities and many details of nuclear physics," he added.

Neutron stars are the superdense "corpses" of massive stars that have exploded as supernovae. With all their mass packed into a sphere the size of a small city, their protons and electrons are crushed together into neutrons. A neutron star can be several times more dense than an atomic nucleus, and a thimbleful of neutron-star material would weigh more than 500 million tons. This tremendous density makes neutron stars an ideal natural "laboratory" for studying the most dense and exotic states of matter known to physics.

The scientists used an effect of Albert Einstein's theory of General Relativity to measure the mass of the neutron star and its orbiting companion, a white dwarf star. The neutron star is a pulsar, emitting lighthouse-like beams of radio waves that sweep through space as it rotates. This pulsar, called PSR J1614-2230, spins 317 times per second, and the companion completes an orbit in just under nine days. The pair, some 3,000 light-years distant, are in an orbit seen almost exactly edge-on from Earth. That orientation was the key to making the mass measurement.

As the orbit carries the white dwarf directly in front of the pulsar, the radio waves from the pulsar that reach Earth must travel very close to the white dwarf. This close passage causes them to be delayed in their arrival by the distortion of spacetime produced by the white dwarf's gravitation. This effect, called the Shapiro Delay, allowed the scientists to precisely measure the masses of both stars.
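
The size of the effect can be estimated from the standard Shapiro delay formula. The sketch below plugs in round-number values close to the system's reported companion mass and inclination; treat the result as order-of-magnitude only.

```python
from math import cos, log, radians, sin

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def shapiro_delay(companion_mass_kg, inclination_deg, orbital_phase):
    """Extra light-travel time (s) for pulses skimming past the companion.
    orbital_phase is the angle from superior conjunction, where the pulsar
    sits directly behind the white dwarf as seen from Earth."""
    r = 2 * G * companion_mass_kg / C**3   # the 'range' of the delay
    s = sin(radians(inclination_deg))      # the 'shape' parameter
    return -r * log(1 - s * cos(orbital_phase))

# With a ~0.5-solar-mass white dwarf and a nearly edge-on orbit (i near
# 89 degrees), the delay peaks at a few tens of microseconds at conjunction:
# tiny, but well within reach of precision pulsar timing.
peak = shapiro_delay(0.5 * M_SUN, 89.0, orbital_phase=0.0)
```

The sharp dependence on the shape parameter near 90 degrees is why the edge-on orientation was so valuable: a slightly less inclined orbit would have produced a far shallower, harder-to-measure delay curve.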

"We got very lucky with this system. The rapidly-rotating pulsar gives us a signal to follow throughout the orbit, and the orbit is almost perfectly edge-on. In addition, the white dwarf is particularly massive for a star of that type. This unique combination made the Shapiro Delay much stronger and thus easier to measure," said Scott Ransom, also of NRAO.

The astronomers used a newly-built digital instrument called the Green Bank Ultimate Pulsar Processing Instrument (GUPPI), attached to the GBT, to follow the binary stars through one complete orbit earlier this year. Using GUPPI improved the astronomers' ability to time signals from the pulsar severalfold.

The researchers expected the neutron star to have roughly one and a half times the mass of the Sun. Instead, their observations revealed it to be twice as massive as the Sun. That much mass, they say, changes their understanding of a neutron star's composition. Some theoretical models postulated that, in addition to neutrons, such stars also would contain certain other exotic subatomic particles called hyperons or condensates of kaons.

"Our results rule out those ideas," Ransom said.

Demorest and Ransom, along with Tim Pennucci of the University of Virginia, Mallory Roberts of Eureka Scientific, and Jason Hessels of the Netherlands Institute for Radio Astronomy and the University of Amsterdam, reported their results in the October 28 issue of the scientific journal Nature.

Their result has further implications, outlined in a companion paper, scheduled for publication in the Astrophysical Journal Letters. "This measurement tells us that if any quarks are present in a neutron star core, they cannot be 'free,' but rather must be strongly interacting with each other as they do in normal atomic nuclei," said Feryal Ozel of the University of Arizona, lead author of the second paper.

There remain several viable hypotheses for the internal composition of neutron stars, but the new results put limits on those, as well as on the maximum possible density of cold matter.

The scientific impact of the new GBT observations also extends to other fields beyond characterizing matter at extreme densities. A leading explanation for the cause of one type of gamma-ray burst -- the "short-duration" bursts -- is that they are caused by colliding neutron stars. The fact that neutron stars can be as massive as PSR J1614-2230 makes this a viable mechanism for these gamma-ray bursts.

Such neutron-star collisions also are expected to produce gravitational waves that are the targets of a number of observatories operating in the United States and Europe. These waves, the scientists say, will carry additional valuable information about the composition of neutron stars.

"Pulsars in general give us a great opportunity to study exotic physics, and this system is a fantastic laboratory sitting out there, giving us valuable information with wide-ranging implications," Ransom explained. "It is amazing to me that one simple number -- the mass of this neutron star -- can tell us so much about so many different aspects of physics and astronomy," he added.

(Photo: Bill Saxton, NRAO/AUI/NSF)

National Radio Astronomy Observatory



An international team of researchers, including a physical anthropology professor at Washington University in St. Louis, has discovered well-dated human fossils in southern China that markedly change anthropologists' perceptions of the emergence of modern humans in the eastern Old World.

The research, based at the Institute of Vertebrate Paleontology and Paleoanthropology in Beijing, was published Oct. 25 in the online early edition of the Proceedings of the National Academy of Sciences.

The discovery of early modern human fossil remains in the Zhirendong (Zhiren Cave) in south China that are at least 100,000 years old provides the earliest evidence for the emergence of modern humans in eastern Asia, at least 60,000 years older than the previously known modern humans in the region.

“These fossils are helping to redefine our perceptions of modern human emergence in eastern Eurasia, and across the Old World more generally,” says Eric Trinkaus, PhD, the Mary Tileston Hemenway Professor in Arts & Sciences and professor of physical anthropology.

The Zhirendong fossils have a mixture of modern and archaic features that contrasts with earlier modern humans in east Africa and southwest Asia, indicating some degree of human population continuity in Asia with the emergence of modern humans.

The Zhirendong humans indicate that the spread of modern human biology long preceded the cultural and technological innovations of the Upper Paleolithic and that early modern humans co-existed for many tens of millennia with late archaic humans further north and west across Eurasia.

(Photo: WUSTL)

Washington University in St. Louis



Research underway at the Georgia Tech Research Institute (GTRI) could enable fixed-wing jet aircraft to take off and land at steep angles on short runways, while also reducing engine noise heard on the ground.

Airplanes of this type -- called cruise-efficient, short take-off and landing (CESTOL) aircraft -- could use runways at much smaller airports, allowing expansion of commercial jet service to many more locations.

Enabling commercial jets to take off and land in ever-shorter distances is an ongoing goal for aircraft designers, and several approaches are under development. GTRI's research could result in a CESTOL aircraft comparable to a Boeing 737 in size, with a similar ability to carry 100 passengers at up to 600 miles per hour.

"To take off or land on a short runway, an aircraft needs to be able to fly very slowly near the runway," said Robert J. Englar, a principal research engineer who is leading the GTRI effort. "The problem is that flying slowly decreases the lift available for taking off and landing. What's needed is a powered-lift approach that combines low air speed with the increased lift capability required for successful CESTOL operation."

The work is part of the NASA Hybrid Wing-Body Low-Noise ESTOL Program. This four-year program, funded by NASA and led by California Polytechnic State University, includes GTRI and several other team members. GTRI's current work involves leadership of the aerodynamic and acoustic design for the program, along with development of large-scale models that will be used for wind-tunnel testing at government facilities.

At the heart of GTRI's powered-lift design is circulation control wing -- also known as blown-wing -- technology. In this type of system, high-speed jets of air are directed over the upper surface of the wings during take-off and landing, creating an unprecedented lift capability.

"Our design has to incorporate several trade-offs, yet the entire wing-engine powered-lift system has to perform all of its functions well," said Englar, who leads the aerodynamics portion of GTRI's work.

Specifically, he said, the new design must:

• Generate a high degree of lift on take-off and landing to allow short ground rolls and steep climb-out or approach flight angles;

• Yield lower drag at cruising speeds to achieve good fuel efficiency;

• Simplify the wing and downsize it for more-efficient cruise performance;

• Produce noise levels that are lower than a conventional passenger jet;

• Be less complex overall than conventional designs.

To satisfy those requirements, the GTRI team placed turbo-fan engines above the wing of the conceptual CESTOL aircraft, rather than below the wing as on most commercial aircraft, explained Rick Gaeta, a former GTRI senior research engineer who had led the acoustic portion of the research.

Over-the-wing placement is a key design element because it enables very high lift while still providing the engine thrust necessary for take-off and high-speed level flight. It also offers important reduced-noise benefits.

Based on this engine placement, the team's powered-lift design maximizes performance using several interrelated elements:

In most fixed-wing aircraft, Englar explains, the upper surface of the wing is curved. That curvature forces air to flow faster over the top of the wing, which reduces pressure on the upper surface of the wing, increasing wing lift. Mechanical flaps increase aft curvature, enlarging the wing during take-off and landing, and augmenting lift by deflecting the ambient wind stream flowing over the wing.

But the lift generated by conventional wings isn't sufficient for the low flight speeds and steep ascents and descents required by CESTOL aircraft. The essential element in such extreme lift is circulation control / blown-wing technology. This approach can far exceed mechanical flaps in achieving high lift coefficient (a lift coefficient is a number that relates an aircraft's total lift to its wing area and flight speed).

The GTRI team has designed a blown wing that is relatively simple mechanically. Unlike a conventional wing, which uses multiple flap elements, GTRI's design uses only one small, relatively simple flap.

However, that single wing flap is used in tandem with a novel element based on circulation-control technology. A narrow slot, capable of pneumatically blowing out air, runs along the entire trailing edge of each wing, just above the flap. This system is powered by its own compressed air source located inside the wing.

The wing flap, which forms a sharp trailing edge during level flight to reduce drag, rotates downward on take-off and landing. When thus rotated, it forms a highly curved aft surface; then air from the slot can be blown over that curved surface to generate high lift.

This procedure, called flap-blowing, performs two functions: it increases air velocity over the top of the wing, and it deflects the ambient wind stream downward so that it curls under the wing. The combined forces generate a lift coefficient that can be two to four times higher than a conventional mechanical flap.

To achieve even higher lift than flap-blowing alone, the GTRI design takes advantage of an additional phenomenon -- the interaction between the air coming from the wing slot and the exhaust of the plane's over-the-wing jet engines.

During take-off and landing, air flow from the slot interacts with the engine exhaust and pulls this powerful exhaust blast down onto the wing. This process, called entraining the exhaust, greatly increases the velocity of the air passing over the wing and results in highly augmented upward suction and lift.

"This strategy allows an aircraft to be flying at a very low speed, while the wing is seeing much higher relative wind speeds on its curved upper surface due to this blowing and thrust-entraining combination," Englar said. "We have measured lift coefficients between 8.0 and 10.0 on these pneumatic powered-lift wings at a level flight condition during testing. The normal lift coefficient on a conventional wing at a similar flight condition is less than 1.0."
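
The trade-off behind these figures follows from the lift equation L = ½ρV²SC_L: for a fixed weight and wing area, a severalfold increase in lift coefficient cuts the minimum flying speed dramatically. The aircraft numbers below are rough, illustrative assumptions, not GTRI figures.

```python
from math import sqrt

RHO = 1.225  # sea-level air density, kg/m^3

def speed_for_lift(lift_n, wing_area_m2, cl):
    """Flight speed (m/s) at which lift L = 0.5 * rho * V^2 * S * C_L
    balances the given force (typically the aircraft's weight)."""
    return sqrt(2 * lift_n / (RHO * wing_area_m2 * cl))

# Illustrative 737-class assumptions: a ~60-tonne aircraft with a ~125 m^2
# wing. The same weight can be held aloft at a far lower speed with a
# pneumatic lift coefficient of 8 than with a conventional flapped wing
# at roughly 1.5.
weight = 60_000 * 9.81                                # N
v_conventional = speed_for_lift(weight, 125.0, 1.5)   # roughly 72 m/s
v_blown = speed_for_lift(weight, 125.0, 8.0)          # roughly 31 m/s
```

Because lift scales with the square of speed, more than halving the approach speed is what turns a long ground roll into a short one, and it is why the blown wing's high lift coefficients matter so much for CESTOL operation.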

The benefit of an above-the-wing engine configuration is not limited to providing good short takeoff and landing (STOL) performance. It also provides two potential sources of noise reduction: engine-noise shielding and reduced noise footprint in the community.

Gaeta explains the noise-shielding issue by noting that today's commercial jets have their engines under the wings. During take-off and approach, a great deal of noise from these jets propagates downward unimpeded, while engine sound that does travel upward bounces off the wing and then reflects downward.

"By putting the noise source above the wing, there is the potential to shield the ground from engine noise, at least partially," Gaeta said.

The critical design choice in noise shielding involves where to place the engine relative to the wing, he explained. Closer to the wing helps take-off and landing performance, but it increases noise due to viscous rubbing of the jet exhaust stream acting along the wing upper surface. Further away from the wing is better from a noise perspective, but not as effective for take-off and landing performance.

Finally, to the extent that placing the engine above the wing can shield exhaust noise, the engine needs to be placed as far forward as possible because maximum jet noise occurs at the exhaust exit, Gaeta said. Moreover, all of these design choices must not detract from the crucial issue of cruise performance.

The very nature of a STOL flight trajectory -- steep takeoff and approach angles -- offers another potential noise benefit. This trajectory keeps much of the offending noise closer to the airport environs.

Explained Gaeta: "By virtue of steeper takeoff and approach angles, the STOL aircraft can potentially keep its most offending noise within the airport boundary because it is farther from the ground when it passes over communities."

(Photo: GIT)

Georgia Tech Research Institute



Researchers from Imperial College London, the University of Michigan and Instituto Superior Técnico in Lisbon describe a tabletop instrument that produces synchrotron X-rays whose energy and quality rival those produced by some of the largest X-ray facilities in the world.

Scientific and medical advances often depend on the development of better diagnostic and analytical tools, to enable more and more precise investigations at higher and higher resolutions. The development and use of high energy light sources to probe the details of a wide range of materials for research and commercial purposes is a rapidly growing area of science and engineering. However, high power, high quality X-ray sources are typically very large and very expensive. For example, the Diamond Light Source synchrotron facility in Didcot, UK, is 0.5 km in circumference and cost £263M to build.

The researchers behind today's study have demonstrated that they can replicate much of what these huge machines do, but on a tabletop. Their micro-scale system uses a tiny jet of helium gas and a high power laser to produce an ultrashort pencil-thin beam of high energy and spatially coherent X-rays.

"This is a very exciting development," said Dr Stefan Kneip, lead author on the study from the Department of Physics at Imperial College London. "We have taken the first steps to making it much easier and cheaper to produce very high energy, high quality X-rays. Extraordinarily, the inherent properties of our relatively simple system generates, in a few millimetres, a high quality X-ray beam that rivals beams produced from synchrotron sources that are hundreds of metres long. Although our technique will not now directly compete with the few large X-ray sources around the world, for some applications it will enable important measurements which have not been possible until now."

The X-rays produced from the new system have an extremely short pulse length. They also originate from a small point in space, about 1 micron across, which results in a narrow X-ray beam that allows researchers to see fine details in their samples. These qualities are not readily available from other X-ray sources and so the researchers’ system could increase access to, or create new opportunities in, advanced X-ray imaging. For example, ultra short pulses allow researchers to measure atomic and molecular interactions that occur on the femtosecond timescale. A femtosecond is one quadrillionth of a second.

Dr Zulfikar Najmudin, the leader of the experimental team from the Department of Physics at Imperial College, added: "We think a system like ours could have many uses. For example, it could eventually increase dramatically the resolution of medical imaging systems using high energy X-rays, as well as enable microscopic cracks in aircraft engines to be observed more easily. It could also be developed for specific scientific applications where the ultrashort pulse of these X-rays could be used by researchers to 'freeze' motion on unprecedentedly short timescales."

To create their new X-ray system, the research team carried out an experiment at the Center for Ultrafast Optical Science at the University of Michigan that is conceptually simple, but required state-of-the-art laser facilities. They shone the very high power laser beam, named HERCULES, into a jet of helium gas to create a tiny column of ionised helium plasma. In this plasma, the laser pulse creates an inner bubble of positively charged helium ions surrounded by a sheath of negatively charged electrons.

Due to this charge separation, the plasma bubble has powerful electric fields that both accelerate some of the electrons in the plasma to form an energetic beam and also make the beam 'wiggle'. As the electron beam wiggles it produces a highly collimated, co-propagating X-ray beam, which was measured in these experiments.

This process is similar to what happens in other synchrotron sources, but on a microscopic scale. The acceleration and X-ray production happens over less than a centimetre with the whole tabletop X-ray source housed in a vacuum chamber that is approximately 1 metre on each side. This miniaturisation leads to a potentially much cheaper source of high quality X-rays. It also results in the unique properties of these short bright flashes of X-rays.

In the new study, the researchers describe, for the first time, the technical characteristics of the beam and present test images that demonstrate its performance.

Dr Najmudin concluded: "Our technique can now be used to produce detailed X-ray images. We are currently developing our equipment and our understanding of the generation mechanisms so that we can increase the repetition rate of this X-ray source. High power lasers are currently quite difficult to use and expensive, which means we're not yet at a stage when we could make a cheap new X-ray system widely available. However, laser technology is advancing rapidly, so we are optimistic that in a few years there will be reliable and easy-to-use X-ray sources available that exploit our findings."

(Photo: ICL)

Imperial College London



Two strains of the type of mosquito responsible for the majority of malaria transmission in Africa have evolved such substantial genetic differences that they are becoming different species, according to researchers behind two new studies published in the journal Science.

Over 200 million people globally are infected with malaria, according to the World Health Organisation, and the majority of these people are in Africa. Malaria kills one child every 30 seconds.

Today's international research effort, co-led by scientists from Imperial College London, looks at two strains of the Anopheles gambiae mosquito, the type of mosquito primarily responsible for transmitting malaria in sub-Saharan Africa. These strains, known as M and S, are physically identical. However, the new research shows that their genetic differences are such that they appear to be becoming different species, so efforts to control mosquito populations may be effective against one strain of mosquito but not the other.

The scientists argue that when researchers are developing new ways of controlling malarial mosquitoes, for example by creating new insecticides or trying to interfere with their ability to reproduce, they need to make sure that they are effective in both strains.

The authors also suggest that mosquitoes are evolving more quickly than previously thought, meaning that researchers need to continue to monitor the genetic makeup of different strains of mosquitoes very closely, in order to watch for changes that might enable the mosquitoes to evade control measures in the future.

Dr George Christophides, one of the lead researchers behind today's work from the Division of Cell and Molecular Biology at Imperial College London, said: "Malaria is a deadly disease that affects millions of people across the world and amongst children in Africa, it causes one in every five deaths. We know that the best way to reduce the number of people who contract malaria is to control the mosquitoes that carry the disease. Our studies help us to understand the makeup of the mosquitoes that transmit malaria, so that we can find new ways of preventing them from infecting people."

Dr Mara Lawniczak, another lead researcher from the Division of Cell and Molecular Biology at Imperial College London, added: "From our new studies, we can see that mosquitoes are evolving more quickly than we thought and that unfortunately, strategies that might work against one strain of mosquito might not be effective against another. It's important to identify and monitor these hidden genetic changes in mosquitoes if we are to succeed in bringing malaria under control by targeting mosquitoes."

The researchers reached their conclusions after carrying out the most detailed analysis so far of the genomes of the M and S strains of Anopheles gambiae mosquito, over two studies. The first study, which sequenced the genomes of both strains, revealed that M and S are genetically very different and that these genetic differences are scattered around the entire genome. Previous studies had only detected a few 'hot spots' of divergence between the genomes of the two strains. The work suggested that many of the genetic regions that differ between the M and S genomes are likely to affect mosquito development, feeding behaviour, and reproduction.

In the second study, the researchers looked at many individual mosquitoes from the M and S strains, as well as a strain called Bamako, and compared 400,000 different points in their genomes where genetic variations had been identified, to analyse how these mosquitoes are evolving. This showed that the strains appear to be evolving differently, probably in response to factors in their specific environments - for example, different larval habitats or different pathogens and predators. This study was the first to carry out such detailed genetic analysis of an invertebrate, using a high density genotyping array.
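The core of such a comparison can be sketched in a few lines (hypothetical toy data, not from the study): for each strain, record the allele observed at every assayed variant position, then measure the fraction of positions at which the strains differ.

```python
# Minimal illustrative sketch of a genotype comparison, in the spirit of
# the 400,000-point analysis described above. Data below are invented
# toy allele calls, not real Anopheles gambiae genotypes.

def divergence(genotypes_a: list[str], genotypes_b: list[str]) -> float:
    """Fraction of assayed variant positions at which two strains differ."""
    if len(genotypes_a) != len(genotypes_b):
        raise ValueError("strains must be genotyped at the same positions")
    diffs = sum(a != b for a, b in zip(genotypes_a, genotypes_b))
    return diffs / len(genotypes_a)

# Toy allele calls at six hypothetical variant positions.
m_strain = ["A", "T", "G", "G", "C", "A"]
s_strain = ["A", "C", "G", "A", "C", "A"]

print(divergence(m_strain, s_strain))  # 2 of 6 positions differ
```

A real analysis scales this idea to hundreds of thousands of positions across many individuals per strain, and then asks whether the differences cluster in particular genomic regions or, as the studies found, are scattered across the whole genome.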

As a next step in their research, the Imperial researchers are now carrying out genome-wide association studies of mosquitoes, using the genotyping chip that they designed for their second study, to explore which variations in mosquito genes affect their propensity to become infected with malaria and other pathogens.

(Photo: ICL)

Imperial College London




Selected Science News. Copyright 2008 All Rights Reserved