Thursday, November 18, 2010

DO HOLES MAKE MOLES?

The mysterious origins of Australia's bizarre and secretive marsupial moles have been cast in a whole new and unexpected light with the first discovery in the fossil record of one of their ancestors.

The find reveals a remarkable journey through time, place and lifestyle: living marsupial moles are blind, earless and live underground in the deserts of the Northern Territory, Western Australia and South Australia, yet their ancestors lived in lush rainforest far away in north Queensland.

In the journal Proceedings of the Royal Society B, a team led by Professor Mike Archer, of the University of New South Wales (UNSW), reports the discovery of the remarkable 20 million-year-old fossil at the Riversleigh World Heritage fossil site.

Although related to kangaroos, koalas and other marsupials, living marsupial moles far more closely resemble Cape golden moles, which burrow through the desert sands of Africa. The two golden-furred animals not only look indistinguishable when seen side by side but share many other similarities in their teeth and skeletons that reflect their subterranean lifestyles.

Yet the Cape golden mole is a placental mammal, the group that includes rats, bats, elephants and humans, and these two very different branches of the mammal family evolved from a common ancestor at least 125 million years ago, says Professor Archer. Having diverged in ancestry, however, their similar lifestyles have meant that they have converged in anatomy.

"This fossil discovery came as a real shock," he says. "Until now, we had always assumed that marsupial moles must have evolved in an unknown ancient Australian desert because, like Cape golden moles, the living marsupial moles survive only in deserts.

"Yet this ancestral Australian mole, which is not as specialised as the living form, has been discovered in ancient rainforest deposits—not deserts. The fossils suggest that they became mole-like while burrowing through the mossy floors of those ancient forests."

This missing link has solved a second mystery about how the highly specialised V-shaped teeth of the living marsupial mole evolved. Although they are almost identical to the teeth of their African counterparts, it is now clear that they went down a completely different evolutionary pathway to get there, says co-author Dr Robin Beck of the American Museum of Natural History.

"This ancient link makes it clear that marsupials followed a completely different path from placentals but ended up with almost identical-looking teeth."

Co-author UNSW Associate Professor Suzanne Hand said: "It goes to the heart of global debates about relationship versus convergence - whether animals are similar because they are closely related or similar because they have had to adapt to related challenges. It's also exciting because it so beautifully demonstrates just how adaptive Australian marsupials can be when given the right evolutionary challenges and enough time to meet them."

University of New South Wales

THE ZEBRAFISH'S NEURAL CIRCUIT PREVENTS IT FROM BITING OFF MORE THAN IT CAN CHEW

Between alerting us to danger and allowing us to spot prey, vision keeps many animals, including humans, alive. But exactly how does this important sense work, and why is it easier for us to notice the movement of small objects in our field of vision than to notice other things? The complexity of the neural network that supports vision has long baffled scientists.

Now, with a new technology and support from the National Science Foundation, Claire Wyart in Ehud Isacoff's lab at the University of California at Berkeley and Filo Del Bene in Herwig Baier's lab at the University of California at San Francisco have been able to follow entire populations of retinal and brain cells in their test animal, the zebrafish larva, and to solve some of the mysteries of the neural circuit that underlies its vision.

The research team's findings were published in the October 29 issue of Science.

Using a genetically encoded fluorescent reporter of neural activity newly developed by Loren Looger at the Howard Hughes Medical Institute's Janelia Farm Research Campus, Wyart and Del Bene were able to follow how large and small visual cues translate into electrical activity in a region of the zebrafish's brain.

The brain region of the zebrafish that receives input from the retina, called the optic tectum, is separated into layers. The top layer receives direct connections from retinal cells, and has a population of both excitatory and inhibitory neurons. These neurons connect to output neurons that project to other brain regions that control how the zebrafish chases prey.

Isacoff, Baier, Wyart and Del Bene have revealed that a large visual stimulus covering the entire field of vision (such as large floating debris, or another zebrafish) results in low output neuron activity. However, small (prey-sized) items moving across the zebrafish's field of vision at a prey-like speed activate the output neurons very effectively. The basis of this "filtering" of information is that large visual stimuli massively activate the inhibitory cell population, which suppresses the output cells, while small moving objects activate only a small number of the inhibitory tectal cells, allowing the excitation to drive the output cells efficiently.
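The size-filtering mechanism described above can be sketched as a toy rate model (the parameters are illustrative and not taken from the paper): inhibition recruits faster than excitation as the stimulus grows, so only small stimuli drive the output neurons.

```python
# Toy rate model of size filtering in the optic tectum.
# All parameters are illustrative, not fitted to the Science paper.
# A stimulus excites both excitatory and inhibitory tectal neurons;
# inhibition grows superlinearly with stimulus size, so only small,
# prey-sized stimuli drive the output neurons strongly.

def output_activity(stimulus_size, w_exc=1.0, w_inh=2.5):
    """Output-neuron rate for a stimulus covering `stimulus_size`
    (0..1 as a fraction of the visual field)."""
    excitation = w_exc * stimulus_size
    # Inhibitory population recruits faster than the excitatory drive.
    inhibition = w_inh * stimulus_size ** 2
    return max(0.0, excitation - inhibition)

for size in (0.05, 0.1, 0.5, 1.0):
    print(f"stimulus size {size:4.2f} -> output {output_activity(size):.3f}")
```

A prey-sized stimulus (size 0.1) yields a positive output, while a full-field stimulus (size 1.0) is silenced entirely, reproducing the filtering behaviour in caricature.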

This mechanism gives the zebrafish good hunting responses to appropriate visual cues, and thereby helps keep it from biting off more than it can chew.

Isacoff and Baier demonstrated that this inhibitory filtering is essential for hunting, as evidenced by the fact that prey capture was disrupted when the inhibitory cells were removed or prevented from releasing neurotransmitters.

(Photo: Zina Deretsky, National Science Foundation)

The National Science Foundation

TYPISTS' ERRORS AND INTENTION THEORIES

New research published in the journal Science says people think about things they think they don't think about. Vanderbilt University psychologists Gordon Logan and Matthew Crump say when highly skilled people such as surgeons, carpenters, or pilots perform actions without thinking, those actions are highly controlled. The finding adds key information to a debate on whether people consciously perform actions in which they are highly skilled.

Previous research suggests conscious control of driving a car for an experienced driver, for example, is an illusion. These studies point to cases in which people intend to do one thing but do another. But the Vanderbilt research, involving a simple typing test, shows otherwise.

"Our research shows a very tight coupling between intention and action that suggests conscious control is not an illusion," said Logan, Vanderbilt's Centennial Professor of Psychology. "In highly skilled activities like typing, intention and action fit together very tightly."

The National Science Foundation funded the research, "Cognitive Illusions of Authorship Reveal Hierarchical Error Detection in Skilled Typists," through its Division of Behavioral and Cognitive Sciences in its Directorate for Social, Behavioral and Economic Sciences.

Logan and Crump tested 72 college-age typists, each of whom had about 12 years of typing experience and typed at speeds comparable to those of professional typists. In three experiments, these skilled typists typed single words shown to them one at a time on a computer screen, and their responses appeared on the screen below the word to be typed.

The researchers then secretly introduced errors to see if the typists would detect them. In some instances, they secretly corrected errors made by the typists. In both cases, responses were gauged by measuring the speed at which words were typed.

Logan and Crump found the typists' fingers did not slow down after an error was secretly inserted, even though the typists thought they had made the error. But when typists made real errors, their fingers slowed down whether researchers corrected those errors or not.

The researchers concluded this happens because there are two different processes that create and detect errors. "The 'outer loop' or thinking part of the process tries to decide whether the 'inner loop' or doing part of the process is right or wrong," said Logan.

He contends the 'inner loop' or doing part of the process looks at the hands, fingers and the feel of the keyboard to decide whether the action is correct.

"The illusion of authorship was the most surprising thing," said Logan. "People thought they typed correctly if the screen looked right and they thought they typed incorrectly if the screen looked wrong even though their fingers 'knew' the truth."

This "knowing of the truth" proves that skills people perform without thinking are highly controlled. Logan and Crump argue that control is hierarchical. That part of a person that does the thinking relies on different feedback than the part that does the doing. But the two kinds of feedback together allow people to consciously achieve tremendous degrees of precision and speed.

Logan and Crump dismiss the alternative possibility that a single process detects errors. They contend that if there were only one process, conscious reports should match what the fingers do.

Typists should have slowed down whenever they reported an error and typed at full speed whenever they reported a correct response. Instead, the researchers found a mismatch between conscious reports and behavior, suggesting two error-detection processes.

"What's cool about our research is that we show there are two error detection processes: an outer loop that supports conscious reports and an inner loop process that slows keystrokes after errors," said Logan. "Typing slows down after corrected errors just like it slows down after actual errors. It maintains the same speed after inserted errors as after correct responses, as if nothing was wrong."

These finger movements, according to project researchers, show people consciously control actions in which they are highly skilled even when they don't think about them.
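This two-loop account can be caricatured in a few lines of Python (a hypothetical sketch for illustration, not the authors' model): the outer loop judges correctness from the screen, the inner loop from the keystrokes themselves, and typing speed follows the inner loop while the conscious report follows the outer.

```python
# Hypothetical sketch of Logan & Crump's hierarchical error-detection
# account (illustrative only). The outer loop sees only the screen;
# the inner loop knows what the fingers actually typed.

def outer_loop_report(screen_correct):
    # Conscious report: "error" iff the screen looks wrong.
    return "error" if not screen_correct else "correct"

def inner_loop_speed(keystroke_correct, normal=100, slowed=60):
    # Typing rate (words per minute): fingers slow after a real slip,
    # regardless of what the screen shows.
    return normal if keystroke_correct else slowed

# Four experimental conditions: (actual keystroke ok?, screen ok?)
conditions = {
    "typed correctly":            (True,  True),
    "inserted error (fake)":      (True,  False),
    "real error, left on screen": (False, False),
    "real error, secretly fixed": (False, True),
}

for name, (typed_ok, screen_ok) in conditions.items():
    print(f"{name:28s} report={outer_loop_report(screen_ok):7s} "
          f"speed={inner_loop_speed(typed_ok)} wpm")
```

The "inserted error" row reproduces the illusion of authorship: the conscious report says "error" while the fingers keep typing at full speed, and the "secretly fixed" row shows the reverse mismatch.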

(Photo: © Jupiter Images 2010)

The National Science Foundation

NEW EVIDENCE SUPPORTS SNOWBALL EARTH AS TRIGGER FOR EARLY ANIMAL EVOLUTION

Biogeochemists have found new evidence linking "Snowball Earth" glacial events to the rise of early animals. The research was funded by the National Science Foundation (NSF).

Study results appear in the journal Nature.

The controversial Snowball Earth hypothesis posits that, on several occasions, the Earth was covered from pole to pole by a thick sheet of ice lasting for millions of years.

These glaciations, the most severe in Earth's history, occurred from 750 to 580 million years ago.

In the aftermath, the researchers discovered, the oceans were rich in phosphorus, a nutrient that controls the abundance of life in the oceans.

The team, led by scientists at the University of California Riverside, tracked phosphorus concentrations through Earth's history by analyzing the composition of iron-rich chemical precipitates. These precipitates accumulated on the seafloor and "scavenged" phosphorus from seawater.

The analyses revealed that there was a pronounced spike in marine phosphorus levels in the mid-Neoproterozoic (from ~750 to ~635 million years ago).

To explain these anomalously high concentrations, the researchers argue that the increase in erosion and chemical weathering on land that accompanied Snowball Earth glacial events led to the high amounts of phosphorus in the ocean.

The abundance of this nutrient, which is essential for life, in turn led to a spike in oxygen production via photosynthesis, according to Enriqueta Barrera, program director in the NSF's Division of Earth Sciences, which funded the research.

The subsequent accumulation of oxygen in the atmosphere led to the emergence of complex life on Earth.

"In the geologic record, we found a signature for high marine phosphorus concentrations appearing in the immediate aftermath of Snowball Earth glacial events," said Noah Planavsky, first author of the research paper and a graduate student at UC Riverside.

"Phosphorus ultimately limits net primary productivity on geological timescales," said Planavsky. "High marine phosphorus levels would have facilitated a shift to a more oxygen-rich ocean-atmosphere system. This shift could have paved the way for the rise of animals and their ecological diversification."

Planavsky explained the link between marine phosphorus concentrations and the levels of oxygen in the atmosphere.

"High phosphorus levels would have increased biological productivity in the ocean and the associated production of oxygen by photosynthesis," he said.

Much of this organic matter is consumed, in turn, as a result of respiration reactions that also consume oxygen. However, the burial of some proportion of the organic matter results in a net increase of oxygen levels in the atmosphere.
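The chain of reasoning in the preceding paragraphs can be put into a back-of-envelope calculation (all numbers are illustrative; the Redfield C:P ratio of roughly 106:1 is a standard oceanographic figure, not a value from this study):

```python
# Back-of-envelope sketch of the phosphorus -> oxygen link described
# above (illustrative numbers, not from the Nature paper).
# Photosynthesis: CO2 + H2O -> CH2O + O2, so each mole of organic
# carbon produced releases one mole of O2. Respiration reverses this,
# so only the organic carbon that is BURIED leaves a net O2 gain.

def net_o2_release(phosphorus_mol, c_to_p_ratio=106, burial_fraction=0.01):
    """Net moles of O2 added to the ocean-atmosphere system.

    phosphorus_mol : moles of P fueling primary productivity
    c_to_p_ratio   : Redfield C:P ratio of marine organic matter (~106:1)
    burial_fraction: fraction of organic carbon buried, escaping respiration
    """
    organic_carbon = phosphorus_mol * c_to_p_ratio   # mol C fixed
    return organic_carbon * burial_fraction          # mol O2 retained

# Doubling available phosphorus doubles the net oxygen gain.
print(net_o2_release(1.0))
print(net_o2_release(2.0))
```

However small the burial fraction, net oxygen release scales directly with the phosphorus supply, which is why a post-glacial phosphorus spike translates into an oxygenation event.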

Until now, scientists believed that geochemical conditions in the iron-rich ocean would have led to low phosphorus concentrations.

The researchers discovered no evidence of a phosphorus crisis after Snowball Earth glacial events, however, finding instead indications of an abundance of phosphorus.

"There are several known chemical fingerprints for increasing oxygen in the ocean and, by inference, in the atmosphere during the middle part of Neoproterozoic, and the rise of animals is an expected consequence," said Timothy Lyons, a biogeochemist at UC Riverside and the senior investigator in the study.

"These results may be the first to capture the nutrient driver that was behind this major step in the history of life. That driver was ultimately tied to the extreme climate of the period."

The scientists present data from some 700 individual samples of iron-oxide-rich rocks, which include new results as well as those obtained from a comprehensive survey of the literature.

(Photo: Lyons Lab, UC-Riverside)

National Science Foundation

EARS TUNED TO WATER

For bats, any smooth, horizontal surface is water, even if vision, olfaction or touch tells them that it is actually a metal, plastic or wooden plate. Bats therefore rely more on their ears than on any other sensory system. This is due to the way smooth surfaces reflect the echolocation calls of bats: they act just like mirrors. In nature there are no other extended, smooth surfaces, so these mirror properties prove to be a reliable cue for recognizing water surfaces. Scientists from the Max Planck Institute for Ornithology in Seewiesen investigated this phenomenon in 15 different species from three big bat families and found that all of them tried to drink from smooth plates. They also found that this acoustic recognition of water is innate.

Bats need water to drink, but many species also use rivers, lakes and ponds for foraging, as aquatic insects are soft and easily digestible. Such prey is also easy to detect with echolocation, because the water surface acts like a mirror, reflecting the calls away almost completely; only an insect sitting on the surface reflects back an echo.

In their study, Stefan Greif and Björn Siemers from the Max Planck Institute for Ornithology simulated water surfaces in a large flight room, offering the bats one smooth and one structured plate made of metal, wood or plastic. Under weak red illumination, the researchers observed whether the bats would fall for this trick and try to drink from the smooth plate. They could hardly believe what they saw: "The Schreiber’s bat tried to drink up to a hundred times in ten minutes from the smooth plate," says Stefan Greif. Three different species - the greater mouse-eared bat, Daubenton’s bat and the greater horseshoe bat - showed the same behaviour with all three materials; only from the wooden plates did some bats try to drink slightly less often. To test how widespread this behaviour is, the scientists tested 11 additional species, one individual each, on the metal plate - likewise with a positive result. At least among insect-eating bats, this behaviour thus seems to be widespread.

The researchers were astonished that the animals did not learn that these artificial acoustic mirrors are not water surfaces. They observed bats that accidentally landed on the smooth plate, took off again and, after a few rounds of flying, resumed their drinking attempts. Even when the scientists placed the plate on a garden table, some of the bats flew underneath the table and then tried to drink, although this is certainly not a natural situation for a pond.

The association of a smooth, horizontal surface with water seems to be hardwired in the bat’s brain. But how do bats process the contradictory information coming from other sensory systems? Only in the world of echolocation does the metal plate correspond to water; other senses such as vision, olfaction and touch surely tell the bat otherwise. The researchers repeated their experiment in darkness, thereby eliminating visual input. The result: the number of drinking attempts rose from about 100 to 160 in ten minutes. "So it seems that the bats integrate and weigh up their sensory information, but echolocation dominates all the others," explains Stefan Greif.
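The weighing-up that Greif describes can be sketched as a toy weighted-evidence model (the weights are hypothetical; the study reports behaviour, not numerical weights):

```python
# Toy weighted-evidence model of the bats' sensory integration
# (hypothetical weights for illustration only). Each sense votes on
# "is this water?"; echolocation carries most of the weight.

def is_water(echo_says_water, vision_says_water, weights=(0.8, 0.2)):
    w_echo, w_vision = weights
    score = w_echo * echo_says_water + w_vision * vision_says_water
    return score > 0.5

# Smooth metal plate: echolocation says water, vision says no.
print(is_water(True, False))
# In darkness vision drops out entirely, so echolocation decides alone
# and (in the real experiment) drinking attempts increased.
print(is_water(True, False, weights=(1.0, 0.0)))
# Vision alone, without the echo-mirror cue, is never enough.
print(is_water(False, True))
```

With echolocation dominating, the model "drinks" from the metal plate despite contradictory vision, matching the bats' persistence in the flight room.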

Finally, the scientists wanted to know whether the acoustic signature of water is already fixed in the animals’ genes. They repeated the experiment with juveniles that had never seen a lake or a river. Flightless juveniles were captured in a cave together with their mothers and raised until they were able to fly. These young bats likewise tried to drink at their very first contact with a smooth surface, so the behaviour seems to be innate rather than learned.

In nature, nearly all smooth horizontal surfaces are bodies of water, but what about man-made smooth surfaces such as skylights, car roofs or conservatories? If bats so persistently mistake horizontal mirrors for water, do they also try to drink from these artificial surfaces until exhausted? That question remains unanswered. "We think that bats in nature have other possibilities. They show high site fidelity and probably have their established water surfaces. Maybe they try new surfaces, but eventually they will move on," speculates Stefan Greif. Future studies are needed to evaluate the occurrence, extent and potential ecological consequences of such a scenario.

(Photo: Dietmar Nill, MPI f. Ornithology)

Max Planck Institute

LAWS OF ATTRACTION

Ocean micro-organisms are shown to behave like larger animals in the presence of sulfur. Might this offer clues about the roles they play in regulating Earth’s climate?

Scientists have sought to learn more about how the Earth’s oceans absorb carbon dioxide and generally exchange gases with the atmosphere so they can better understand the corresponding effects on climate. To that end, many researchers are turning their attention to the microscopic organisms that help recycle carbon, nitrogen, sulfur and other elements through the oceans. Finding out exactly how and to what degree they do that is an ongoing scientific challenge, and scientists may first have to learn more about how the microbes interact with their environment at the scale of the individual microbe.

In recent work, an international team of scientists led by Professor Roman Stocker of the MIT Department of Civil and Environmental Engineering opened a window into that microbial world. The team studied how certain strains of marine microbes find and use sulfur, an element vital to many of them. Some microbes ingest the sulfur, convert it and pass it back into the ocean in altered form, keeping the chemical moving through Earth’s sulfur cycle.

Using video microscopy, the scientists captured digital images of the single-celled microbes swimming toward two forms of sulfur: dimethylsulfide (DMS), the chemical responsible for the slightly sulfuric smell of the sea, and its precursor dimethylsulfoniopropionate (DMSP), which can be converted to DMS by the microbes. DMS is known to influence climate; when it moves from the ocean to the atmosphere as a gas, it oxidizes, forming cloud condensation nuclei which promote cloud formation over the ocean. These clouds reflect sunlight rather than allowing it to heat the Earth’s surface.

Stocker, Justin Seymour, a former postdoctoral fellow at MIT who is now a research fellow at the University of Technology Sydney, Professor Rafel Simó of the Institute for Ocean Sciences in Barcelona, and MIT graduate student Tanvir Ahmed reported this research — which was funded by the Australian Research Council, the Spanish Ministry of Science and Innovation, La Cambra de Barcelona, the Hayashi Fund at MIT, and the National Science Foundation — earlier this year in the journal Science.

“It had been previously demonstrated that DMSP and DMS draw coral reef fish, sea birds, sea urchins, penguins and seals, suggesting that these chemicals play a prominent ecological role in the ocean. Now we know that they also attract microbes,” said Stocker. “But this is not simply adding a few more organisms to that list. The billions of microbes in each liter of seawater play a more important role in the ocean’s chemical cycles than any of the larger organisms.”

Stocker has pioneered the use of microfluidic technology to study the behavior of marine microbes in the laboratory. He re-creates a microcosm of the ocean environment using a device about the size of a flash drive, made of clear rubbery material engraved with minuscule channels into which he injects ocean water, microbes and food in the form of dissolved organic matter. Then, using a camera attached to a microscope, he records the microbes’ response. In the past few years, he has recorded microbes as they use their whip-like flagella to swim toward food, a finding that contradicts the traditional view of marine microbes as passive feeders.

In the latest research, the scientists injected different chemicals into the channels of the device in a way that mimicked the bursting of a microbial cell after a viral infection — a common event in the ocean. Although they performed the tests using several substances, including DMS, the scientists focused primarily on DMSP, which is produced by some phytoplankton and released into the water when a cell explodes. That DMSP can dissolve in the water or be transformed by other microbes into DMS, which also dissolves in the water before being released as a gas into the atmosphere.

The research indicates that the chemical’s odor does draw microbial predators, much as its smelly cousin DMS does at larger scales. This is the first such study to make a visual record of microbial behavior in the presence of DMSP.

The team selected seven microbial species that are roughly analogous to plants, herbivores and predators in the animal kingdom: three photosynthetic microbes (phytoplankton), two heterotrophic bacteria that feed off the carbon produced by other microbes, and two microzooplankton that prey on other microbes.

Six of the seven microbial species tested were attracted to the DMSP in the microfluidic device; only one species — a phytoplankton — ignored it. Some of the species displayed the strongest swimming responses among any of the 100 or so cases yet tested by Stocker and Seymour in their research projects. This, Stocker said, is a clear indication that DMSP acts as a powerful chemical cue.

The researchers also found that some marine microbes, including bacteria, are attracted to DMSP because they feed on it, while others, the microzooplankton, are drawn to the chemical because it signals the presence of prey. This challenges previous theories that DMSP might deter predators. “Our observations clearly show that, for some plankton, DMSP acts as an attractant towards prey rather than a deterrent,” said Simó.

Farooq Azam of the Scripps Institution of Oceanography, one of the first scientists to recognize the importance of microbes in the ocean food chain, agrees. “The findings of this study are exciting and unexpected in showing how broadly distributed throughout the microbial food web is the ability to sense DMSP and to behaviorally respond to it. In view of the significance of DMSP and DMS in global climate, these results should stimulate future research to understand how the potentially complex microbial interactions are reflected in the regulation of the fluxes of DMS and DMSP.”

Azam also said that the study “adds substantial weight to the emergent view that understanding how microbes control the grand cycles of elements in the ocean and global climate” will require study at the scale of the individual microbe.

The researchers are now working on a system to replicate their experiments on oceanographic ships using bacteria collected directly from the ocean, rather than lab-cultured microbes. This will allow them to use microfluidics to create a virtual microbe aquarium at sea.

“We’re doing for microbes what ecologists have done with larger organisms for a long time,” said Stocker. “We’re observing them in order to better understand their behavior.”

(Photo: David Patterson/micro*scope)

MIT

STONE AGE HUMANS NEEDED MORE BRAIN POWER TO MAKE BIG LEAP IN TOOL DESIGN

Stone Age humans were only able to develop relatively advanced tools after their brains evolved a greater capacity for complex thought, according to a new study that investigates why it took early humans almost two million years to move from razor-sharp stones to a hand-held stone axe.

Researchers used computer modelling and tiny sensors embedded in gloves to assess the complex hand skills that early humans needed in order to make two types of tools during the Lower Palaeolithic period, which began around 2.5 million years ago. The cross-disciplinary team, involving researchers from Imperial College London, employed a craftsperson known as a flintknapper to faithfully replicate ancient tool-making techniques.

The team say that comparing the manufacturing techniques used for both Stone Age tools provides evidence of how the human brain and human behaviour evolved during the Lower Palaeolithic period.

Neuroscientist Dr Aldo Faisal, the lead author of the study from the Departments of Bioengineering and Computing at Imperial College London, says: “The advance from crude stone tools to elegant hand-held axes was a massive technological leap for our early human ancestors. Hand-held axes were a more useful tool for defence, hunting and routine work. Interestingly, our study reinforces the idea that tool making and language evolved together as both required more complex thought, making the end of the Lower Palaeolithic a pivotal time in our history. After this period, early humans left Africa and began to colonise other parts of the world.”

Prior to today’s study, researchers had different theories about why it took early humans more than two million years to develop stone axes. Some suggested that early humans had underdeveloped motor skills, while others proposed that human brains needed this time to develop the more complex thought required to dream up better tool designs or better manufacturing techniques.

The researchers behind today’s study say that their evidence, from studying both tool-making techniques, confirms that the evolution of the early human brain was behind the development of the hand-held axe. Furthermore, the team suggest that the advancement of hand-held axe production may have also coincided with the development of language, as these functions overlap in the same regions of the modern and early human brains.

The flintknapper who participated in today’s study created both types of tools: razor-sharp flakes and hand-held axes. He wore a data glove with sensors enmeshed in its fabric to record his hand and arm movements during the production of these tools.

After analysing this data, the researchers discovered that both flake and hand-held axe manufacturing techniques were equally complex, requiring the same kind of hand and arm dexterity. This enabled the scientists to rule out motor skills as the principal factor holding up stone tool development.

The team deduced from their results that making the hand-held axe required a high level of brain processing in overlapping areas of the brain that are responsible for a range of functions, including control of the vocal cords and complex hand gestures.

This is the first time that neuroscientists, archaeologists, anthropologists and flintknappers have teamed together, using cutting-edge technology including data glove sensors and advanced modelling, to develop a deeper understanding of early human evolution.

In the future, the team plan to use their technology to compare tools made by Neanderthals, an extinct relative of humans, to glean insights into their brain development.

(Photo: ICL)

Imperial College London

STUDY OF BABIES' BRAIN SCANS SHEDS NEW LIGHT ON THE BRAIN'S UNCONSCIOUS ACTIVITY AND HOW IT DEVELOPS

Full-term babies are born with a key collection of networks already formed in their brains, according to new research that challenges some previous theories about the brain’s activity and how the brain develops. The study is published in the journal Proceedings of the National Academy of Sciences (PNAS).

Researchers led by a team from the MRC Clinical Sciences Centre at Imperial College London used functional MRI scanning to look at ‘resting state’ networks in the brains of 70 babies, born at between 29 and 43 weeks of development, who were receiving treatment at Imperial College Healthcare NHS Trust.

Resting state networks are connected systems of neurons in the brain that are constantly active, even when a person is not focusing on a particular task, or during sleep. The researchers found that these networks were at an adult-equivalent level by the time the babies reached the normal time of birth.

One particular resting state network identified in the babies, called the default mode network, has been thought to be involved in introspection and daydreaming. MRI scans have shown that the default mode network is highly active if a person is not carrying out a defined task, but is much less active while consciously performing tasks.

Earlier research had suggested that the default mode network was not properly formed in babies and that it developed during early childhood. The fact that the default mode network has been found fully formed in newborns means it may provide the foundation for conscious introspection, but it cannot be the only thing involved, say the researchers behind today’s study.

Professor David Edwards, lead author of the study from the MRC Clinical Sciences Centre at Imperial College London, said: “Some researchers have said that the default mode network is responsible for introspection - retrieving autobiographical memories and envisioning the future, etc. The fact that we found it in newborn babies suggests that either being a fetus is a lot more fun than any of us can remember - lying there happily introspecting and thinking about the future - or that this theory is mistaken.

“Our study shows that babies’ brains are more fully formed than we thought. More generally, we sometimes expect to be able to explain the activity we can see on brain scans in terms of someone thinking or doing some task. However, most of the brain is probably engaged in activities of which we are completely unaware, and it is this complex background activity that we are detecting.”

The researchers found that the resting state networks mainly develop after 30 weeks – in the third trimester - and are largely complete by 40 weeks when most babies are born. They reached their conclusions after carrying out functional MRI scans on 70 babies, born at between 29 and 43 weeks of development, who were receiving treatment at Imperial College Healthcare NHS Trust and whose parents had given consent for them to be involved in the study. Some of the babies scanned were under sedation and others were not, but the researchers found no difference in results between sedated and non-sedated babies.

The researchers used a 4-dimensional brain atlas developed with scientists in the Department of Computing at Imperial College London to map the activity that they found in the babies’ brains against what is known about the location of different brain networks.

The next step for this research is to find out how these networks are affected by illnesses and to see if they can be used to diagnose problems.

Today’s research involved collaboration between researchers at Imperial College London and clinicians at Imperial College Healthcare NHS Trust, as part of the Academic Health Science Centre (AHSC), a unique kind of partnership between the College and the Trust, formed in October 2007. The AHSC's aim is to improve the quality of life of patients and populations by taking new discoveries and translating them into new therapies as quickly as possible.

(Photo: ICL)

Imperial College London

MEN AND WOMEN LOSE BONE STRENGTH AS THEY AGE, BUT FOR DIFFERENT REASONS

Everyone loses bone strength as they get older, but the structural changes at work appear to differ between men and women, according to studies published in the journals Bone and the Journal of Bone and Mineral Research.

“This could explain why more women experience fractures because of osteoporosis,” says Dr. Steven Boyd, biomedical engineer with the University of Calgary’s Schulich School of Engineering and researcher with the McCaig Institute for Bone and Joint Health at the Faculty of Medicine. He is a senior scholar supported by Alberta Innovates – Health Solutions.

Osteoporosis is a bone disease that involves the deterioration of bone tissue, leading to bone fragility and risk of fractures. According to Osteoporosis Canada, the condition affects two million Canadians and often causes disfigurement and reduction or loss of mobility.

Researchers at the University of Calgary used high-resolution, three-dimensional imaging equipment to measure bone at the wrist and lower leg in healthy volunteers aged 16 to 35 and the Calgary participants in the Canadian Multicentre Osteoporosis Study (CaMOS). They used these images to examine the changes in bone structure that occur in men and women as they age. By using a mechanical engineering computer modeling method called finite element analysis, researchers were able to predict the changes in bone strength that will occur over time.

Bone health and strength are typically determined by a measurement called bone mineral density. But studying the internal structure of bone is just as important.

“From an engineering perspective, the micro-architecture of bone – how it’s structured and formed – is a good indication of strength,” says Dr. Boyd. “It’s like having two houses that contain the same number of bricks. They can have different strengths depending on how those bricks are arranged.”

Researchers believe studying the micro-architecture of bone offers valuable clues when it comes to predicting the onset of osteoporosis and developing better treatments.

“This study has provided the basis of important advances in our understanding of how bone weakens with aging,” says Dr. David Hanley, a professor in the Departments of Medicine, Community Health Sciences and Oncology at the University of Calgary. Dr. Hanley is also the Medical Director of Calgary's Osteoporosis and Metabolic Bone Disease Centre.

“We have also used this imaging equipment and the expertise of Dr. Boyd's group to study the bone structural and strength response to exciting new treatments of osteoporosis that are being tested in our clinical trials centre.”

(Photo: Riley Brandt)

University of Calgary
