Wednesday, February 24, 2010

STUDENTS FIND LOST OFFICE GEAR WITH TINY SENSORS


Miniature sensors being developed by CSIRO promise to provide the answers to questions that seem to arise regularly in modern office workplaces, such as “Where’s my pen?” and “Who nicked my stapler?”

CSIRO is developing FLECK™ Nano – a miniature version of the highly successful FLECK sensor nodes that independently record environmental conditions then cooperate with each other to wirelessly send their data to a collection point.

Two students working with CSIRO's vacation scholarship scheme have been applying their research skills to bringing FLECK Nanos indoors. Doing so means things like temperature and power use can be monitored at a very refined level and small objects can be tracked unobtrusively.

“The idea of pervasive computing has been touted for some time, but is not yet available for everyday office items,” CSIRO ICT Centre researcher, Phil Valencia, says.

“We’re aiming to enable a level of ubiquitous sensing that hasn’t been experienced yet and see how it impacts on day-to-day office activities.”

The two students spent their summer holidays working alongside Mr Valencia.

A software engineering student at the Australian National University, David Kooymans, is working on reducing the energy demands of mobile FLECK Nanos.

“They communicate with a node in a known location using radio waves,” Mr Kooymans says.

“The more frequently location information is updated the more useful the other data becomes, but the transmitters consume a high proportion of energy so there’s a trade-off to be negotiated there.”
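
As a rough illustration of that trade-off, the sketch below estimates how battery life shrinks as location updates become more frequent. All figures are assumptions invented for demonstration, not FLECK Nano specifications.

    # Illustrative sketch of the update-rate vs. battery-life trade-off.
    # All figures are assumptions for demonstration, not FLECK Nano specs.
    BATTERY_MWH = 1000.0   # assumed battery capacity in milliwatt-hours
    SLEEP_MW = 0.01        # assumed sleep-mode draw in milliwatts
    TX_MW = 30.0           # assumed radio draw while transmitting, milliwatts
    TX_SECONDS = 0.05      # assumed duration of one location update

    def battery_life_days(updates_per_hour):
        """Estimate battery life for a given location-update rate."""
        tx_fraction = updates_per_hour * TX_SECONDS / 3600.0
        average_mw = SLEEP_MW + TX_MW * tx_fraction
        return BATTERY_MWH / average_mw / 24.0

    for rate in (1, 60, 600, 3600):  # updates per hour
        print("%5d updates/h -> %8.1f days" % (rate, battery_life_days(rate)))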

An electrical engineering student at the University of Queensland, Blake Newman, is looking for ways FLECK Nanos could ‘scavenge’ energy from the environment.

“You don’t want to be changing batteries in thousands of little devices so we are designing energy scavenging circuitry that will make power from whatever source it can,” Mr Valencia says.

“If a device doesn’t need much power, it’s amazing how much energy is all around just waiting to be tapped. For example, a FLECK Nano attached to a stapler on a desk in a windowless office is able to function if there is enough light to work by.”
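
To give a sense of the numbers involved, here is a back-of-envelope estimate of the power a small photovoltaic cell could scavenge from ordinary office lighting. The figures are generic assumptions, not measurements of the CSIRO circuitry.

    # Rough estimate of power scavenged from indoor light.
    # All figures are generic assumptions, not CSIRO measurements.
    ILLUMINANCE_LUX = 300.0        # typical office lighting level
    LUX_TO_W_PER_M2 = 1.0 / 120.0  # very rough conversion for white light
    CELL_AREA_M2 = 4e-4            # a 2 cm x 2 cm photovoltaic cell
    CELL_EFFICIENCY = 0.05         # assumed low-light indoor PV efficiency

    watts = ILLUMINANCE_LUX * LUX_TO_W_PER_M2 * CELL_AREA_M2 * CELL_EFFICIENCY
    print("Harvested power: about %.0f microwatts" % (watts * 1e6))
    # ~50 uW: tiny, but workable for a node that sleeps most of the time.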

(Photo: Samuel Klistorner, CSIRO)

The Commonwealth Scientific and Industrial Research Organisation

CLOTHING SOLUTION FOR CHILLY OPERATING ROOM ENVIRONMENT


Hugging heated IV bags, layering undergarments and wrapping themselves in blankets - Barry Finegan and his co-workers do what they can to get warm before heading into the surgical theatre.

Hospital operating rooms are traditionally kept chilly, well below standard room temperature, for the comfort of surgeons sweating under the warm lights needed for their work. But that can leave others on the surgical team literally out in the cold.

Such is the plight for support staff in operating rooms everywhere, and Finegan is hoping bright minds at the University of Alberta can solve the dilemma.

Finegan, a professor in the U of A's Department of Anesthesiology and Pain Medicine and an OR team member at the University of Alberta Hospital, was tired of seeing his colleagues shivering as they cared for patients undergoing complex and often lengthy surgeries, and wanted to do something about it. He didn't have far to go.

"I thought, 'we have the expertise here at the university in clothing materials and design. We must be able to improve the technology we are using in our OR garments.'"

His quest turned out to be a great fit for the Department of Human Ecology in the Faculty of Agricultural, Life and Environmental Sciences. Under the guidance of human ecology professors Megan Strickfaden and Rachel McQueen, third-year students Annette Aslund and Alex Pedden are working with master's student Yin Xu to design a garment that will keep OR team members like Finegan more comfortable during surgery.

The trio is working on an improved design for a warm-up jacket that members of surgical teams can wear in the operating room.

"We have a challenging environment in the OR, in that we've got to keep the patient warm but at the same time, we have to make the environment appropriate for the people who work there," Finegan said.

While surgeons and other staff directly involved in an operation are kept warm by their scrubs and the lights over the table, those on the periphery are vulnerable to the cool temperatures that help keep the surgeons comfortable and focused.

Anesthesiologists like Finegan can sit through surgeries that last up to 12 hours while they continuously monitor their patients, which often also means sitting under a vent that pushes cool air into the room. And nurses have to remove their current warm-up jackets to avoid contamination when preparing patients for surgery.

"The people who aren't patients or directly involved in the surgical process are in an uncomfortably cold environment," Finegan noted. And because the room must be kept sterile, homegrown solutions like sweaters or other garments aren't viable.

Aslund, Pedden and Xu conducted field research at the U of A hospital, monitoring room temperature and humidity in the surgical suites, observing medical teams at work and collecting textile samples from existing operating room garb. They also videotaped and photographed the donning and removal of the garments and, using body diagrams, had the staff indicate where they felt hot or cold.

High on the wish list for both groups were garments that were sterile, professional-looking, thermally functional, washable and well-fitting, and that would be approved for use by Alberta Health Services. Nurses wanted less baggy sleeves, anesthesiologists wanted vests and both groups wanted multiple pockets.

Using their data, the students drew up a prototype design that will be the subject of a pilot study. Securing grants to continue the research is next, Strickfaden said. "We need to do more textile analysis and build on the early concepts we have for a garment design."

The project not only provided an opportunity for Strickfaden and McQueen to collaborate within their department, but also gave their team a chance to view a problem holistically and work directly with those affected, the ideal approach in human ecology. It was an "eye-opener," especially for the undergraduate students, Strickfaden added.

"The OR was a complex environment and they really got to experience a research situation, interviewing people and getting out in the field."

Aslund is studying for a science degree in clothing and textiles, with a minor in product development. She got "a real sense of satisfaction" from her field research, which involved interviewing members of the OR team. "It felt more university-like than sitting in a classroom. And we just scratched the surface of it. I would have liked to go on to the textile testing. But I did gain some knowledge about taking a holistic approach to solving a problem."

She believes her early foray into field work will serve her well, no matter what her future career brings. "If I have to do a focus group in marketing or business I'll be able to talk to a client or conduct different kinds of research."

The experience has been just as rewarding for Finegan, who looks forward to the final outcome.

"I was impressed by the detailed approach the students and the department took to assessing our environment. For me as a medical researcher, it's always illuminating to work with those in other disciplines and realize the strength of cross-disciplinary research. We forget sometimes the importance of human ecology in ensuring that the environment in which we work is optimal. Obviously for us, temperature issues are potential distractions."

(Photo: U. Alberta)

University of Alberta

SOME MORBIDLY OBESE PEOPLE ARE MISSING GENES


A small but significant proportion of morbidly obese people are missing a section of their DNA, according to research published today in Nature. The authors of the study, from Imperial College London and ten other European centres, say that missing DNA such as that identified in this research may be having a dramatic effect on some people's weight.

According to the new findings, around seven in every thousand morbidly obese people are missing a part of their DNA, containing approximately 30 genes. The researchers did not find this kind of genetic variation in any normal weight people.

There are an estimated 700,000 morbidly obese people in England, with a Body Mass Index (BMI) of over 40. Researchers believe that the weight problems of around one in twenty morbidly obese people are due to known genetic variations, including mutations and missing DNA. Many more similar obesity-causing mutations, such as the one in this study, remain to be found, says the team.
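
For reference, BMI is body weight in kilograms divided by the square of height in metres. A minimal example, with weights and heights invented for illustration:

    # BMI = weight (kg) / height (m) squared
    def bmi(weight_kg, height_m):
        return weight_kg / height_m ** 2

    print(round(bmi(70, 1.75), 1))   # 22.9: inside the 18.5-25 normal range
    print(round(bmi(125, 1.75), 1))  # 40.8: above 40, i.e. morbidly obese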

Previous research had identified several genetic variations that contribute to obesity, most of which are single mutations in a person's DNA that change the function of a gene. Today's research is the first to clearly demonstrate that obesity in otherwise physically healthy individuals can be caused by a rare genetic variation in which a section of a person's DNA is missing. The researchers do not yet know the function of the missing genes, but previous research has suggested that some of them may be associated with delayed development, autism and schizophrenia.

People inherit two copies of their DNA, one from their mother and one from their father. Sometimes, missing one copy of one or several genes - as in the individuals identified in this latest study - can have a drastic effect on the body.

The researchers believe there may be other genetic deletions, in addition to those identified today, that increase a person's risk of becoming obese. They hope that by identifying genetic variations causing people to be extremely obese, they can develop genetic tests to help determine the best course of treatment for these individuals.

Professor Philippe Froguel, lead author of the study from the School of Public Health at Imperial College London, said: "Although the recent rise in obesity in the developed world is down to an unhealthy environment, with an abundance of unhealthy food and many people taking very little exercise, the difference in the way people respond to this environment is often genetic. It is becoming increasingly clear that for some morbidly obese people, their weight gain has an underlying genetic cause. If we can identify these individuals through genetic testing, we can then offer them appropriate support and medical interventions, such as the option of weight loss surgery, to improve their long-term health."

The Imperial team first identified the missing genes in teenagers and adults who had learning difficulties or delayed development. They found 31 people who had nearly identical 'deletions' in one copy of their DNA. All of the adults with this genetic change had a BMI of over 30, which means they were obese.

The researchers then went on to study the genomes of 16,053 people who were either obese or normal weight (with a BMI between 18.5 and 25), from eight European cohorts. They identified 19 more people with the same genetic deletion, all of whom were severely obese, but did not find the deletion in any healthy normal weight people. This means the genetic deletion was found in seven in every 1,000 morbidly obese people, making it the second most frequent known genetic cause of obesity.

People with the deletion tended to be normal weight toddlers, becoming overweight during childhood and then severely obese as adults. The researchers also looked at the genomes of their parents, and found that 11 people inherited the deletion from their mother and four from their father, with ten of the deletions occurring by chance. All the parents with the deletion were also obese.

The next step in this research will be to determine the function of the missing genes. Previous studies have suggested that some of the genes may be associated with delayed development, autism and schizophrenia, so the researchers also plan to investigate the possible links between these conditions and obesity.

According to first author Dr Robin Walters, from the School of Public Health at Imperial College London, there are likely to be many more variations like the deletion identified in this study that remain to be found. He said: "Although individually rare, the combined effect of several variations of this type could explain much of the genetic risk for severe obesity, which is known to run in families. Previously identified genetic influences on weight gain have a much less drastic effect - increasing weight by just one or two pounds, for example. By looking at groups of people with severe obesity, we may be more likely to find these rare genetic variations."

Professor Froguel added: "The method used in the study could also help find novel genetic variations that affect the risk of other conditions. We identified this variant by first studying very obese individuals, and then homing in on the region of interest in larger, less severely affected groups. This powerful approach could be used to identify genetic influences on other diseases that are poorly understood at present, such as Type 2 diabetes."

(Photo: ICL)

Imperial College London

RECORD-BREAKING COLLISIONS


In December, the Large Hadron Collider, the world’s largest particle accelerator, shattered the world record for highest energy particle collisions.

Recently, a team led by researchers from MIT, CERN and the KFKI Research Institute for Particle and Nuclear Physics in Budapest, Hungary, completed work on the first scientific paper analyzing the results of those collisions. Its findings show that the collisions produced an unexpectedly high number of particles called mesons — a factor that will have to be taken into account when physicists start looking for rarer particles and for the theorized Higgs boson.

“This is the very first step in a long road to performing extremely sensitive analyses that can detect particles produced only in one in a billion collisions,” says Gunther Roland, MIT associate professor of physics and an author of the new paper.

Roland and MIT professors Wit Busza and Boleslaw Wyslouch, who are members of the CMS (compact muon solenoid) collaboration, were among the study leaders. The CMS collaboration runs one of four detectors at the collider.

The Large Hadron Collider (LHC), located underground near Geneva, Switzerland, started its latest run in late November. On Dec. 8, proton beams travelling around the 17-mile ring collided at a peak energy of 2.36 tera electron volts (TeV), breaking the previous record of 1.96 TeV achieved at the Fermi National Accelerator Lab. Because of Einstein’s equation E=mc², which relates mass and energy, higher energy levels should produce heavier particles — possibly including some never seen before.
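
As a back-of-envelope sketch of what E=mc² implies here, the record 2.36 TeV of collision energy corresponds to a concrete, if minuscule, amount of mass. The calculation below uses standard physical constants; the comparison to proton masses is approximate.

    # Mass equivalent of the record 2.36 TeV collision energy via E = mc^2.
    E_TEV = 2.36
    EV_TO_JOULES = 1.602176634e-19  # one electron volt in joules
    C = 299792458.0                 # speed of light in m/s

    energy_joules = E_TEV * 1e12 * EV_TO_JOULES
    mass_kg = energy_joules / C ** 2
    print("%.2e kg" % mass_kg)  # ~4.2e-24 kg, roughly 2,500 proton masses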

In the new paper, submitted to the Journal of High Energy Physics by CMS, the physicists analyzed the number of particles produced in the aftermath of the high-energy collisions. When protons collide, their energy is predominantly transformed into particles called mesons — specifically, two types of mesons known as pions and kaons.

To their surprise, the researchers found that the number of those particles increased faster with collision energy than was predicted by their models, which were based on results of lower-energy collisions.

Taking the new findings into account, the team is now tuning its predictions of how many of those mesons will be found in even higher energy collisions. When those experiments are conducted, it will be critical to know how many such particles to expect so that they can be distinguished from rarer ones.

“If we’re looking for rare particles later on, these mesons will be in the background,” says Roland. “These results show us that our expectations were not completely wrong, but we have to modify things a bit.”

Using the Large Hadron Collider, physicists hope to eventually detect the Higgs boson, a particle that is theorized to give all other particles their mass, as well as evidence for other physical phenomena such as supersymmetry, extra dimensions of space and the creation of a new form of matter called quark-gluon plasma (QGP). The new data provide an important reference point for when CMS looks for signatures of QGP creation in collisions of lead ions at the LHC later this year.

The CMS team, which includes more than 2,000 scientists around the world, has 45 members (including faculty, students and research scientists) from the MIT Laboratory for Nuclear Science’s Particle Physics Collaboration and heavy-ion research groups.

The Large Hadron Collider is capable of creating collisions up to 14 TeV, but scientists are gradually easing the machine up to that level to try to avoid safety issues that have arisen in the past. In September 2008, the collider had to be shut down for several months after a connector joining two of the collider’s magnets failed, causing an explosion and leakage of the liquid helium that cools the magnets.

During the collider’s next run in March, researchers hope to create collisions of 7 TeV, says Roland. The success of the latest effort “makes us extremely optimistic about the detector,” he says. “It performed beautifully during the run.”

(Photo: CERN)

MIT

PCS AROUND THE WORLD UNITE TO MAP THE MILKY WAY


At this very moment, tens of thousands of home computers around the world are quietly working together to solve the largest and most basic mysteries of our galaxy.

Enthusiastic and inquisitive volunteers from Africa to Australia are donating the computing power of everything from decade-old desktops to sleek new netbooks to help computer scientists and astronomers at Rensselaer Polytechnic Institute map the shape of our Milky Way galaxy. Just this month, the collected computing power of these humble home computers surpassed one petaflop, a speed exceeding that of the world’s second fastest supercomputer.

The project, MilkyWay@Home, uses the Berkeley Open Infrastructure for Network Computing (BOINC) platform, which is widely known for the SETI@home project used to search for signs of extraterrestrial life. Today, MilkyWay@Home has outgrown even this famous project in terms of speed, making it the fastest computing project on the BOINC platform and perhaps the second fastest public distributed computing program ever in operation (just behind Folding@home).

The interdisciplinary team behind MilkyWay@Home, which ranges from professors to undergraduates, began formal development under the BOINC platform in July 2006 and worked tirelessly to build a volunteer base from the ground up and grow the project’s computational power.

Each user participating in the project signs up their computer and offers up a percentage of the machine’s operating power that will be dedicated to calculations related to the project. For the MilkyWay@Home project, this means that each personal computer is using data gathered about a very small section of the galaxy to map its shape, density, and movement.

In particular, computers donating processing power to MilkyWay@Home are looking at how the different dwarf galaxies that make up the larger Milky Way galaxy have been moved and stretched following their merger with the larger galaxy millions of years ago. This is done by studying each dwarf’s stellar stream. Their calculations are providing new details on the overall shape and density of dark matter in the Milky Way galaxy, which is largely unknown.

The galactic computing project had very humble beginnings, according to Heidi Newberg, associate professor of physics, applied physics, and astronomy at Rensselaer. Her personal research to map the three-dimensional distribution of stars and matter in the Milky Way using data from the extensive Sloan Digital Sky Survey could not find the best model to map even a small section of a single galactic star stream in any reasonable amount of time.

“I was a researcher sitting in my office with a very big computational problem to solve and very little personal computational power or time at my fingertips,” Newberg said. “Working with the MilkyWay@Home platform, I now have the opportunity to use a massive computational resource that I simply could not have as a single faculty researcher, working on a single research problem.”

Before taking the research to BOINC, Newberg worked with Malik Magdon-Ismail, associate professor of computer science, to create a stronger and faster algorithm for her project. Together they greatly increased the computational efficiency and set the groundwork for what would become the much larger MilkyWay@Home project.

“Scientists always need additional computing power,” Newberg said. “The massive amounts of data out there make it so that no amount of computational power is ever enough.” Thus, her work quickly exceeded the limits of laboratory computers and the collaboration to create MilkyWay@Home formally began in 2006 with the assistance of the Claire and Roland Schmitt Distinguished Professor of Computer Science Boleslaw Szymanski; Associate Professor of Computer Science Carlos Varela; postdoctoral research assistant Travis Desell; as well as other graduate and undergraduate students at Rensselaer.

This extensive collaboration has advanced the astrophysical goals of the project by leaps and bounds, but important discoveries have also been made along the way in computational science, in the form of algorithms that make the extremely distributed and diverse MilkyWay@Home system work well even with volunteered computers that can be highly unreliable.

“When you use a supercomputer, all the processors are the same and in the same location, so they are producing the same results at the same time,” Varela said. “With an extremely distributed system, like we have with MilkyWay@Home, we are working with many different operating systems that are located all over the globe. To work with such asynchronous results we developed entirely new algorithms to process work as it arrives in the system.” This makes data from even the slowest of computers still useful to the project, according to Varela. “Even the slowest computer can help if it is working on the correct problem in the search.”
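
The sketch below illustrates that idea in miniature. It is an invented toy search, not the actual MilkyWay@Home algorithm: candidates are handed out, and whatever results come back, in whatever order, are folded into a shared population, so even slow machines contribute.

    # Toy sketch of asynchronous result processing: results are merged
    # into the search whenever they arrive, in any order. Invented for
    # illustration; not the actual MilkyWay@Home algorithm.
    import random

    POP_SIZE = 10
    population = [(random.uniform(-5, 5), float("inf")) for _ in range(POP_SIZE)]

    def fitness(x):
        """Stand-in objective; the real project fits galactic stream models."""
        return (x - 2.0) ** 2

    def issue_work():
        """Hand a client a candidate mutated from the current population."""
        parent, _ = random.choice(population)
        return parent + random.gauss(0, 0.5)

    def report_result(candidate, score):
        """Merge a returned result, replacing the worst member if better."""
        worst = max(range(POP_SIZE), key=lambda i: population[i][1])
        if score < population[worst][1]:
            population[worst] = (candidate, score)

    pending = [issue_work() for _ in range(200)]
    random.shuffle(pending)  # arrival order is arbitrary
    for candidate in pending:
        report_result(candidate, fitness(candidate))

    print(min(population, key=lambda p: p[1]))  # best (candidate, score) found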

In total, nine articles have been published and multiple public talks have been given regarding the computer science discoveries made during the creation of the project, and many more are expected as the refined algorithms are applied to other scientific problems. Collaboration has already begun to develop a DNA@Home platform to find gene regulation sites on human DNA. Collaborations have also started with biophysicists and chemists on two other BOINC projects at Rensselaer to understand protein folding and to design new drugs and materials.

In addition to important discoveries in computer science and astronomy, the researchers said the project is also making important strides in efforts to include the public in scientific discovery. Since the project began, more than 45,000 individual users from 169 countries have donated computational power to the effort. Currently, approximately 17,000 users are active in the system.

“This is truly public science,” said Desell, who began working on the project as a graduate student and has seen the project through its entire evolution. “This is a really unique opportunity to get people interested in science while also allowing us to create a strong computing resource for Rensselaer research.” All of the research, results, data, and even source code are made public and regularly updated for volunteers on the main MilkyWay@Home Web site found at: http://MilkyWay.cs.rpi.edu/.

Desell cites the public nature and regular communication as important components of the project’s success. “They are not just sitting back and allowing the computer to do the work,” he says, referencing that volunteers have made donations for equipment as well as made their own improvements to the underlying algorithms that greatly increased computational speed. Varela jokes, “We may end up with a paper with 17,000 authors.”

(Photo: Sloan Digital Sky Survey)

Rensselaer Polytechnic Institute

LEAF VEINS INSPIRE A NEW MODEL FOR DISTRIBUTION NETWORKS

A team of biophysicists at Rockefeller University developed a mathematical model showing that complex sets of interconnecting loops — like the netted veins that transport water in a leaf — provide the best distribution network for supplying fluctuating loads to varying parts of the system. It also shows that such a network can best handle damage. The findings could change the way engineers think about designing networks to handle a variety of challenges like the distribution of water or electricity in a city.

Operations researchers have long believed that the best distribution networks for many scenarios look like trees, with a succession of branches stemming from a central stalk and then branches from those branches and so on, to the desired destinations. But this kind of network is vulnerable: If it is severed at any place, the network is cut in two and cargo will fail to reach any point “downstream” of the break.

By contrast, in the leaves of most complex plants, evolution has devised a system to distribute water that is more supple in at least two key ways. Plants are under constant attack from bugs, diseases, animals and the weather. If a leaf's distribution network were tree-like and damaged, the part of the leaf downstream of the damage would starve for water and die. In some of the Earth's more ancient plants, such as the ginkgo, this is the case. But many younger, more sophisticated plants have evolved a vein system of interconnected loops that can reroute water around any damage, providing many paths to any given point, as in the lemon leaf. Operations researchers have appreciated that these redundancies are an effective hedge against damage. What's most surprising in the new research, according to Marcelo O. Magnasco, head of the Laboratory of Mathematical Physics at Rockefeller University, is that the complex network also does a better job of handling fluctuating loads according to shifts in demand from different parts of the system — a common real-world need within dynamic distribution networks.

“For decades, people have believed that the tree-like network was optimal for fluctuating demand,” Magnasco says. “These findings could seriously shake things up. People will have to take another look at how they design these kinds of systems.”

In a paper published as the cover story of the January 29 Physical Review Letters, Magnasco, lead researcher Eleni Katifori, a fellow at Rockefeller’s Center for Studies in Physics and Biology, and colleagues lay out a model that assigns a cost to each section of leaf vein proportional to how much water it can carry. They looked for networks that suffered the least strain in the face of two challenges common in both leaves and human-built networks: damage to a randomly chosen segment of the network and changes in the load demanded by different parts of the network. In both scenarios, they found the most robust system was a complex, hierarchical network of nested loops, similar to the fractal-like web of veins that transport water in leaves. This loopy network design is also found in the blood vessels of the retina, the architecture of some corals and the structural veins of insect wings.
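
A toy version of that contrast can be written in a few lines. The sketch below is a simplification of the authors' flow-based cost model; here "robust" just means the network stays connected when a single link is cut.

    # Compare a branching (tree) network with a looped one: cut each
    # link in turn and test whether every node is still reachable from
    # the source. A simplification of the authors' flow-based model.
    from collections import deque

    def stays_connected(nodes, edges, source=0):
        adjacency = {n: [] for n in nodes}
        for a, b in edges:
            adjacency[a].append(b)
            adjacency[b].append(a)
        seen, queue = {source}, deque([source])
        while queue:
            for neighbour in adjacency[queue.popleft()]:
                if neighbour not in seen:
                    seen.add(neighbour)
                    queue.append(neighbour)
        return len(seen) == len(nodes)

    nodes = range(7)
    tree = [(0, 1), (0, 2), (1, 3), (1, 4), (2, 5), (2, 6)]  # no loops
    looped = tree + [(3, 4), (4, 5), (5, 6)]                 # cross-links added

    for name, edges in (("tree", tree), ("looped", looped)):
        survivals = sum(
            stays_connected(nodes, edges[:i] + edges[i + 1:])
            for i in range(len(edges))
        )
        print("%s: connected after %d of %d single cuts"
              % (name, survivals, len(edges)))

The tree fails every one of its six possible cuts, while the looped version survives all nine of its cuts, the same redundancy the lemon leaf exploits.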

Katifori is now extending the research to delve more deeply into how distribution networks handle fluctuating loads, guided by nature’s own solution in the leaf.

“It is tempting to ignore the loops, because the central veins stand out and have a tree-like form,” Katifori says. “But they are all connected, and the loops are right there to see, if you just look at the leaf.”

Rockefeller University
