Tuesday, March 1, 2011

Researchers Work Toward Automating Sedation in Intensive Care Units

Researchers at the Georgia Institute of Technology and the Northeast Georgia Medical Center are one step closer to their goal of automating the management of sedation in hospital intensive care units (ICUs). They have developed control algorithms that use clinical data to accurately determine a patient's level of sedation and can notify medical staff if there is a change in the level.

"ICU nurses have one of the most task-laden jobs in medicine and typically take care of multiple patients at the same time, so if we can use control system technology to automate the task of sedation, patient safety will be enhanced and drug delivery will improve in the ICU," said James Bailey, the chief medical informatics officer at the Northeast Georgia Medical Center in Gainesville, Ga. Bailey is also a certified anesthesiologist and intensive care specialist.

During a presentation at the IEEE Conference on Decision and Control, the researchers reported on their analysis of more than 15,000 clinical measurements from 366 ICU patients they classified as "agitated" or "not agitated." Agitation is an indicator of a patient's level of sedation. The algorithm returned the same results as the assessment by hospital staff 92 percent of the time.

"Manual sedation control can be tedious, imprecise, time-consuming and sometimes of poor quality, depending on the skills and judgment of the ICU nurse," said Wassim Haddad, a professor in the Georgia Tech School of Aerospace Engineering. "Ultimately, we envision an automated system in which the ICU nurse evaluates the ICU patient, enters the patient's sedation level into a controller, which then adjusts the sedative dosing regimen to maintain sedation at the desired level by continuously collecting and analyzing quantitative clinical data on the patient."

This project is supported in part by the U.S. Army. On the battlefield, military physicians sometimes face demanding critical care situations, and the use of advanced control technologies is essential for extending the capabilities of the health care system to handle large numbers of injured soldiers.

Working with Haddad and Bailey on this project are Allen Tannenbaum and Behnood Gholami. Tannenbaum holds a joint appointment as the Julian Hightower Chair in the Georgia Tech School of Electrical and Computer Engineering and the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University, while Gholami is currently a postdoctoral fellow in the Georgia Tech School of Electrical and Computer Engineering.

This research builds on Haddad and Bailey's previous work automating anesthesia in hospital operating rooms. The adaptive control algorithms they developed regulate the infusion of an anesthetic agent to maintain a desired constant depth of anesthesia during surgery. Clinical trial results to be published in the March issue of the journal IEEE Transactions on Control Systems Technology demonstrate excellent regulation of unconsciousness, allowing for safe and effective administration of an anesthetic agent.
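
The article does not spell out the control law, but the general idea of setpoint drug-infusion control can be conveyed with a minimal sketch. The proportional-integral (PI) loop below is only an illustration: the gains, units and toy patient response are invented, and the authors' published algorithms are adaptive and considerably more sophisticated.

```python
# Minimal sketch of closed-loop sedation control: a basic proportional-
# integral (PI) controller holds a measured sedation index at a setpoint.
# Gains, units and the toy patient model are invented for illustration;
# the authors' published algorithms are adaptive, not a fixed PI loop.

def pi_controller(setpoint, kp=0.8, ki=0.1):
    integral = 0.0
    def step(measurement, dt=1.0):
        nonlocal integral
        error = setpoint - measurement
        integral += error * dt
        rate = kp * error + ki * integral   # commanded infusion rate
        return max(rate, 0.0)               # a pump cannot infuse negatively
    return step

# Toy first-order "patient": the sedation index rises with infusion and
# decays as the drug is metabolized (dynamics invented for illustration).
controller = pi_controller(setpoint=3.0)
level = 0.0
for t in range(30):
    rate = controller(level)
    level += 0.5 * rate - 0.2 * level
    if t % 5 == 0:
        print(f"t={t:2d}  infusion={rate:4.2f}  sedation index={level:4.2f}")
```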

Critically ill patients in the ICU frequently require invasive monitoring and other support that can lead to anxiety, agitation and pain. Sedation is essential for the comfort and safety of these patients.

"The challenge in developing closed-loop control systems for sedating critically ill patients is finding the appropriate performance variable or variables that measure the level of sedation of a patient, in turn allowing an automated controller to provide adequate sedation without oversedation," said Gholami.

In the ICU, the researchers used information detailing each patient's facial expression, gross motor movement, response to a potentially noxious stimulus, heart rate and blood pressure stability, noncardiac sympathetic stability, and nonverbal pain scale to determine a level of sedation.

The researchers classified the clinical data for each variable into categories. For example, a patient's facial expression was categorized as "relaxed," "grimacing and moaning," or "grimacing and crying." A patient's noncardiac sympathetic stability was classified as "warm and dry skin," "flushed and sweaty," or "pale and sweaty."

They also recorded each patient's score on the motor activity and assessment scale (MAAS), which is used by clinicians to evaluate level of sedation on a scale of zero to six. In the MAAS system, a score of zero represents an "unresponsive patient," three represents a "calm and cooperative patient," and six represents a "dangerously agitated patient." The MAAS score is subjective and can result in inconsistencies and variability in sedation administration.

Using a Bayesian network, the researchers computed from the clinical data the probability that a patient was agitated. Twelve thousand measurements collected from patients admitted to the ICU at the Northeast Georgia Medical Center during a one-year period were used to train the Bayesian network, and the remaining 3,000 were used to test it.

In 18 percent of the test cases, the computer classified a patient as "agitated" but the MAAS score described the same patient as "not agitated." In five percent of the test cases, the computer classified a patient as "not agitated," whereas the MAAS score indicated "agitated." These probabilities signify an 18 percent false-positive rate and a five percent false-negative rate.
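
The articles describe the clinical inputs and the train/test evaluation but not the structure of the Bayesian network itself. The sketch below uses the simplest possible choice, a naive-Bayes model in which each clinical variable is treated as conditionally independent given the agitation state; the feature codings and training records are invented for illustration.

```python
from collections import Counter, defaultdict

# Naive-Bayes sketch of an agitation classifier over categorical clinical
# variables. The article does not specify the network structure; this is
# the simplest Bayesian network, trained on invented example data.

def train(records):
    """records: list of (features_dict, label) pairs."""
    prior = Counter(label for _, label in records)
    cond = defaultdict(Counter)              # (feature, label) -> value counts
    for feats, label in records:
        for f, v in feats.items():
            cond[(f, label)][v] += 1
    return prior, cond

def p_agitated(feats, prior, cond, alpha=1.0):
    """Posterior probability of agitation via Bayes' rule, add-one smoothing."""
    scores = {}
    for label, n in prior.items():
        score = n / sum(prior.values())
        for f, v in feats.items():
            c = cond[(f, label)]
            score *= (c[v] + alpha) / (sum(c.values()) + alpha * (len(c) + 1))
        scores[label] = score
    return scores['agitated'] / sum(scores.values())

training = [  # invented records mirroring the categories described above
    ({'face': 'relaxed',           'skin': 'warm_dry'},       'not_agitated'),
    ({'face': 'grimacing_moaning', 'skin': 'flushed_sweaty'}, 'agitated'),
    ({'face': 'grimacing_crying',  'skin': 'pale_sweaty'},    'agitated'),
    ({'face': 'relaxed',           'skin': 'flushed_sweaty'}, 'not_agitated'),
]
prior, cond = train(training)
print(p_agitated({'face': 'grimacing_moaning', 'skin': 'pale_sweaty'}, prior, cond))
```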

"This level of performance would allow a significant reduction in the workload of the intensive care unit nurse, but it would in no way replace the nurse as the ultimate judge of the adequacy of sedation," said Bailey. "However, by relieving the nurse of some of the work associated with titration of sedation, it would allow the nurse to better focus on other aspects of his or her demanding job."

The researchers' next step toward closed-loop control of sedation in the ICU will be to continuously collect clinical data from ICU patients in real time. Future work will involve the development of objective techniques for assessing ICU sedation using movement, facial expression and responsiveness to stimuli.

Digital imaging will be used to assess a patient's facial expression and also gross motor movement. In a study published in the June 2010 issue of the journal IEEE Transactions on Biomedical Engineering, the researchers showed that machine learning methods could be used to assess the level of pain in patients using facial expressions.

"We will explore the relationship between the data we can extract from these multiple sensors and the subjective clinical MAAS score," said Haddad. "We will then use the knowledge we have gained in developing feedback control algorithms for anesthesia dosage levels in the operating room to develop an expert system to automate drug dosage in the ICU."

(Photo: GIT)

Georgia Institute of Technology

Thawing permafrost likely will accelerate global warming in coming decades, says study

Up to two-thirds of Earth's permafrost likely will disappear by 2200 as a result of warming temperatures, unleashing vast quantities of carbon into the atmosphere, says a new study by the University of Colorado Boulder's Cooperative Institute for Research in Environmental Sciences.

The carbon resides in permanently frozen ground that is beginning to thaw in high latitudes from warming temperatures, which will impact not only the climate but also international strategies to reduce fossil fuel emissions, said CU-Boulder's Kevin Schaefer, lead study author. "If we want to hit a target carbon dioxide concentration, then we have to reduce fossil fuel emissions that much lower than previously thought to account for this additional carbon from the permafrost," he said. "Otherwise we will end up with a warmer Earth than we want."

The escaping carbon comes from plant material, primarily roots trapped and frozen in soil during the last glacial period that ended roughly 12,000 years ago, he said. Schaefer, a research associate at CU-Boulder's National Snow and Ice Data Center, an arm of CIRES, likened the mechanism to storing broccoli in a home freezer. "As long as it stays frozen, it stays stable for many years," he said. "But if you take it out of the freezer it will thaw out and decay."

While other studies have shown carbon has begun to leak out of permafrost in Alaska and Siberia, the study by Schaefer and his colleagues is the first to make actual estimates of future carbon release from permafrost. "This gives us a starting point, and something more solid to work from in future studies," he said. "We now have some estimated numbers and dates to work with."

The new study was published online Feb. 14 in the scientific journal Tellus. Co-authors include CIRES Fellow and Senior Research Scientist Tingjun Zhang from NSIDC, Lori Bruhwiler of the National Oceanic and Atmospheric Administration and Andrew Barrett from NSIDC. Funding for the project came from NASA, NOAA and the National Science Foundation.

Schaefer and his team ran multiple Arctic simulations assuming different rates of temperature increases to forecast how much carbon may be released globally from permafrost in the next two centuries. They estimate a release of roughly 190 billion tons of carbon, most of it in the next 100 years. The team used Intergovernmental Panel on Climate Change scenarios and land-surface models for the study.

"The amount we expect to be released by permafrost is equivalent to half of the amount of carbon released since the dawn of the Industrial Age," said Schaefer. The amount of carbon predicted for release between now and 2200 is about one-fifth of the total amount of carbon in the atmosphere today, according to the study.

While there were about 280 parts per million of CO2 in Earth's atmosphere prior to the Industrial Age beginning about 1820, there are more than 380 parts per million of CO2 in the atmosphere today, and the figure is rising. The increase resulted primarily from human activities like the burning of fossil fuels and deforestation, which have released an estimated 435 billion tons of carbon.
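
Those figures can be sanity-checked with simple arithmetic, assuming the standard conversion of roughly 2.13 billion tons of carbon per part per million of atmospheric CO2 (a factor used here for illustration, not quoted in the article):

```python
# Back-of-the-envelope check of the article's figures. The conversion factor
# (1 ppm of atmospheric CO2 ~ 2.13 billion tons of carbon) is a standard
# value assumed for this sketch, not a number quoted in the article.

GTC_PER_PPM = 2.13                     # billion tons of carbon per ppm CO2

atmosphere_now = 380 * GTC_PER_PPM     # ~809 GtC in today's atmosphere
industrial_era = 435                   # GtC released since ~1820, per article
permafrost_gtc = 190                   # GtC projected by the study

print(f"atmospheric carbon today ~ {atmosphere_now:.0f} billion tons")
# ~0.23, matching the article's "about one-fifth"
print(f"release / atmosphere: {permafrost_gtc / atmosphere_now:.2f}")
# ~0.44, matching Schaefer's "half of the amount ... since the Industrial Age"
print(f"release / industrial-era carbon: {permafrost_gtc / industrial_era:.2f}")
```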

Using data from all climate simulations, the team estimated that about 30 to 60 percent of Earth's permafrost will disappear by 2200. The study took into account all of the permanently frozen ground at high latitudes around the globe.

The consensus of the vast majority of climate scientists is that the buildup of CO2 and other greenhouse gases in Earth's atmosphere is the primary reason for increasingly warm temperatures on Earth. According to NOAA, 2010 was tied for the hottest year on record. The hottest decade on record occurred from 2000 to 2010.

Greater reductions in fossil fuel emissions to account for carbon released by the permafrost will be a daunting global challenge, Schaefer said. "The problem is getting more and more difficult all the time," he said. "It is hard enough to reduce the emissions in any case, but now we have to reduce emissions even more. We think it is important to get that message out now."

University of Colorado

Biological Anthropologists Question Claims for Human Ancestry

“Too simple” and “not so fast,” caution biological anthropologists from the George Washington University and New York University in response to recent claims about the origins of human ancestry. In the upcoming issue of the journal Nature, the anthropologists question the claims that several prominent fossil discoveries made in the last decade are our human ancestors. Instead, the authors offer a more nuanced explanation of the fossils’ place in the Tree of Life. They conclude that, instead of being our ancestors, the fossils more likely belong to extinct distant cousins.

“Don’t get me wrong, these are all important finds,” said co-author Bernard Wood, University Professor of Human Origins and professor of human evolutionary anatomy at GW and director of its Center for the Advanced Study of Hominid Paleobiology. “But to simply assume that anything found in that time range has to be a human ancestor is naïve.”

The paper, “The evolutionary context of the first hominins,” reconsiders the evolutionary relationships of fossils named Orrorin, Sahelanthropus and Ardipithecus, dating from four to seven million years ago, which have been claimed to be the earliest human ancestors. Ardipithecus, commonly known as “Ardi,” was discovered in Ethiopia and was found to be radically different from what many researchers had expected for an early human ancestor. Nonetheless, the scientists who made the discovery were adamant it is a human ancestor.

“We are not saying that these fossils are definitively not early human ancestors,” said co-author Terry Harrison, a professor in NYU’s Department of Anthropology and director of its Center for the Study of Human Origins. “But their status has been presumed rather than adequately demonstrated, and there are a number of alternative interpretations that are possible. We believe that it is just as likely or more likely that they are fossil apes situated close to the ancestry of the living great apes and humans.”

The authors are skeptical about the interpretation of the discoveries and advocate a more nuanced approach to classifying the fossils. Wood and Harrison argue that it is naïve to assume that all fossils are the ancestors of creatures alive today and also note that shared morphology or homoplasy – the same characteristics seen in species of different ancestry – was not taken into account by the scientists who found and described the fossils. For example, the authors claim that for Ardipithecus to be a human ancestor, one must assume that homoplasy does not exist in our lineage, but is common in the lineages closest to ours. The authors suggest there are a number of potential interpretations of these fossils and that being a human ancestor is by no means the simplest, or most parsimonious explanation.

The scientific community has long concluded that the human lineage diverged from that of the chimpanzee six to eight million years ago. It is easy to differentiate between the fossils of a modern-day chimpanzee and a modern human. However, it is more difficult to differentiate between the two species when examining fossils that are closer to their common ancestor, as is the case with Orrorin, Sahelanthropus, and Ardipithecus.

In their paper, Wood and Harrison caution that history has shown how uncritical reliance on a few similarities between fossil apes and humans can lead to incorrect assumptions about evolutionary relationships. They point to the case of Ramapithecus, a species of fossil ape from south Asia, which was mistakenly assumed to be an early human ancestor in the 1960s and 1970s, but later found to be a close relative of the orangutan.

Similarly, Oreopithecus bambolii, a fossil ape from Italy, shares many similarities with early human ancestors, including features of the skeleton that suggest that it may have been well adapted for walking on two legs. However, the authors observe, enough is known of its anatomy to show that it is a fossil ape that is only distantly related to humans, and that it acquired many “human-like” features in parallel.

Wood and Harrison point to the small canines in Ardipithecus and Sahelanthropus as possibly the most convincing evidence to support their status as early human ancestors. However, canine reduction was not unique to the human lineage, for it occurred independently in several lineages of fossil apes (e.g., Oreopithecus, Ouranopithecus and Gigantopithecus), presumably as a result of similar shifts in dietary behavior.
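
The parsimony logic can be made concrete with a toy character count. The sketch below applies the classic Fitch algorithm to two invented four-taxon trees; the coding ('r' for reduced canines, 'l' for large) is purely illustrative and is not the authors' data.

```python
# Toy Fitch parsimony count illustrating the homoplasy argument. The trees
# and the character coding are invented for illustration only.

def fitch(tree):
    """Return (possible_states, min_changes) for a nested-tuple binary tree."""
    if isinstance(tree, str):                  # leaf: observed state
        return {tree}, 0
    (lstates, lcost), (rstates, rcost) = (fitch(sub) for sub in tree)
    common = lstates & rstates
    if common:
        return common, lcost + rcost
    return lstates | rstates, lcost + rcost + 1   # a change is forced here

human, ardi, chimp, gorilla = 'r', 'r', 'l', 'l'  # 'r' reduced, 'l' large canines

# Hypothesis 1: Ardipithecus on the human lineage -> one canine reduction.
# Hypothesis 2: Ardipithecus elsewhere -> reduction evolves twice (homoplasy).
for name, tree in [("ancestor",  ((human, ardi), (chimp, gorilla))),
                   ("homoplasy", ((human, chimp), (ardi, gorilla)))]:
    _, changes = fitch(tree)
    print(f"{name}: minimum character changes = {changes}")
```

A raw count favors the tree that puts Ardipithecus on the human lineage (one change instead of two), which is precisely why Wood and Harrison stress that canine reduction evolved independently in other ape lineages: homoplasy can make the "most parsimonious" reading of a single character misleading.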

(Photo: ©iStockPhoto.com/wllad)

New York University

Ground-based lasers vie with satellites to map Earth’s magnetic field

Mapping the Earth’s magnetic field – to find oil, track storms or probe the planet’s interior – typically requires expensive satellites.

University of California, Berkeley, physicists have now come up with a much cheaper way to measure the Earth’s magnetic field using only a ground-based laser.

The method involves exciting sodium atoms in a layer 90 kilometers (60 miles) above the surface and measuring the light they give off.

“Normally, the laser makes the sodium atom fluoresce,” said Dmitry Budker, UC Berkeley professor of physics. “But if you modulate the laser light, when the modulation frequency matches the spin precession of the sodium atoms, the brightness of the spot changes.”

Because the local magnetic field determines the frequency at which the atoms precess, this allows someone with a ground-based laser to map the magnetic field anywhere on Earth.

Budker and three current and former members of his laboratory, as well as colleagues with the European Southern Observatory (ESO), lay out their technique in a paper appearing online this week in the journal Proceedings of the National Academy of Sciences.

Various satellites, ranging from the Geostationary Operational Environmental Satellites, or GOES, to an upcoming European mission called SWARM, carry instruments to measure the Earth’s magnetic field, providing data to companies searching for oil or minerals, climatologists tracking currents in the atmosphere and oceans, geophysicists studying the planet’s interior and scientists tracking space weather.

Ground-based measurements, however, can avoid several problems associated with satellites, Budker said. Because these spacecraft are moving at high speed, it’s not always possible to tell whether a fluctuation in the magnetic field strength is real or a result of the spacecraft having moved to a new location. Also, metal and electronic instruments aboard the craft can affect magnetic field measurements.

“A ground-based remote sensing system allows you to measure when and where you want and avoids problems of spatial and temporal dependence caused by satellite movement,” he said. “Initially, this is going to be competitive with the best satellite measurements, but it could be improved drastically.”

The idea was sparked by a discussion Budker had with a colleague about the lasers used by many modern telescopes to remove the twinkle from stars caused by atmospheric disturbance. That technique, called laser guide star adaptive optics, employs lasers to excite sodium atoms deposited in the upper atmosphere by meteorites. Once excited, the atoms fluoresce, emitting light that mimics a real star. Telescopes with such a laser guide star, including the Very Large Telescope in Chile and the Keck telescopes in Hawaii, adjust their “rubber mirrors” to cancel the laser guide star’s jiggle, and thus remove the jiggle for all nearby stars.

It is well known that these sodium atoms are affected by the Earth’s magnetic field. Budker, who specializes in extremely precise magnetic-field measurements, realized that you could easily determine the local magnetic field by exciting the atoms with a pulsed or modulated laser of the type used in guide stars. The method is based on the fact that the electron spin of each sodium atom precesses like a top in the presence of a magnetic field. Hitting the atom with light pulses at just the right frequency will cause the electrons to flip, affecting the way the atoms interact with light.
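
For a sense of the numbers involved (assuming the textbook gyromagnetic ratio of about 7 Hz per nanotesla for sodium's ground state, a value not quoted in the article), the precession frequency in the Earth's field lands in the few-hundred-kilohertz range:

```python
# Rough scale estimate: Larmor precession frequency of ground-state sodium.
# The gyromagnetic ratio (~7 Hz/nT for the F=2 ground state) is standard
# atomic physics, assumed here rather than taken from the article.

GAMMA_HZ_PER_NT = 7.0                  # approx. gyromagnetic ratio, Hz per nT

def larmor_frequency_hz(b_field_nt):
    """Spin-precession frequency f = gamma * B for a field given in nT."""
    return GAMMA_HZ_PER_NT * b_field_nt

# Earth's surface field ranges from roughly 25,000 to 65,000 nT.
for b_nt in (25_000, 50_000, 65_000):
    print(f"B = {b_nt:>6} nT  ->  f ~ {larmor_frequency_hz(b_nt) / 1e3:.0f} kHz")
```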

“It suddenly struck me that what we do in my lab with atomic magnetometers we can do with atoms freely floating in the sky,” he said.

Budker’s former post-doctoral fellow James Higbie, now an assistant professor of physics and astronomy at Bucknell University, conducted laboratory measurements and computer simulations confirming that the effects of a modulated laser could be detected from the ground by a small telescope. He was assisted by Simon M. Rochester, who received his Ph.D. in physics from UC Berkeley last year under Budker’s direction and now runs a start-up consulting company, Rochester Scientific, and by current post-doctoral fellow Brian Patton.

In practice, a 20- to 50-watt laser (small enough to load on a truck or boat) tuned to the orange sodium line (589-nanometer wavelength) would shine polarized light into the 10-kilometer-thick (approximately six-mile) sodium layer in the mesosphere, about 90 kilometers overhead. The frequency at which the laser light is modulated or pulsed would then be varied slightly around the spin-precession frequency to stimulate a spin flip.

The decrease or increase in brightness when the modulation is tuned to a “sweet spot” determined by the magnitude of the magnetic field could be as much as 10 percent of the typical fluorescence, Budker said. The spot itself would be too faint to see with the naked eye, but the brightness change could easily be measured by a small telescope.
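
A hedged sketch of how such a measurement could be read out: synthesize a fluorescence signal whose brightness changes by about 10 percent on resonance, sweep the modulation frequency, and invert f = γB at the peak to recover the field. The resonance width, noise level and "true" field below are all invented.

```python
import numpy as np

# Illustrative resonance sweep only: find the modulation frequency at which
# the simulated fluorescence peaks, then invert f = gamma * B. The field,
# linewidth and noise are invented; polarization geometry is ignored.

GAMMA_HZ_PER_NT = 7.0                   # approx. sodium gyromagnetic ratio
B_TRUE_NT = 48_000                      # hypothetical local field (~336 kHz)
f0 = GAMMA_HZ_PER_NT * B_TRUE_NT
width = 1_000.0                         # assumed resonance half-width, Hz

freqs = np.linspace(f0 - 20e3, f0 + 20e3, 2001)
rng = np.random.default_rng(0)
# Lorentzian brightness change of ~10% on resonance, plus detection noise.
brightness = 1.0 + 0.10 * width**2 / ((freqs - f0)**2 + width**2)
brightness += rng.normal(0.0, 0.005, freqs.size)

f_peak = freqs[np.argmax(brightness)]
print(f"recovered B ~ {f_peak / GAMMA_HZ_PER_NT:,.0f} nT (true: {B_TRUE_NT:,} nT)")
```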

“This is such a simple idea, I thought somebody must have thought of it before,” Budker said.

He was right. William Happer, a physicist who pioneered spin-polarized spectroscopy and the sodium laser guide stars, had thought of the idea, but had never published it.

“I was very, very happy to hear that, because I felt there may be a flaw in the idea, or that it had already been published,” Budker said.

(Photo: Budker lab)

University of California, Berkeley

Yale Researchers Find Clues to Mystery of Preterm Delivery

Researchers at Yale School of Medicine have found that excessive formation of calcium crystal deposits in the amniotic fluid may be a reason why some pregnant women suffer preterm premature rupture of the membranes (PPROM) leading to preterm delivery.

This is a key breakthrough in solving the mystery of preterm birth, a leading cause of death and permanent disability in newborns.

Researchers know that infection, maternal stress and placental bleeding can trigger some preterm deliveries, but the cause of many other preterm deliveries remains unknown. In these cases, women experience early contractions, cervical dilation and a torn amniotic sac.

A team of researchers in the Department of Obstetrics, Gynecology & Reproductive Sciences at Yale, including first author Lydia Shook and her mentor Irina Buhimschi, M.D., investigated the idea that calcification (excessive buildup of calcium) of the fetal membranes may lead to PPROM and preterm birth. "We noticed that in many women, analysis of the proteins in their amniotic fluid did not show signs of inflammation, and we could not find any cause for their preterm birth," said Shook, a Yale medical student. "We took a fresh look at what was causing breakdown of the membranes, which can lead to lost elasticity, integrity and eventually rupture."

Scientists know that calcifying nanoparticles are involved in many degenerative conditions including arthritis and atherosclerosis. "These mineral-protein complexes can disrupt normal cellular processes and cause cell death," Shook said. "We wondered whether they could also be responsible for damage to the fetal membranes in pregnant women."

Shook and her co-authors used a stain to look for calcium deposits in placental and fetal membrane tissue from patients with PPROM and preterm birth, as well as full-term deliveries. They used a sterile culture technique to determine whether amniotic fluid can form nanoparticles. They then exposed fetal membranes to the cultured nanoparticles to determine their ability to induce cell dysfunction, damage and cell death.

The team found evidence of calcification of fetal membranes collected from preterm deliveries. Fetuin, one of the major proteins involved in nanoparticle formation, was found in these deposits. Levels of fetuin in amniotic fluid were lower in women who delivered with PPROM compared to those who delivered early with intact membranes.

"This preliminary evidence suggests that amniotic fluid has the potential to form nanoparticles and deposit them in the fetal membranes," said Shook. "Low fetuin may be a biomarker for women at risk of PPROM. The goal of this research is to identify women at risk of developing this condition early in their pregnancy and to intervene with targeted therapy."

(Photo: Yale U.)

Yale University

Lie Detection: Misconceptions, Pitfalls, and Opportunities for Improvement

Unlike Pinocchio, liars do not usually give telltale signs that they are being dishonest. In lieu of a growing nose, is there a way to distinguish people who are telling the truth from those who aren’t? A new report in Psychological Science in the Public Interest, a journal of the Association for Psychological Science, discusses some of the common misconceptions about those proficient in the art of deception, reviews the shortcomings of commonly used lie-detection techniques, and presents new empirically supported methods for telling liars from truth-tellers with greater accuracy.

Trapping a liar is not always easy. Lies are often embedded in truths and behavioral differences between liars and truth-tellers are usually very small. In addition, some people are just very good at lying. Lie detectors routinely make the common mistakes of overemphasizing nonverbal cues, neglecting intrapersonal variations (i.e., how a person acts when they are telling the truth versus when they are lying), and being overly confident in their lie-detection skills.

In this report, Aldert Vrij of the University of Portsmouth, Anders Granhag of the University of Gothenburg, and Stephen Porter of the University of British Columbia review research suggesting that verbal methods of deception detection are more useful than nonverbal methods commonly believed to be effective, and that there are psychological differences between liars and truth-tellers that can be exploited in the search for the truth.

In an information-gathering interview, suspects are asked to give detailed statements about their activities through open questions—for example, “What did you do yesterday between 3 p.m. and 4 p.m.?” This interview style encourages suspects to talk and creates opportunities to identify inconsistencies between the answer and available evidence. Asking very specific questions that a suspect is unlikely to anticipate may also help in lie detection.

Lying can be more cognitively demanding than truth-telling—it requires more brain power to come up with a lie and keep track of it (e.g., who was told what) than it does to tell the truth. Imposing cognitive load on interviewees by asking them to recall the events in reverse order may also be useful in weeding out liars from those telling the truth.

This research has important implications in a variety of settings, including the courtroom, police interviews, and the screening of individuals with criminal intent, such as potential terrorists.

Association for Psychological Science
