Thursday, July 2, 2009



Duke University engineers have taken a first step toward a minimally invasive treatment of brain tumors by combining chemotherapy with heat administered from the end of a catheter.

The proof-of-concept study demonstrated that it should be technically possible to treat brain tumors without the side effects associated with the traditional approaches of surgery, systemic chemotherapy or radiation.

The bioengineers designed and built an ultrasound catheter that can fit into large blood vessels of the brain and perform two essential functions: provide real-time moving 3-D images and generate localized temperature increases. The researchers envision using this system in conjunction with chemotherapy drugs encased in heat-sensitive lipid capsules called liposomes.

“Physicians would inject drug-carrying liposomes into a patient’s bloodstream, and then insert a catheter via a blood vessel to the site of the brain tumor,” said Carl Herickhoff, fourth-year graduate student at Duke’s Pratt School of Engineering and first author of a paper appearing in the journal Ultrasonic Imaging. “The catheter would use ultrasound to first image the tumor, and then direct a higher-power beam to generate heat at the site, melting the liposome shells and releasing the chemotherapy directly to the tumor.

“The temperature increase would be about four degrees Celsius above body temperature – enough to melt the liposome, but not enough to damage surrounding tissue,” Herickhoff said. “No one has tried this approach before in the brain.”
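The thermal window Herickhoff describes can be sketched as a simple check. The liposome melting point and tissue-damage threshold below are illustrative assumptions, not figures from the Duke study:

```python
# Illustrative check of the thermal window described above: the target
# temperature must exceed the liposome shell's melting point but stay
# below the onset of thermal tissue damage. The two threshold values
# are assumptions for illustration only.

BODY_TEMP_C = 37.0        # normal body temperature
HEATING_C = 4.0           # increase reported by the researchers
LIPOSOME_MELT_C = 40.0    # assumed melting point of the liposome shell
TISSUE_DAMAGE_C = 43.0    # assumed onset of thermal tissue damage

def in_therapeutic_window(target_c: float) -> bool:
    """True if heating releases the drug without harming tissue."""
    return LIPOSOME_MELT_C <= target_c < TISSUE_DAMAGE_C

target = BODY_TEMP_C + HEATING_C   # 41.0 C
print(in_therapeutic_window(target))  # True under these assumptions
```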

The American Cancer Society estimates that more than 21,000 new brain tumor cases were diagnosed in 2008, with more than 13,000 patients dying. This represents about two percent of all cancer deaths.

The researchers said that a minimally invasive approach to treating this cancer would be preferable to the conventional methods, which have drawbacks and side effects of their own.

“Surgery is invasive, and chemotherapy that is injected or taken orally affects the whole body and has difficulty crossing the blood-brain barrier in sufficient concentrations,” Herickhoff said. The blood-brain barrier restricts the passage into the brain of any foreign matter not needed by the neural tissue.

In a series of experiments in animal models and simulated tissues, the researchers demonstrated that they could build a catheter thin enough to be placed in one of the brain’s main blood vessels yet still capable of serving the dual purposes of visualization and heating.

“Taken as a whole, the results of these experiments, in particular the clarity of the images and ability to increase temperature with the same catheter, lead us to believe that the ultimate creation of a practical intracranial ultrasound catheter is possible,” said Stephen Smith, director of the Duke University Ultrasound Transducer Group and senior member of the research team. “There are some design issues of the catheter itself that we feel can be overcome with little difficulty.”

Advances in ultrasound technology have made these latest experiments possible, the researchers said, by generating detailed, 3-D moving images in real-time. The Duke laboratory has a long track record of modifying traditional 2-D ultrasound – like that used to image babies in utero – into the more advanced 3-D scans. After inventing the technique in 1991, the team also has shown its utility in developing specialized catheters and endoscopes for real-time imaging of tissues throughout the body.

(Photo: David Needham, Pratt School of Engineering)

Duke University


We judge probability and make risk judgments all the time, such as when we try new products or consider which stocks to trade. It would seem that our decisions would be rational and based on concrete factors; however, we are not always so pragmatic. Some judgments are not based solely on relevant information but can be influenced by subjective beliefs.

For example, most of us would probably cringe at the thought of drinking a sugar solution that was labeled "sodium cyanide," even if we knew it was perfectly safe to drink. According to new research by consumer psychologists Arul Mishra and Himanshu Mishra from the University of Utah and Dhananjay Nayakankuppam from the University of Iowa, something as mundane as how objects are grouped together can have a significant impact on the decisions we make.

Volunteers selected a mug from one of two groups. In one group, the wrapped-up mugs were spaced far apart, while in the other group they were closer together. Some of the volunteers were told that one of the mugs was defective while the other volunteers were told that one of the mugs contained a gift coupon.

The volunteers who were told that one of the mugs contained a gift coupon selected from the group of mugs that were close together. Conversely, the volunteers who were informed that one of the mugs was defective chose from the group of mugs that were spaced far apart.

The researchers then performed a follow-up experiment: volunteers had to choose among ketchup bottles (as before, the bottles were in two groups, close together or spaced farther apart). This time, some of the participants were told that either one or three of the bottles had defective lids, while the remaining participants were told that either one or three of the bottles contained gift coupons. It turns out that the volunteers who were told that three of the bottles had defective lids were the most likely to choose from the spaced apart group and the volunteers who thought that three of the bottles contained gift coupons were the most likely to choose from the closely spaced group.

These results, reported in Psychological Science, a journal of the Association for Psychological Science, reveal that we tend to view products that are grouped close together as being "contagious." It appears that if one of the products has a prominent good or bad quality, we will see that quality as spreading among other objects which are close by, a phenomenon known as the "group-contagion effect." As the authors noted, these findings suggest that people tend "to choose from groups of closely arranged products in the gain domain and from groups of widely spaced products in the loss domain."

Psychological Science


Researchers from Rothamsted Research, an institute of the Biotechnology and Biological Sciences Research Council in the UK, working with the health authority in the Peruvian Amazon, have pioneered a new way of controlling the mosquito that carries the potentially deadly dengue virus. They forced adult Aedes aegypti mosquitoes to transfer insecticides to their own breeding sites, thereby killing any larvae developing there.

The juvenile stages of all mosquitoes develop in aquatic habitats. Emerging adults have to return there to lay their eggs and continue the life cycle. These habitats are key targets for mosquito and disease control campaigns but, because of the cryptic and myriad nature of potential breeding sites, their treatment with insecticides is usually difficult, time consuming, and expensive.

Scientists were able to achieve almost total coverage of the aquatic larval habitat by treating a small proportion of the area where adult mosquitoes rest with a safe, potent and persistent insecticide. This insecticide can be carried by adult mosquitoes but only kills juvenile stages. Amplification of the effect occurs because every adult mosquito completes several resting and egg-laying cycles during its lifetime. This results in multiple opportunities for contamination of the aquatic habitat.
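As a rough illustration of this amplification, here is a toy probabilistic model of how repeated egg-laying cycles drive up coverage. The site and mosquito counts are arbitrary, not data from the Peruvian trial:

```python
# A toy model of the amplification effect described above: each
# contaminated adult mosquito lays eggs at a randomly chosen breeding
# site once per cycle, delivering insecticide there. Coverage grows
# with the number of egg-laying cycles. All numbers are illustrative.

def expected_coverage(n_sites: int, n_adults: int, n_cycles: int) -> float:
    """Expected fraction of breeding sites contaminated at least once,
    assuming each visit independently picks a site at random."""
    visits = n_adults * n_cycles
    return 1.0 - (1.0 - 1.0 / n_sites) ** visits

# 200 breeding sites, 100 contaminated adults: coverage rises sharply
# as each adult completes more resting and egg-laying cycles.
for cycles in (1, 3, 5):
    print(cycles, round(expected_coverage(200, 100, cycles), 2))
```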

The use of the adult mosquito as the transfer vehicle ensures that the larvicides are very accurately targeted: the more popular the breeding site, the greater the transfer of insecticide and the more effective the control.

The technique is truly novel, and could be implemented immediately. One of the researchers at the Ifakara Health Institute in Tanzania has developed a mathematical model of the process to explore how the Peruvian team might apply their technique to the mosquito species which carry malaria and filariasis.

Bioscience for the future



Researchers here have used sediment from the deep ocean bottom to reconstruct a record of ancient climate spanning more than the last half-million years.

The record, trapped within the top 20 meters (65.6 feet) of a 400-meter (1,312-foot) sediment core drilled in 2005 in the North Atlantic Ocean by the Integrated Ocean Drilling Program, gives new information about the four glacial cycles that occurred during that period.

The new research was presented at the Chapman Conference on Abrupt Climate Change at Ohio State University’s Byrd Polar Research Center. The meeting is jointly sponsored by the American Geophysical Union and the National Science Foundation.

Harunur Rashid, a post-doctoral fellow at the Byrd Center, explained that experts have been trying to capture a longer climate record for this part of the ocean for nearly a half-century. “We’ve now generated a climate record from this core that has a very high temporal resolution, one that is decipherable at increments of 100 to 300 years,” he said.

While climate records from ice cores can resolve individual annual layers, ocean sediment cores are greatly compressed, with resolutions sometimes no finer than millennia.

“What we have is unprecedented among marine records.”

Dating methods such as carbon-14 are useless beyond 30,000 years or so, he said, so Rashid and his colleagues used the ratio of the isotopes oxygen-16 to oxygen-18 as a proxy for temperature in the records. The isotopes were stored in the remains of tiny sea creatures that fell to the ocean bottom over time.
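The isotope proxy works by comparing a sample's ¹⁸O/¹⁶O ratio with that of a reference standard, expressed as a "delta" value in parts per thousand. A minimal sketch, where the VSMOW reference ratio is the conventional standard value and the sample ratio is made up for illustration:

```python
# The oxygen-isotope proxy described above is conventionally reported
# as a delta value: the sample's 18O/16O ratio relative to a reference
# standard, in parts per thousand (per mil).

VSMOW_R = 0.0020052  # 18O/16O ratio of the VSMOW reference standard

def delta_18o(sample_ratio: float, standard_ratio: float = VSMOW_R) -> float:
    """delta-18O in per mil: (R_sample / R_standard - 1) * 1000."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

# A sample slightly enriched in 18O (as foraminifera calcite tends to
# be under colder, glacial conditions) yields a positive delta value:
print(round(delta_18o(0.0020092), 2))
```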

When the researchers compared their record of past climate from the North Atlantic to a similar record taken from an ice core drilled from Dome C in Antarctica, they found it was remarkably similar.

“You can’t miss the similarity between the two records, one from the bottom of the North Atlantic Ocean and the other from Antarctica,” he said. “The record is virtually the same regardless of the location.”

Surprisingly, Rashid’s team was also able to score another first with their analysis of this sediment core – a record of the temperature at the sea surface in the North Atlantic.

They drew on knowledge readily known to chemists that the amount of magnesium trapped in calcite crystals can indicate the temperatures at which the crystals formed. The more magnesium present, the warmer the waters were when the tiny organisms were alive.
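Such calibrations typically take an exponential form, Mg/Ca = B·exp(A·T), which can be inverted to estimate temperature. In this sketch, the constants A and B are typical published calibration values, used purely for illustration:

```python
import math

# The Mg/Ca paleothermometer described above: magnesium uptake into
# foraminiferal calcite increases exponentially with water temperature.
# Calibrations take the form Mg/Ca = B * exp(A * T); the constants
# below are typical published values, used here only for illustration.

A = 0.09   # per degree Celsius
B = 0.38   # mmol/mol

def temperature_from_mgca(mgca_mmol_mol: float) -> float:
    """Invert the calibration to estimate calcification temperature (C)."""
    return math.log(mgca_mmol_mol / B) / A

# Higher Mg/Ca implies warmer water at the time the shell formed:
print(round(temperature_from_mgca(2.0), 1))   # warmer sample
print(round(temperature_from_mgca(1.0), 1))   # cooler sample
```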

They applied this analysis to the remains of the benthic organisms in the cores and were able to develop a record of warming and cooling of the sea surface in the North Atlantic for the last half-million years.

Having this information will be useful as scientists try to understand how quickly the major ocean currents shifted as glacial cycles came and went, Rashid said.

The researchers were also able to gauge the extent of the ancient Laurentide Ice Sheet that covered much of North America during the last 130,000 years.

As that ice sheet calved off icebergs into the Atlantic, Rashid said, the “dirty underbelly” of those icebergs carried gravel out into the ocean. As the bergs melted, the debris fell to the ocean floor. The more debris present, the more icebergs had been released to carry it, meaning that the ice sheet itself had to have been larger.

“Based on this, we’ve determined that the Laurentide Ice Sheet was probably larger during the last glacial cycle than it was during any of the three previous cycles,” he said.

During the last glacial cycle, the Laurentide Ice Sheet was more than a kilometer (0.6 miles) thick and extended to several miles north of Ohio State.

(Photo: OSU)

The Ohio State University



Astronomers using the W. M. Keck Observatory have discovered distant galaxies as massive as the Milky Way yet 10 to 1,000 times more compact. The new results, announced June 9 at the 214th American Astronomical Society meeting in Pasadena, provide astronomers with surprising clues about early star and galaxy formation at a time when the Universe was just a few billion years old.

“The shapes of these galaxies tell us that it is not reasonable to expect they could occur from mergers. Instead, the kind of disks we’re seeing and the constituent stars seem to have formed all at once, directly from the gas. In the old lingo, this is monolithic galaxy formation,” said astronomer Alan Stockton of the University of Hawai’i.

He and his colleagues Dr. Gabriela Canalizo of the University of California, Riverside and Dr. Elizabeth McGrath of the University of California, Santa Cruz used the Keck II telescope and its Laser Guide Star Adaptive Optics, or LGSAO, to image radio galaxies and quasars that are roughly 11 billion light years from Earth.

The Keck LGSAO system uses a powerful laser to excite sodium atoms in the upper atmosphere so that they emit light and appear as an artificial star. Astronomers use this artificial starlight to analyze how the atmosphere is distorting incoming light from their target astronomical sources. The distortion can then be corrected using a compensating distortion in a deformable mirror in the adaptive optics system.
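Conceptually, that correction step amounts to phase conjugation: the deformable mirror takes on the opposite shape of the measured wavefront error so the two cancel. A deliberately simplified one-dimensional, single-step sketch (real AO systems run closed-loop on 2-D wavefronts at high speed):

```python
# A minimal sketch of the correction step described above: the
# wavefront sensor measures the phase error the atmosphere imposes on
# the guide star's light, and the deformable mirror applies the
# opposite shape so the residual error cancels. This 1-D, single-step
# version is illustrative only.

def correct_wavefront(measured_error, gain=1.0):
    """Return the mirror commands (negated error) and residual phase."""
    mirror = [-gain * e for e in measured_error]     # phase conjugation
    residual = [e + m for e, m in zip(measured_error, mirror)]
    return mirror, residual

# Phase error (in radians) sampled at points across the aperture:
error = [0.4, -0.2, 0.1, 0.3, -0.5]
mirror, residual = correct_wavefront(error)
print(residual)  # all zeros: the distortion is fully compensated
```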

From these AO-corrected observations, Stockton and his colleagues could model the detailed structures of their target galaxies, which are quite unlike those of massive galaxies in the present-day Universe. The team found the objects had masses of a hundred billion times that of the Sun, yet were compact, with diameters of roughly 3,000 to 15,000 light years. By comparison, the Milky Way has a diameter of 100,000 light years and a mass of about 500 billion solar masses.
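A back-of-the-envelope comparison using the figures quoted above shows just how extreme these densities are. Treating each galaxy as a uniform sphere, and picking a 6,000-light-year diameter from the quoted range, are simplifications made purely for illustration:

```python
import math

# A rough density comparison using the figures quoted above: a compact
# galaxy of 1e11 solar masses within an assumed 6,000-light-year
# diameter, versus the Milky Way's ~5e11 solar masses within 100,000
# light years. Treating each galaxy as a uniform sphere is a crude
# simplification.

def mean_density(mass_msun: float, diameter_ly: float) -> float:
    """Mean density in solar masses per cubic light year."""
    radius = diameter_ly / 2.0
    volume = (4.0 / 3.0) * math.pi * radius ** 3
    return mass_msun / volume

compact = mean_density(1e11, 6_000)
milky_way = mean_density(5e11, 100_000)
print(round(compact / milky_way))  # the compact galaxy is far denser
```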

Teams using the Hubble Space Telescope have also found that high redshift galaxies tend to be more compact than astronomers expected. Stockton said his team was able to obtain near-infrared images from the ground that were almost two times sharper than those they could obtain with the Hubble Space Telescope at similar wavelengths. These Keck LGSAO images allow not only the measurement of characteristic sizes of the distant galaxies, but also more detailed properties of the light distribution that may give clues to formation processes.

For example, Stockton’s team imaged a field of five galaxies, two of which show a tidal tail that would be indistinguishable with Hubble. “The tail, however, can only form if the galaxies we observe were disk galaxies,” Stockton said. “This Keck data gives us further evidence of that conclusion.”

Astronomers expected that distant galaxies might be disk galaxies and would be more compact than today’s galaxies. They did not expect the galaxies to be as dense as Stockton’s observations indicate, and researchers have not yet identified objects in the local Universe that resemble these compact disk galaxies. This is surprising because dense, disk-like objects are like cannon balls and are therefore not easily destroyed by collisions, meaning some should survive today.

“It might therefore be possible that these disk galaxies have instead become the cores of today’s galaxies,” Stockton said.

The data cannot yet answer this or other questions about the morphology and evolution of these two billion-year-old galaxies. Stockton said that he is currently trying to obtain clearer spectral data of the distant galaxies to determine how fast their constituent stars are moving about their centers—this will enable astronomers to independently determine the galaxies’ masses. His team is also currently looking for examples of very compact galaxies that have survived to a time when the Universe was half its present age, about seven billion years old. It will be possible to obtain much more detailed observations of such galaxies, which may lead to a better physical understanding of these objects. Observations to find disk galaxies at more distant redshifts will also be done to determine if disk galaxies exist in the very early Universe, Stockton said.

(Photo: Alan Stockton, UH/WMKO)

Keck Observatory



The high cost of manufacturing fuel cells makes their large-scale production for power generation next to impossible, but researchers at Arizona State University are working to change that so cars, electricity and much more can run on the “green” technology.

Engineering technology professor Arunachalanadar Madakannan (Kannan) has been studying proton exchange membrane fuel cells (PEMFCs) for more than eight years. The fuel cells Kannan and his graduate students are focusing on employ carbon nanotube-based catalysts and electrodes.

Fuel cells, which cleanly and quietly generate electric power by passing a fuel like hydrogen over one electrode while passing air over a second electrode, have been around for more than 100 years. But their development has long been dogged by the cost of the technology as well as safety concerns.

Kannan said PEMFCs are built from layers of electrode and electrolyte components. In a PEMFC, hydrogen is fed to the anode (the negative terminal) and oxygen to the cathode (the positive terminal), with carbon-particle-supported platinum serving as the electrode catalyst that drives the power-producing reactions. While fuel cells produce electrical energy, the only waste generated is water, so they are considered a very clean energy conversion system.
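The chemistry behind that description can be made concrete: the overall reaction, H₂ + ½O₂ → H₂O, has a standard Gibbs free energy that fixes the cell's ideal voltage via E = −ΔG/(nF). A short worked example using standard textbook constants:

```python
# The overall PEMFC reaction (H2 + 1/2 O2 -> H2O) can be translated
# into a theoretical cell voltage from the reaction's Gibbs free
# energy: E = -dG / (n * F). Standard textbook values are used below.

FARADAY = 96485.0        # C per mol of electrons
DELTA_G = -237_100.0     # J/mol, formation of liquid water at 25 C
N_ELECTRONS = 2          # electrons transferred per H2 molecule

# Anode:   H2 -> 2 H+ + 2 e-
# Cathode: 1/2 O2 + 2 H+ + 2 e- -> H2O
cell_voltage = -DELTA_G / (N_ELECTRONS * FARADAY)
print(round(cell_voltage, 2))  # ~1.23 V, the ideal PEMFC voltage
```

Real cells deliver well under this ideal value because of activation, ohmic, and mass-transport losses, which is why catalyst performance matters so much.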

Scientists have been honing fuel cell technology since its inception, but, even after more than a century, the cost of producing fuel cells remains high because of the platinum-based catalysts.

“Platinum is the most effective electrocatalyst and a good conductor of electricity in fuel cells, but the cost is so prohibitive that we have not yet been able to use fuel cells widely,” says Kannan, an associate professor in the College of Technology and Innovation at ASU’s Polytechnic campus.

Kannan is working to create lower-cost PEMFCs by growing carbon nanotubes directly on carbon paper substrates, otherwise known as the gas diffusion layer, rather than using spherical carbon particles, and then depositing platinum nanoparticles onto the surface of the nanotubes. This innovative approach allows for the use of less platinum without impacting energy efficiency.

“This modified process saves about 10 to 15 percent of the cost compared to what exists today, without sacrificing any power output,” says Kannan.

During his research, Kannan was evaluating the performance of several different materials, measuring power output and efficiency along the way.

“The carbon nanotube-based electrode is more efficient because it has a greater surface area,” says Kannan, “which allows for less platinum to be needed. In addition, the electrodes also perform extremely well under lower relative humidity, which will ultimately reduce the fuel cell system complexity.”

Kannan co-authored three papers on the topic, which were all recently published in the Journal of Power Sources as well as the International Journal of Hydrogen Energy.

In addition, ASU and Helsinki University of Technology along with VTT in Finland have entered into a project regarding an advanced material solution for PEMFCs. Currently ASU graduate student Chad Mason is in Finland testing and improving the performance of the gas diffusion layer materials, while lowering costs and increasing manufacturability.

“The next step is to make the development of the gas diffusion layer continuous, rather than a batch process, so that it can be commercially viable,” says Kannan. “Chad’s work overseas will allow us to move in this direction. We believe that PEM fuel cells will become commercially viable in a decade or so and help us move toward a hydrogen economy.”

(Photo: ASU)

Arizona State University



