Wednesday, December 2, 2009

FOR IMPROVING EARLY LITERACY, READING COMICS IS NO CHILD'S PLAY

Although comics have been published in newspapers since the 1890s, they still get no respect from some teachers and librarians, despite their current popularity among adults. But according to a University of Illinois expert in children’s literature, critics should stop tugging on Superman’s cape and start giving him and his superhero friends their due.

Carol L. Tilley, a professor of library and information science at Illinois, says that comics are just as sophisticated as other forms of literature, and children benefit from reading them at least as much as they do from reading other types of books.

“A lot of the criticism of comics and comic books comes from people who think that kids are just looking at the pictures and not putting them together with the words,” Tilley said. “Some kids, yes. But you could easily make some of the same criticisms of picture books – that kids are just looking at pictures, and not at the words.”

Although they’ve long embraced picture books as appropriate children’s literature, many adults – even teachers and librarians who willingly add comics to their collections – are too quick to dismiss the suitability of comics as texts for young readers, Tilley said.

“Any book can be good and any book can be bad, to some extent,” she said. “It’s up to the reader’s personality and intellect. As a whole, comics are just another medium, another genre.”

Critics would say that reading comics is actually a simplified version of reading that doesn’t approach the complexity of “real” books, with their dense columns of words and relative lack of pictures. But Tilley argues that reading any work successfully, including comics, requires more than just assimilating text.

“If reading is to lead to any meaningful knowledge or comprehension, readers must approach a text with an understanding of the relevant social, linguistic and cultural conventions,” she said. “And if you really consider how the pictures and words work together in consonance to tell a story, you can make the case that comics are just as complex as any other kind of literature.”

Tilley said some of the condescension toward comics as a medium may come from the jejune connotations that the name itself evokes.

“The term ‘comic’ is somewhat pejorative and tends to denote the child-like and ephemeral, and it brings to mind the Sunday funnies that you used to line your birdcage,” she said.

The term “graphic novel” is sometimes used to give comics a measure of respectability, Tilley said. But some artists, including Pulitzer Prize winner Art Spiegelman, hate the term.

“They feel it’s just a dressed up euphemism for comics,” she said.

Despite their popularity among juveniles in the early twentieth century, comic strips of that era were written and drawn primarily for an adult readership.

“Comics were originally an adult medium, since newspapers reached a primarily adult audience, but they very quickly turned into something that was appropriated by kids,” Tilley said. “Certainly by the first decade of the 20th century it had become a kids’ medium.”

According to Tilley, even in the early 1900s, there were teachers who raised concerns about children reading comics – that the content wasn’t appropriate for children, and that it wasn’t real literature.

And when the first comic books were published as omnibus collections of popular published comic strips in the mid-1930s, “the same concerns sprang up again from adults,” Tilley said.

“They claimed the texts weren’t good texts because they used slang, there were misspellings and colloquialisms, and the pictures were of questionable merit.”

In 1955, after a sustained outcry over the suitability of comics as children’s reading materials, the comics industry instituted a restrictive editorial code. Soon thereafter, juvenile readership plummeted.

“Between 1955 and the last 10 years, it became very much an adult medium,” Tilley said. “Part of that was because the comics code watered down what could be sold in drugstores, and also because they were slowly getting out of the affordable price range for kids. Comic books became incredibly tame, and the more sophisticated comics were direct sales to adults from the comics publishers.”

In 1940, a comic book was 10 cents, while the average hardcover juvenile book was $2.

“That’s a 20-to-1 price ratio. Now it’s not quite so generous – maybe 4- or 5-to-1. As it’s become an adult-focused format, kids have been priced out of the market.”

Recently, many publishers and creators of comics – including Spiegelman and another Pulitzer Prize winner, Michael Chabon – have advocated reconnecting a juvenile audience with comics.

So far, those efforts have met with mixed success.

“If you look at the comics that are being mass-marketed to kids,” Tilley said, “it’s mild, tame stuff with a strong commercial tie-in to another media format. There aren’t many stand-alone titles unless you go to a comic book store.”

The one exception is Manga, the Japanese version of comic books, which has its own unique artistic and narrative style – an influence that can be seen in the “Astro Boy” and “Sailor Moon” franchises.

“You are going to find a wide selection of Manga at most bookstores,” Tilley said. “That’s another part of comics that has taken off – one that kids have claimed as the format of choice for themselves.”

Although commercial publishers of comics have yet to recapture children’s imaginations, Tilley says that some librarians and teachers are increasingly discovering that comics can be used to support reading and instruction.

“In the last 15 years, we’ve seen some big changes. For instance, comic book publishers and distributors are showing up at library conferences and some review journals regularly evaluate graphic novels. That would have been unimaginable 20 years ago. So it has caught on, to some degree.”

Public libraries collect comics and graphic novels much more than school libraries do, primarily because school libraries face decreased funding and, under No Child Left Behind, an emphasis on materials with strong ties to the curriculum.

“Comics tend to be omitted under those circumstances,” Tilley said.

Despite their marginalization, Tilley said the distinct comic book aesthetic – frames, thought and speech bubbles, motion lines, to name a few – has been co-opted by children’s books, creating a hybrid format.

“There has been an increase in the number of comic book-type elements in books for younger children,” Tilley said. “There’s also a greater appreciation among both teachers and librarians for what comics and comic books can bring to the classroom. For example, the National Council of Teachers of English sponsors an instructional Web site called ‘Read, Write, Think,’ which has a lot of comics-related material. Instructional units like these would have been much more rare 10 years ago.”

(Photo: L. Brian Stauffer)

University of Illinois

NJIT MECHANICAL ENGINEER DISCOVERS WHY PARTICLES LIKE FLOUR DISPERSE ON LIQUIDS

Even if you are not a cook, you might have wondered why a pinch of flour (or any small particles) thrown into a bowl of water disperses in dramatic fashion, radiating outward as if it were exploding. Pushpendra Singh, Ph.D., a mechanical engineering professor at NJIT who has studied and written about the phenomenon, can explain why.

He says that what’s known as the repulsive hydrodynamic force arising from the oscillation of particles causes them to disperse. A particle trapped in a liquid surface vibrates up and down from its equilibrium position on the surface, or interface, where air meets water. When many particles do this simultaneously, an explosive dispersion occurs.

Singh says that when small particles, such as flour or pollen, come in contact with a liquid surface, they immediately disperse and form a monolayer. The dispersion occurs so quickly that it appears explosive, especially on the surface of liquids like water.

This explosive dispersion is a consequence of the capillary force pulling particles towards their equilibrium positions in the interface. The capillary force causes the particles to accelerate very rapidly.

“If a particle barely touches the interface, it is pulled onto the surface,” said Singh. “For example, if the contact angle for a spherical particle is 90 degrees, it floats in the state of equilibrium so that one-half of it is above the surface and the remaining half is below. If the particle, however, is not in this position, the capillary force will force it to be.”

What’s interesting is that the smaller the particles, the faster they move. For nanometer-sized particles like viruses and proteins, the speed at an air-water interface can be as high as 167 kilometers (about 100 miles) per hour.
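A rough back-of-the-envelope check on that figure, in the spirit of Singh’s own energy argument quoted below (a particle acquires part of the released interfacial energy as kinetic energy), can be sketched in a few lines of Python. This is an order-of-magnitude scaling, not Singh’s actual model; the radius, density and energy prefactor are illustrative assumptions for a virus-sized particle.

```python
# Order-of-magnitude sketch (not Singh's exact model): assume a particle
# adsorbing at an air-water interface converts an interfacial energy of
# order gamma * pi * R^2 into kinetic energy (1/2) m v^2.
import math

gamma = 0.072    # N/m, air-water surface tension
rho_p = 1050.0   # kg/m^3, assumed particle density (protein-like)
R = 50e-9        # m, assumed radius of a virus-sized particle

energy = gamma * math.pi * R**2              # interfacial energy scale, J
mass = (4.0 / 3.0) * math.pi * R**3 * rho_p  # particle mass, kg
v = math.sqrt(2.0 * energy / mass)           # resulting velocity scale, m/s

print(f"velocity scale: {v:.0f} m/s (~{v * 3.6:.0f} km/h)")
# ~45 m/s, roughly 160 km/h -- the same order as the article's ~167 km/h;
# the true prefactor depends on contact angle and viscous dissipation.
```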

Singh says the motion of the particles is dominated by inertia because the viscous damping—which is like friction—is too small. He compares the situation to a moving pendulum. “The pendulum will oscillate many times before friction makes it stop,” he says. “If friction is too great, it won’t oscillate.”

Eventually, the particles, which have been oscillating around their equilibrium points, will stop, thanks to viscous drag, which resists the motion.

“Let me explain more about viscous drag,” said Singh. “When a body, such as a ball, moves through air or liquid, the fluid resists the motion. This resistance is viscous drag. Or look at it this way. When a particle is adsorbed at a surface, it acquires a part of the released interfacial energy as kinetic energy,” he says. “The particle dissipates this kinetic energy by oscillating from its equilibrium height in the interface. The act gives rise to repulsive hydrodynamic forces, the underlying cause of why particles disperse.”
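Singh’s pendulum analogy is easy to reproduce with a minimal damped-oscillator simulation. The sketch below is illustrative only; the damping ratios are made up for the demonstration and are not values from Singh’s work.

```python
# Illustration of the pendulum analogy: the same oscillator rings for many
# cycles when damping is small (inertia-dominated motion) and creeps back
# without oscillating when damping is large. Parameters are illustrative.
import numpy as np

def simulate(zeta, omega0=1.0, dt=0.001, t_end=60.0):
    """Integrate x'' + 2*zeta*omega0*x' + omega0^2*x = 0 (semi-implicit Euler)."""
    x, v = 1.0, 0.0
    xs = []
    for _ in range(int(t_end / dt)):
        v += (-2.0 * zeta * omega0 * v - omega0**2 * x) * dt
        x += v * dt
        xs.append(x)
    return np.array(xs)

for zeta in (0.05, 2.0):  # lightly damped vs heavily damped
    xs = simulate(zeta)
    crossings = int(np.sum(np.diff(np.sign(xs)) != 0))  # proxy for oscillations
    print(f"zeta = {zeta}: {crossings} zero crossings")
# zeta=0.05 -> many crossings (oscillates); zeta=2.0 -> 0 (no oscillation)
```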

(Photo: NJIT)

New Jersey Institute of Technology

YOUR OWN STEM CELLS CAN TREAT HEART DISEASE

The largest national stem cell study for heart disease showed the first evidence that transplanting a potent form of adult stem cells into the heart muscle of subjects with severe angina results in less pain and an improved ability to walk. The transplant subjects also experienced fewer deaths than those who didn't receive stem cells.

In the 12-month Phase II, double-blind trial, subjects' own purified stem cells, called CD34+ cells, were injected into their hearts in an effort to spur the growth of small blood vessels that make up the microcirculation of the heart muscle. Researchers believe the loss of these blood vessels contributes to the pain of chronic, severe angina.

"This is the first study to show significant benefit in pain reduction and improved exercise capacity in this population with very advanced heart disease," said principal investigator Douglas Losordo, M.D., the Eileen M. Foell Professor of Heart Research at the Northwestern University Feinberg School of Medicine and a cardiologist and director of the program in cardiovascular regenerative medicine at Northwestern Memorial Hospital, the lead site of the study.

Losordo, also director of the Feinberg Cardiovascular Research Institute, said this study provides the first evidence that a person’s own stem cells can be used as a treatment for their heart disease. He cautioned, however, that the findings of the 25-site trial with 167 subjects require verification in a larger, Phase III study.

He presented his findings Nov. 17 at the American Heart Association Scientific Sessions 2009.

Out of the estimated 1 million people in the U.S. who suffer from chronic, severe angina -- chest pain due to blocked arteries -- about 300,000 cannot be helped by any traditional medical treatment such as angioplasty, bypass surgery or stents. This is called intractable or severe angina, the severity of which is designated by classes. The subjects in Losordo's study were class 3 or 4, meaning they had chest pain from normal to minimal activities, such as from brushing their teeth or even resting.

The stem cell transplant is the first therapy to produce an improvement in severe angina subjects’ ability to walk on a treadmill. Twelve months after the procedure, the transplant subjects showed double the improvement in treadmill exercise compared to the placebo group. It also took them twice as long to experience angina pain on a treadmill, and, when they felt pain, it went away faster with rest. In addition, they had fewer overall episodes of chest pain in their daily lives.

In the trial, the CD34+ cells were injected into 10 locations in the heart muscle, identified by a sophisticated electromechanical mapping technology that shows where the heart muscle is alive but not functioning because it is not receiving enough blood supply.

Northwestern University

ENERGY-SAVING POWDER

Max Planck chemists are using a simple method to convert methane to methanol - something that has the potential to exploit previously unused reserves of natural gas.

It is currently estimated that natural gas resources will be exhausted in 130 years; however, those reserves where extraction is cost-effective will only flow for another 60 years or so. Scientists at the Max Planck Institute for Coal Research and at the Max Planck Institute of Colloids and Interfaces might be helping to make it worthwhile to tap into previously unused resources. They have developed a catalyst that converts methane to methanol in a simple and efficient process. Methanol can be transported from locations where it is not economical to build a pipeline.

It is not cost-effective to lay pipelines to remote or small natural gas fields; nor is it worthwhile accessing the methane in coal seams or gas sands, or the methane that is burned off as a by-product of oil production – even though the methane burned off throughout the world could more than satisfy Germany’s requirement for natural gas. It is also too expensive to liquefy the gas and transport it on trains or in tankers - and even chemistry has so far been unable to offer a solution. Although there are chemical ways to convert methane to methanol, which is easy to transport and which is suitable as a raw material for the chemical industry, "the processes commonly used up to now for producing liquid fuels such as diesel - steam reforming followed by methanol synthesis or Fischer-Tropsch synthesis - are not economical," says Ferdi Schüth, Director at the Max Planck Institute for Coal Research in Mülheim an der Ruhr. He and his colleagues have been working with Markus Antonietti and his team at the Max Planck Institute of Colloids and Interfaces in Potsdam to develop a catalyst that might change all this.

The catalyst consists of a nitrogenous material, a covalent, triazine-based framework (CTF) synthesized by the chemists in Potsdam. "This solid is so porous that the surface of a gram is approximately equivalent in size to a fifth of a football field," says Markus Antonietti. The researchers in Mülheim insert platinum atoms into the voluminous lattice of the CTF. The chemists immerse the catalyst in oxidizing sulphuric acid, force methane into the acid and heat the mixture to 215 degrees Celsius under pressure; thanks to its large surface area, the catalyst offers the methane plenty of area in which to react and oxidizes it efficiently. More than three-quarters of the converted gas ends up as methanol.
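The numbers invite a quick sanity check. In the sketch below, the better-than-75-percent selectivity and the fifth-of-a-football-field surface area come from the article; the football-pitch area is an approximation, and the single-pass methane conversion is a purely hypothetical placeholder used to show how selectivity and conversion combine into a yield.

```python
# Back-of-envelope reading of the catalyst figures.
FOOTBALL_FIELD_M2 = 7140.0  # m^2, approximate area of a football pitch

specific_area = FOOTBALL_FIELD_M2 / 5.0  # ~1400 m^2 per gram of CTF
print(f"surface area of 1 g of catalyst: ~{specific_area:.0f} m^2")

selectivity = 0.75  # from the article: >75% of converted CH4 becomes methanol
conversion = 0.30   # hypothetical single-pass CH4 conversion (placeholder)
methanol_yield = conversion * selectivity
print(f"methanol yield per pass: {methanol_yield:.0%} of the methane fed")
```

Note the distinction the arithmetic makes explicit: three-quarters of the converted gas becomes methanol, not three-quarters of all the methane fed into the reactor.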

A catalyst manufactured by the American chemist Roy Periana more than ten years ago from platinum and simple nitrogenous bipyrimidine also effectively creates methanol, but it works only in dissolved form. This means that the catalyst - which chemists refer to as a homogeneous catalyst - subsequently needs to be separated off in a laborious and somewhat wasteful process. "It’s much easier with our heterogeneous catalyst," says Ferdi Schüth. The chemists in Mülheim filter out the powdery platinum and CTF catalyst, and then separate the acid and methanol in a simple distillation.

The catalyst developed by the Max Planck chemists probably uses the same mechanism as the Periana catalyst and was indeed inspired by it. "When I saw the structure of CTF, I noticed the elements which correspond to its bipyrimidine ligands," says Schüth. "That’s when I had the idea of manufacturing the solid catalyst."

To get closer to a large-scale technical application, he and his colleagues are now attempting to enable the process to work with reactants in gaseous rather than soluble form. "We are also looking for similar, even more effective catalysts," says Schüth. "We have already found more efficient homogeneous catalysts with ligands other than bipyrimidine." They are now using these as a model for simple, easy-to-manage catalysts like the CTF and platinum powder.

(Photo: iStockphoto LP)

Max-Planck-Gesellschaft, München

WHEN IT COMES TO CO2, WHAT GOES UP ISN'T ALWAYS COMING DOWN

The ocean and the land are natural sponges, or sinks, that absorb carbon dioxide, or CO2, from the atmosphere. But a group of international scientists, including two from NOAA, has found that emissions are outpacing the ability of the sinks to soak up the excess CO2.

“More CO2 is staying in the atmosphere instead of being absorbed by the ocean and land sinks, like trees and other vegetation,” said Richard Feely, Ph.D., an oceanographer at NOAA’s Pacific Marine Environmental Laboratory in Seattle and an expert on ocean acidification, the change in the ocean’s chemistry because of excess CO2. “We’re concerned that if the natural sinks can't keep pace with the increased CO2 emissions, then the physical and biological impacts of global warming will accelerate over the next century.”

Feely and Thomas Conway, a research chemist at NOAA’s Earth System Research Laboratory in Boulder, Colo., were among a team of 31 scientists who contributed to “Trends in the sources and sinks of carbon dioxide,” published today in Nature Geoscience. The scientists are also members of the Global Carbon Project, an international collaboration that works to develop a complete picture of the global carbon cycle.

Using a variety of data including direct observations, computer-generated models, and estimates from countries’ energy statistics, the team created a global CO2 budget – the amount of CO2 produced and absorbed – from 1959 to 2008. The researchers write that during that time, an average of 43 percent of each year’s CO2 emissions remained in the atmosphere.
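The bookkeeping behind a figure like that 43 percent is straightforward: the “airborne fraction” is the observed atmospheric CO2 growth divided by total emissions, and whatever does not stay in the air is what the ocean and land sinks absorbed. The round numbers below are illustrative, not the study’s actual annual data.

```python
# Sketch of the airborne-fraction bookkeeping (illustrative round numbers).
emissions_gtc = 10.0    # GtC/yr: fossil-fuel plus land-use emissions
atmos_growth_gtc = 4.3  # GtC/yr: observed atmospheric CO2 increase

airborne_fraction = atmos_growth_gtc / emissions_gtc
sink_uptake = emissions_gtc - atmos_growth_gtc  # taken up by ocean + land

print(f"airborne fraction: {airborne_fraction:.0%}")   # 43%
print(f"absorbed by sinks: {sink_uptake:.1f} GtC/yr")  # 5.7 GtC/yr
```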

The team did note a spike in global CO2 emissions between 2000 and 2008, likely attributable to manufacturing in developing countries, as well as a rising use of coal as fuel.

Unlike other studies that only consider fossil fuel use to measure CO2 produced by human activities, this team included emissions from changing land use, such as deforestation, logging and intensive cultivation of cropland soils, which also emit CO2.

NOAA’s Mauna Loa Observatory has been monitoring CO2 since 1958, when Charles Keeling, after whom the Keeling Curve is named, began analyzing air samples and charting the amount of CO2 in the atmosphere. Those measurements were used in this study and are a vital part of NOAA’s suite of climate services.

NOAA and its national and international partners are working together to better understand the extent of ocean acidification and its effect on coastal and ocean ecosystems. These activities include physical and chemical sensors on ships, moorings and floats that track CO2 and pH levels in the ocean, and satellites that monitor sea surface temperatures.

(Photo: NOAA)

National Oceanic And Atmospheric Administration

SIGNIFICANT OZONE HOLE REMAINS OVER ANTARCTICA

The Antarctic ozone hole, which fluctuates throughout the late winter and spring in the Southern Hemisphere, reached its 2009 peak extent in late September, according to measurements by NOAA researchers. Slightly smaller than the North American continent, the ozone hole covered 9.2 million square miles, according to NOAA satellite observations. This ranks as the 10th largest since satellite measurements began in 1979.

Ozone over South Pole Station, Antarctica, also reached its thinnest point of the year on Sept. 26. Measured in Dobson Units (DU) that indicate the amount of ozone in a vertical column of air, the 2009 low level – 98 DU – is the seventh smallest since 1986. The record low of 89 DU was recorded on Oct. 6, 1993.
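To put those numbers in physical terms: one Dobson Unit corresponds to a layer of pure ozone just 0.01 millimeters thick at standard temperature and pressure, or about 2.687 x 10^16 molecules above each square centimeter of the Earth’s surface. The short sketch below converts the values quoted above (300 DU, a typical unperturbed mid-latitude column, is included for comparison).

```python
# Convert Dobson Units to layer thickness at STP and to column density.
MOLECULES_PER_DU = 2.687e16  # molecules/cm^2 per Dobson Unit

for du in (98, 89, 300):  # 2009 low, record low, typical mid-latitude column
    thickness_mm = du * 0.01        # 1 DU = 0.01 mm of pure ozone at STP
    column = du * MOLECULES_PER_DU  # molecules per cm^2
    print(f"{du:3d} DU -> {thickness_mm:.2f} mm at STP, "
          f"{column:.2e} molecules/cm^2")
```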

The atmospheric ozone layer protects the Earth from harmful ultraviolet radiation. However, it has been damaged by human-produced compounds known as chlorofluorocarbons, or CFCs, and related halogenated chemicals, which release ozone-destroying chlorine and bromine into the atmosphere. International agreements have strictly limited the use of CFCs since the early 1990s.

“The Montreal Protocol has been effective in reducing emissions of long-lived CFC gases, but high enough concentrations remain in the atmosphere to lead to significant ozone destruction in polar regions,” said Bryan Johnson, project leader for the NOAA Earth System Research Laboratory Ozonesonde Group in Boulder, Colo. “Monitoring ozone over Antarctica provides the essential yardstick to see whether we are on the predicted track for recovery based on the current rate of declining CFCs.” Although CFCs are slowly decreasing in the atmosphere, scientists project that the ozone hole will not fully recover before 2060.

Extreme cold, ice-cloud formation in the stratosphere, and a pattern of rapidly circulating air called the polar vortex make the ozone layer over Antarctica much more vulnerable to destruction by CFCs than anywhere else.

The Antarctic ozone hole reaches its maximum in early spring in the Southern Hemisphere as the South Pole emerges from months of continuous wintertime darkness. At polar dawn, sunlight drives the chemical reactions in which the accumulated chlorine and bromine atoms break down ozone. Scientists expect the ozone hole to gradually disappear as CFC levels fall over the next several decades.

Since 1986, the NOAA Ozonesonde Group has been measuring the extent of the ozone hole vertically over Antarctica with balloons that carry ozone-measuring instruments into the stratosphere. NOAA polar-orbiting satellites probe the area the ozone hole covers as it expands and contracts seasonally.

Work on the ground from NOAA’s South Pole Station can be difficult in temperatures approaching -100 degrees F. NOAA engineer Patrick Cullis and NOAA Corps officer Marc Weekley have nearly completed a year-long assignment at South Pole Station, where they collect atmospheric data and keep instruments operating. They launch balloons to measure ozone vertically once a week throughout the year, and three or more times per week in the spring.

“It's actually been hardest to launch during the long dawn before sunrise,” Cullis said. “The landscape starts to brighten, but there are no shadows to warn you of the large clumps of snow left in your path by the movement of bulldozers. Plus, this year, the polar dawn brought an intense 35-knot storm lasting over a week. Even during lulls in the storm, launching large plastic balloons was like running a 50-meter dash in soft snow. My lungs would burn from the combination of 10 pounds of gear, soft snow, and thin air.”

Antarctic ozone hole data from these balloon launches and other NOAA measurements is archived online. The measurements and observations are part of NOAA’s suite of climate services.

(Photo: NOAA)

National Oceanic And Atmospheric Administration

TINY BUBBLES CLEAN OIL FROM WATER

Small amounts of oil leave a fluorescent sheen on polluted water. Oil sheen is hard to remove, even when the water is aerated with ozone or filtered through sand. Now, a University of Utah engineer has developed an inexpensive new method to remove oil sheen by repeatedly pressurizing and depressurizing ozone gas, creating microscopic bubbles that attack the oil so it can be removed by sand filters.

"We are not trying to treat the entire hydrocarbon [oil] content in the water - to turn it into carbon dioxide and water - but we are converting it into a form that can be retained by sand filtration, which is a conventional and economical process," says Andy Hong, a professor of civil and environmental engineering.

In laboratory experiments reported online this week in the journal Chemosphere, Hong demonstrated that "pressure-assisted ozonation and sand filtration" effectively removes oil droplets dispersed in water, indicating it could be used to prevent oil sheen from wastewater discharged into coastal waters.

Hong says the method - for which patents are pending - also could be used to clean a variety of pollutants in water and even soil, including:

•So-called "produced water" from oil and gas drilling sites on land. Such oily water normally is re-injected underground. "If we have technology to clean it, it could be put into beneficial uses, such as irrigation, especially in arid regions where oil and gas tend to be produced," says Hong.
•Water from mining of tar sands and oil shale.
•Groundwater contaminated by MTBE, a gasoline additive that reduces harmful vehicle emissions but pollutes water due to leaking underground gasoline storage tanks.
•"Emerging contaminants," such as wastewater polluted with medications and personal care products.
•Soil contaminated with polychlorinated biphenyls (PCBs, from electrical transformers) or polycyclic aromatic hydrocarbons (PAHs, from fuel burning). Water and contaminated soil would be mixed into slurry, and then treated with the new method.
•Heavy metals in soil. Instead of ozone, air and metal-grabbing chelating agents would be pressurized with a slurry of the contaminated material.
•Refinery wastewater and oil spills at refineries or on waterways. The spill could be vacuumed, and then treated with the new method on-site or on a barge.

Hong conducted the study with two University of Utah doctoral students - Zhixiong Cha, who has since earned his Ph.D., and Chia-Jung Cheng - and with Cheng-Fang Lin, an environmental engineering professor at National Taiwan University.

Zapping Oily Water with Microbubbles from Ozone under Pressure

Hong says his method uses two existing technologies - ozone aeration and sand filtration - and adds a big change to the former. Instead of just bubbling ozone through polluted water, Hong uses repeated cycles of pressurization of ozone and dirty water so the ozone saturates the water, followed by depressurization so the ozone expands into numerous microbubbles in the polluted water, similar to the way a carbonated beverage foams and overflows if opened quickly.

The tiny bubbles provide much more surface area - compared with larger bubbles from normal ozone aeration - for the oxygen in ozone to react chemically with oil. Hong says pollutants tend to accumulate on the bubbles because they are not very water-soluble. The ozone in the bubble attacks certain pollutants because it is a strong oxidant. The reactions convert most of the dispersed oil droplets - which float on water to cause sheen - into acids and chemicals known as aldehydes and ketones. Most of those substances, in turn, help the remaining oil droplets clump together so they can be removed by conventional sand filtration, he adds.
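The advantage of microbubbles comes down to simple geometry: for a fixed volume of gas, the total surface area of the bubbles scales as 3V/r, so shrinking the bubble radius a thousandfold multiplies the gas-water contact area a thousandfold. The gas volume and radii below are illustrative, not figures from Hong’s experiments.

```python
# Surface area of a fixed gas volume split into bubbles of radius r.
import math

V = 1e-6  # m^3 of ozone gas (1 mL), illustrative

for r in (1e-3, 1e-6):  # 1 mm bubbles vs 1 micron microbubbles
    n = V / ((4.0 / 3.0) * math.pi * r**3)  # number of bubbles
    area = n * 4.0 * math.pi * r**2         # total surface area = 3*V/r
    print(f"r = {r:g} m: {n:.2e} bubbles, total area {area:.3f} m^2")
# 1 mm bubbles: ~0.003 m^2 of interface; 1 micron bubbles: ~3 m^2.
```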

In his study, Hong showed the new method not only removes oil sheen, but also leaves any remaining acids, aldehydes and ketones in the treated water more vulnerable to biodegradation by pollution-eating microbes.

"These are much more biodegradable than the parent compounds," he says.

Hong says the water is clean enough to be discharged after the ozonation and sand filtration, but that some pollution sources may want to use conventional methods to biodegrade remaining dissolved organic material.

Hong conducted his experiments using a tabletop chemical reactor that contained about a quart of oily water made by mixing deionized water with crude oil from the Rangely oil field in northwestern Colorado.

Ozone was produced by passing dry air through a high-voltage field, converting oxygen gas, which has two oxygen atoms, into ozone, which has three.

The ozone was pressurized to 10 times atmospheric pressure, about 150 pounds per square inch, which compares with inflation pressures of about 100 PSI for Hong's bicycle and 35 to 40 PSI for many automobile tires.

He found oily water was cleaned most effectively by pressurizing and depressurizing it with ozone gas 10 times, then filtering it through sand, then putting the water through 20 more pressurized ozone cycles, and then filtering it again through sand. That was at flow rates of 10 to 20 liters per minute [about 2.6 to 5.3 U.S. gallons per minute] in his laboratory apparatus.

Hong tested how well the ozonation worked by measuring chemical and biological oxygen demands of treated water samples. Both indirectly measure organic contents in the water. Hong also used mass spectrometry to identify what contaminants remained in the water.

He found that his most effective procedure removed 99 percent of the turbidity from the "produced water" - leaving it almost as clear as drinking water - and removed 83 percent of the oil, converting the rest to dissolved organic acids removable by biodegradation.

With success in the laboratory, Hong now plans for larger-scale pilot tests.

"It is economical and it can be scaled up," he says.

One such test will be done in Wuxi, China, where a prototype desk-sized device capable of treating 200 liters [53 U.S. gallons] at a time will be tested at three to five polluted industrial sites that the government vacated for redevelopment, Hong says.

Meanwhile, the University of Utah Research Foundation has entered into options to license the technology to Miracotech, Inc., of Albany, Calif., and 7Rev, L.P., a Salt Lake City venture capital group, so the companies can bring the technology to market.

Hong says other methods of treating oil well "produced water" have met with varying degrees of success. They include centrifuges, membranes, regular ozonation and air bubbles to float off contaminants. But all have drawbacks, such as inability to handle dissolved oil or high levels of oil, or susceptibility to quickly getting fouled by the oil.

Neither ozonation nor sand filtration alone has been able to effectively treat oily "produced water." Hong says long-chain oil molecules don't react with ozone easily without his pressure treatment. And sand filters alone cannot remove oil.

(Photo: College of Engineering, University of Utah)

University of Utah

CIGARETTES HARBOR MANY BACTERIA HARMFUL TO HUMAN HEALTH

Cigarettes are "widely contaminated" with bacteria, including some known to cause disease in people, concludes a new international study conducted by a University of Maryland environmental health researcher and microbial ecologists at the Ecole Centrale de Lyon in France.

The research team describes the study as the first to show that "cigarettes themselves could be the direct source of exposure to a wide array of potentially pathogenic microbes among smokers and other people exposed to secondhand smoke." Still, the researchers caution that the public health implications are unclear and urge further research.

"We were quite surprised to identify such a wide variety of human bacterial pathogens in these products," says lead researcher Amy R. Sapkota, an assistant professor in the University of Maryland's School of Public Health.

"The commercially-available cigarettes that we tested were chock full of bacteria, as we had hypothesized, but we didn't think we'd find so many that are infectious in humans," explains Sapkota, who holds a joint appointment with the University's Maryland Institute for Applied Environmental Health and the department of epidemiology and biostatistics.

"If these organisms can survive the smoking process - and we believe they can - then they could possibly go on to contribute to both infectious and chronic illnesses in both smokers and individuals who are exposed to environmental tobacco smoke," Sapkota adds. "So, it's critical that we learn more about the bacterial content of cigarettes, which are used by more than a billion people worldwide."

The study will appear in an upcoming edition of the journal Environmental Health Perspectives and the pre-copyedited manuscript has been posted online.

The researchers describe the study as the first snapshot of the total population of bacteria in cigarettes. Previous researchers have taken small samples of cigarette tobacco and placed them in cultures to see whether bacteria would grow. But Sapkota's team took a more holistic approach using DNA microarray analysis to estimate the so-called bacterial metagenome, the totality of bacterial genetic material present in the tested cigarettes.

Among the study's findings and conclusions:

-Commercially available cigarettes show a broad array of bacterial diversity, ranging from soil microorganisms to potential human pathogens;
-This is the first study to provide evidence that the numbers of microorganisms in a cigarette may be as "vast as the number of chemical constituents;"
-Hundreds of bacterial species were present in each cigarette, and additional testing is likely to increase that number significantly;
-No significant variability in bacterial diversity was observed across the four different cigarette brands examined: Camel; Kool Filter Kings; Lucky Strike Original Red; and Marlboro Red;
-Bacteria of medical significance to humans were identified in all of the tested cigarettes and included Acinetobacter (associated with lung and blood infections); Bacillus (some varieties associated with foodborne illnesses and anthrax); Burkholderia (some forms responsible for respiratory infections); Clostridium (associated with foodborne illnesses and lung infections); Klebsiella (associated with a variety of lung, blood and other infections); and Pseudomonas aeruginosa (an organism that causes 10 percent of all hospital-acquired infections in the United States).

"Now that we've shown that a pack of cigarettes is loaded with bacteria, we will conduct follow-up research to determine the possible roles of these organisms in tobacco-related diseases." Sapkota says.

For example, do cigarette-borne bacteria survive the burning process and go on to colonize smokers' respiratory systems? Existing research suggests that some hardy bacteria can be transmitted this way, the researchers say. This might account for the fact that the respiratory tracts of smokers are characterized by higher levels of bacterial pathogens. But it's also possible that smoking weakens natural immunity and the bacteria come from the general environment rather than from cigarettes. Further research will be needed to determine the possible health impacts of cigarette-borne bacteria.

(Photo: U. Maryland)

University of Maryland

ON THE CREST OF WAVE ENERGY

The ocean is a potentially vast source of electric power, yet as engineers test new technologies for capturing it, the devices are plagued by battering storms, limited efficiency, and the need to be tethered to the seafloor.

Now, a team of aerospace engineers is applying the principles that keep airplanes aloft to create a new wave-energy system that is durable, extremely efficient, and can be placed anywhere in the ocean, regardless of depth.

While still in early design stages, computer and scale-model tests of the system suggest higher efficiencies than wind turbines. The system is designed to effectively cancel incoming waves, capturing their energy while flattening them out, providing an added application as a storm-wave breaker.

"Our group was working on very basic research on feedback flow control for years," says lead researcher Stefan Siegel, referring to efforts to use sensors and adjustable parts to control how fluids flow around airfoils like wings. "For an airplane, when you control that flow, you better control flight--for example, enabling you to land a plane on a shorter runway."

A colleague had read an article on wave energy in a magazine and mentioned it to Siegel and the other team members, and they realized they could operate a wave energy device using the same feedback control concepts they had been developing.

Supported by a grant from the National Science Foundation, the researchers developed a system that uses lift instead of drag to cause the propeller blades to move.

"Every airplane flies with lift, not with drag," says Siegel. "Compare an old style windmill with a modern one. The new style uses lift and is what made wind energy viable--and it doesn't get shredded in a storm like an old windmill. Fluid dynamics fixed the issue for windmills, and can do the same for wave energy."

Windmills have active controls that turn the blades to compensate for storm winds, eliminating lift when it is a risk, and preventing damage.

The Air Force Academy researchers used the same approach with a hydrofoil (equivalent to an airfoil, but for water) and built it into a cycloidal propeller, a design that emerged in the 1930s and currently propels tugboats, ferries and other highly maneuverable ships.

The researchers changed the propeller orientation from horizontal to vertical, allowing direct interaction with the cyclic, up and down motion of wave energy. The researchers also developed individual control systems for each propeller blade, allowing sophisticated manipulations that maximize (or minimize, in the case of storms) interaction with wave energy.

Ultimately, the goal is to keep the flow direction and blade direction constant, cancelling the incoming wave and using standard gear-driven or direct-drive generators to convert the wave energy into electric energy. A propeller that is exactly out of phase with a wave will cancel that wave and maximize energy output.
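The cancellation idea is ordinary wave superposition: a radiated wave of equal amplitude and opposite phase leaves, ideally, flat water, and because wave energy scales with the square of the amplitude, a near-zero residual means essentially all of the incoming energy has been absorbed. The amplitude and frequency below are illustrative values, not design figures from the project.

```python
# Superposition sketch: incoming wave plus a 180-degree out-of-phase
# radiated wave of equal amplitude cancels to (ideally) flat water.
import numpy as np

t = np.linspace(0.0, 10.0, 1000)                        # seconds
incoming = 1.0 * np.sin(2.0 * np.pi * 0.2 * t)          # 1 m amplitude, 0.2 Hz
radiated = 1.0 * np.sin(2.0 * np.pi * 0.2 * t + np.pi)  # equal and opposite

residual = incoming + radiated
print(f"max residual wave height: {np.max(np.abs(residual)):.1e} m")  # ~0
# Energy goes as amplitude squared, so a ~0 residual means the device
# has extracted essentially all of the incoming wave's energy.
```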

The cancellation will also allow the float-mounted devices to function without mooring - important for deep-sea locations that hold tremendous wave-energy potential and are currently out of reach for many existing wave energy designs.

While the final device may be as large as 40 meters across, laboratory models are currently less than a meter in diameter. A larger version of the system will be tested next year at NSF's Network for Earthquake Engineering Simulation (NEES) tsunami wave basin at Oregon State University, an important experiment for proving the efficacy of the design.

(Photo: SSgt Danny Washburn, U.S. Air Force Academy, Department of Aeronautics)

The National Science Foundation
