Thursday, July 9, 2009

BACKTRACKING ON DNA


Accuracy is essential for life, so converting the information stored in DNA into a form in which it can be used demands a high level of precision. Dr Tanniemola Liverpool from the Department of Mathematics, working with colleagues from the University of Leeds, has developed a mathematical model of how the required accuracy is achieved.

A gene is the basic physical and functional unit of heredity. Genes are made up of DNA, which is the hereditary material in humans and almost all other organisms. Nearly every cell in a person’s body has the same DNA, and the information in DNA is stored as a code made up of four chemical bases: adenine (A), guanine (G), cytosine (C), and thymine (T). Human DNA consists of about three billion bases, and more than 99 percent of those bases are the same in all people. The order, or sequence, of these bases determines the information available for building and maintaining an organism, similar to the way in which letters of the alphabet appear in a certain order to form words and sentences.

DNA bases pair up with each other, A with T and C with G, to form units called base pairs. Each base is also attached to a sugar molecule and a phosphate molecule. Together, a base, sugar, and phosphate are called a nucleotide. Nucleotides are arranged in two long strands that form a spiral called a double helix. The structure of the double helix is like a ladder, with the base pairs forming the ladder’s rungs and the sugar and phosphate molecules forming the vertical sidepieces of the ladder.

An important property of DNA is that it can replicate, or make copies of itself. Each strand of DNA in the double helix can serve as a blueprint for duplicating the sequence of bases. This is critical when cells divide because each new cell needs to have an exact copy of the DNA present in the old cell. The journey from gene to protein is complex and consists of two major steps: transcription and translation, which together are known as gene expression.
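
To make the base-pairing rule concrete (this snippet is purely illustrative and not part of the Bristol study), here is a minimal Python sketch that builds the complementary strand for a short, made-up DNA sequence:

# Minimal sketch of Watson-Crick base pairing: given one DNA strand,
# build its partner strand (A pairs with T, C pairs with G).
# The example sequence is invented for illustration.

PAIRING = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement_strand(strand):
    """Return the base-paired partner strand, written in the same order."""
    return "".join(PAIRING[base] for base in strand)

template = "ATGCGTAC"
print(template)                     # ATGCGTAC
print(complement_strand(template))  # TACGCATG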

During the process of transcription, the information stored in a gene’s DNA is transferred to a similar molecule called RNA in the cell nucleus. Errors occasionally occur during transcription, and these can lead to defects in the protein being manufactured. In fact, an error rate of only 1 in 100,000 nucleotides is enough to give rise to proteins that do not function, which would ultimately lead to cell death.

The process of transcription is carried out by specialized enzymes known as RNA polymerases (RNAP) that move along the DNA, base by base. However, due to thermal fluctuations within the cell, the chemical reactions involved in transcription don't always follow the most likely (minimum-energy) path, and consequently there is a probability of around 1 in 1,000 that a base pair is incorrectly transcribed. Therefore, in order for cells to maintain the high level of accuracy required for life, they must have a mechanism for dealing with errors. Liverpool and colleagues have developed a theoretical model for how the required accuracy is achieved.
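
A rough back-of-the-envelope calculation shows why a raw error rate of 1 in 1,000 is far from sufficient. The transcript length of 10,000 bases used below is an assumption chosen for illustration, not a figure from the paper:

# Compare the chance of producing an error-free transcript at the raw
# polymerase error rate (~1 in 1,000) with the rate cells actually
# achieve (~1 in 100,000), for an assumed transcript of 10,000 bases.

gene_length = 10_000          # assumed transcript length in bases
raw_error_rate = 1e-3         # ~1 in 1,000 without error correction
achieved_error_rate = 1e-5    # ~1 in 100,000 with error correction

for label, p in [("raw", raw_error_rate), ("corrected", achieved_error_rate)]:
    p_error_free = (1 - p) ** gene_length
    expected_errors = p * gene_length
    print(f"{label:>9}: P(error-free transcript) = {p_error_free:.3g}, "
          f"expected errors = {expected_errors:.1f}")

At the raw rate, essentially every copy of such a gene would contain about ten errors; at the corrected rate, roughly nine transcripts in ten come out perfect.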

In the early 1970s it was pointed out that, in order to keep track of the information required to proofread (i.e., check for and correct errors), the cell must have processes that dissipate (waste) energy; otherwise it would contradict the second law of thermodynamics, which states that unless energy is supplied, disorder (i.e., lack of information) tends to increase. Experiments have suggested a number of possible mechanisms by which this might occur, but it is difficult to determine which, if any, is the primary one, so exactly how these processes work remains a mystery.

Liverpool’s paper, published today in Physical Review Letters, develops a mathematical model for proofreading in the transcription process, based on the fact that the RNAP does not move only in one direction along the DNA, but often makes random backward excursions as it transcribes the gene. The model shows how these ‘backtracks’ can improve the accuracy of transcription, and predicts the dependence of the probability of finding errors on the backtracking dynamics.
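
The published model is analytical, but the qualitative idea can be mimicked with a toy Monte Carlo simulation. The sketch below is only a loose illustration with invented rate parameters and makes no claim to reproduce the paper's results: a misincorporated base is assumed to make a backward excursion much more likely, and a backtracked mismatch can then be removed before the enzyme moves on.

import random

# Toy Monte Carlo sketch of error correction via backtracking.
# This is NOT the published model; all rates below are invented.
# Backward excursions after correctly added bases are ignored here
# for simplicity, since they do not change the error count.

P_MISINCORPORATE = 1e-3   # chance a base is added incorrectly
P_BACKTRACK_BAD  = 0.9    # a mismatch strongly promotes a backward excursion
P_CLEAVE         = 0.8    # chance a backtracked mismatch is removed and recopied

def transcribe(length, backtracking=True):
    """Return the number of errors left in a transcript of `length` bases."""
    errors = 0
    for _ in range(length):
        wrong = random.random() < P_MISINCORPORATE
        if wrong and backtracking:
            if random.random() < P_BACKTRACK_BAD and random.random() < P_CLEAVE:
                wrong = False   # mismatch excised during the backward excursion
        errors += wrong
    return errors

random.seed(0)
trials, length = 2000, 1000
for mode in (False, True):
    mean_errors = sum(transcribe(length, mode) for _ in range(trials)) / trials
    print(f"backtracking={mode}: mean errors per {length} bases = {mean_errors:.2f}")

With these made-up numbers only about 28 percent of misincorporations survive; in the real system the improvement depends on the backtracking dynamics, which is exactly the dependence the model predicts.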

The results of the model suggest future experiments which can be used to discriminate between the different possible mechanisms. They should also shed light on error correction in other biological processes such as the translation of RNA to protein.

(Photo: Bristol U.)

University of Bristol

FIRST ACOUSTIC METAMATERIAL 'SUPERLENS' CREATED BY U. OF I. RESEARCHERS


A team of researchers at the University of Illinois has created the world’s first acoustic “superlens,” an innovation that could have practical implications for high-resolution ultrasound imaging, non-destructive structural testing of buildings and bridges, and novel underwater stealth technology.

The team, led by Nicholas X. Fang, a professor of mechanical science and engineering at Illinois, successfully focused ultrasound waves through a flat metamaterial lens, built from a network of fluid-filled Helmholtz resonators, onto a spot roughly half a wavelength wide at 60.5 kHz.

According to the results, published in the May 15 issue of the journal Physical Review Letters, the acoustic system is analogous to an inductor-capacitor circuit. The transmission channels act as a series of inductors, and the Helmholtz resonators, which Fang describes as cavities that house resonating waves and oscillate at certain sonic frequencies almost as a musical instrument would, act as capacitors.
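
To make the circuit analogy concrete, the sketch below computes the resonance frequency of a single fluid-filled Helmholtz resonator from its lumped acoustic mass (the “inductor”) and acoustic compliance (the “capacitor”). The dimensions are invented for illustration rather than taken from the Illinois device, chosen only so the answer lands near the reported 60.5 kHz operating frequency:

import math

# A Helmholtz resonator behaves like an LC circuit: the fluid plug in the
# neck acts as the inductor (acoustic mass L_a) and the compressible
# cavity acts as the capacitor (acoustic compliance C_a).
# All dimensions below are assumptions made up for this example.

c = 1480.0        # speed of sound in water, m/s
rho = 1000.0      # density of water, kg/m^3

neck_area = 1.0e-6      # m^2  (assumed neck cross-section)
neck_length = 1.5e-3    # m    (assumed neck length)
cavity_volume = 1.0e-8  # m^3  (assumed cavity volume)

L_a = rho * neck_length / neck_area     # acoustic mass (inertance)
C_a = cavity_volume / (rho * c ** 2)    # acoustic compliance

f_res = 1.0 / (2.0 * math.pi * math.sqrt(L_a * C_a))
print(f"LC-analogue resonance frequency ≈ {f_res / 1000:.1f} kHz")

The same formula can be written as f = (c/2π)·√(A/(V·l)), the textbook Helmholtz-resonator result, which is why a lattice of such cavities joined by narrow channels maps naturally onto an inductor-capacitor network.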

Fang said acoustic imaging is somewhat analogous to optical imaging in that bending sound is similar to bending light. But compared with optical and X-ray imaging, creating an image from sound is “a lot safer, which is why we use sonography on pregnant women,” said Shu Zhang, a U. of I. graduate student who, along with Leilei Yin, a microscopist at the Beckman Institute, is a co-author of the paper.

Although safer, the resultant image resolution of acoustic imaging is still not as sharp or accurate as conventional optical imaging.

“With acoustic imaging, you can’t see anything that’s smaller than a few millimeters,” said Fang, who also is a researcher at the institute. “The image resolution is getting better and better, but it’s still not as convenient or accurate as optical imaging.”

The best tool for tumor detection is still optical imaging, but exposure to certain types of electromagnetic radiation such as X-rays also carries health risks, Fang noted.

“If we wish to detect or screen early stage tumors in the human body using acoustic imaging, then better resolution and higher contrast are equally important,” he said. “In the body, tumors are often surrounded by hard tissues with high contrast, so you can’t see them clearly, and acoustic imaging may provide more details than optical imaging methods.”

Fang said that the application of acoustic imaging technology goes beyond medicine. Eventually, the technology could lead to “a completely new suite of data that previously wasn’t available to us using just natural materials,” he said.

In the field of non-destructive testing, the structural soundness of a building or a bridge could be checked for hairline cracks with acoustic imaging, as could other deeply embedded flaws invisible to the eye or unable to be detected by optical imaging.

“Acoustic imaging is a different means of detecting and probing things, beyond optical imaging,” Fang said.

Fang said acoustic imaging could also lead to better underwater stealth technology, possibly even an “acoustic cloak” that would act as camouflage for submarines. “Right now, the goal is to bring this ‘lab science’ out of the lab and create a practical device or system that will allow us to use acoustic imaging in a variety of situations,” Fang said.

(Photo: L. Brian Stauffer)

University of Illinois

BEYOND CO2: STUDY REVEALS GROWING IMPORTANCE OF HFCS IN CLIMATE WARMING


Some of the substances that are helping to avert the destruction of the ozone layer could increasingly contribute to climate warming, according to scientists from NOAA’s Earth System Research Laboratory and their colleagues in a new study published in the journal Proceedings of the National Academy of Sciences.

The authors took a fresh look at how the global use of hydrofluorocarbons (HFCs) is expected to grow in coming decades. Using updated usage estimates and looking farther ahead than past projections (to the year 2050), they found that HFCs, especially from developing countries, will become an increasingly important factor in future climate warming.

“HFCs are good for protecting the ozone layer, but they are not climate friendly,” said David W. Fahey, a scientist at NOAA and second author of the new study. “Our research shows that their effect on climate could become significantly larger than we expected, if we continue along a business-as-usual path.”

HFCs currently make a small contribution to climate change, less than 1 percent of the contribution of carbon dioxide (CO2) emissions. The authors have shown that by 2050 the HFC contribution could rise to 7 to 12 percent of what CO2 contributes. And if international efforts succeed in stabilizing CO2 emissions, the relative climate contribution from HFCs would increase further.

HFCs, which do not contain ozone-destroying chlorine or bromine atoms, are used as substitutes for ozone-depleting compounds such as chlorofluorocarbons (CFCs) in applications such as refrigeration, air conditioning, and the production of insulating foams. The Montreal Protocol, a 1987 international agreement, has gradually phased out the use of CFCs and other ozone-depleting substances, leading to the development of long-term replacements such as HFCs.

Though HFCs do not deplete the ozone layer, they are potent greenhouse gases. Molecule for molecule, all HFCs are more potent warming agents than CO2, and some are thousands of times more effective. HFCs are in the “basket of gases” regulated under the 1997 Kyoto Protocol, an international treaty to reduce emissions of greenhouse gases.
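
As a sense of scale (using rounded, approximate 100-year global warming potentials rather than figures from the PNAS study), the arithmetic is simply emissions multiplied by the warming potential of each gas:

# CO2-equivalence arithmetic illustrating "thousands of times more
# effective". GWP values are rounded, approximate 100-year figures
# used only to show the scale of the multiplier.

approx_gwp_100yr = {
    "CO2": 1,
    "HFC-134a (car air conditioning)": 1400,   # approximate
    "HFC-23 (industrial by-product)": 14000,   # approximate
}

tonnes_emitted = 1.0
for gas, gwp in approx_gwp_100yr.items():
    print(f"1 tonne of {gas} ≈ {tonnes_emitted * gwp:,.0f} tonnes CO2-equivalent")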

The new study factored in the expected growth in demand for air conditioning, refrigerants, and other technology in developed and developing countries. The Montreal Protocol’s gradual phase-out of the consumption of ozone-depleting substances in developing countries after 2012, along with the complete phase-out in developed countries in 2020, is another factor that will increase the use of HFCs and other alternatives.

Decision-makers in Europe and the United States have begun to consider possible steps to limit the potential climate consequences of HFCs. The PNAS study examined several hypothetical scenarios to mitigate HFC consumption. For example, a global consumption limit followed by a four percent annual reduction would cause HFC-induced climate forcing to peak in the year 2040 and then begin to decrease before the year 2050.
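
A toy one-box calculation (not the PNAS model; the growth rate, cap year, and atmospheric lifetime below are invented) shows why forcing keeps rising for a while after consumption is capped: the gases persist in the atmosphere, so their abundance, and hence their forcing, lags the emissions curve.

# Toy scenario: HFC emissions grow until a global cap, then fall 4% per
# year. Atmospheric abundance (a proxy for radiative forcing) follows a
# simple one-box model with a finite lifetime, so it peaks years after
# the cap. GROWTH, CAP_YEAR and LIFETIME are invented assumptions.

GROWTH = 0.08      # assumed emissions growth per year before the cap
DECLINE = 0.04     # 4% per year reduction after the cap (from the scenario)
CAP_YEAR = 2030    # assumed start year of the global consumption limit
LIFETIME = 15.0    # assumed mean atmospheric lifetime, years

emissions, abundance = 1.0, 10.0   # arbitrary starting units in 2010
peak_year, peak_abundance = None, 0.0
for year in range(2010, 2051):
    emissions *= (1 + GROWTH) if year < CAP_YEAR else (1 - DECLINE)
    abundance += emissions - abundance / LIFETIME   # one-box balance
    if abundance > peak_abundance:
        peak_year, peak_abundance = year, abundance

print(f"abundance (forcing proxy) peaks around {peak_year}")

With these made-up numbers the proxy peaks around 2040 and declines thereafter, the same qualitative shape as the scenario described above.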

“While unrestrained growth of HFC use could lead to significant climate implications by 2050, we have shown some examples of global limits that can effectively reduce the HFCs’ impact,” said John S. Daniel, a NOAA coauthor of the study.

(Photo: NOAA)

National Oceanic and Atmospheric Administration
