Thursday, November 26, 2009
With recent technological advancements, such as the Wide Field Camera 3 on the Hubble Space Telescope, there has been an explosion of research into the reionization epoch, the farthest back in time that astronomers can observe. The Big Bang, 13.7 billion years ago, created a hot, murky universe. Some 400,000 years later, temperatures cooled, electrons and protons joined to form neutral hydrogen, and the murk cleared. Some time before 1 billion years after the Big Bang, neutral hydrogen began to collapse into stars in the first galaxies, which radiated energy and ionized the hydrogen once again. Although the result was not the thick plasma soup of the period just after the Big Bang, this star formation marked the start of the reionization epoch. Astronomers know that the era ended about 1 billion years after the Big Bang, but when it began has eluded them and intrigued researchers like lead author Masami Ouchi of the Carnegie Observatories.
The U.S. and Japanese team led by Ouchi used a technique for finding these extremely distant galaxies. “We look for ‘dropout’ galaxies,” explained Ouchi. “We use progressively redder filters that reveal increasing wavelengths of light and watch which galaxies disappear from or ‘dropout’ of images made using those filters. Older, more distant galaxies ‘dropout’ of progressively redder filters and the specific wavelengths can tell us the galaxies’ distance and age. What makes this study different is that we surveyed an area that is over 100 times larger than previous ones and, as a result, had a larger sample of early galaxies (22) than past surveys. Plus, we were able to confirm one galaxy’s age,” he continued. “Since all the galaxies were found using the same dropout technique, they are likely to be the same age.”
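The dropout technique described above can be sketched in a few lines. This is an illustrative toy, not the team's actual pipeline; the detection limit and color cut below are hypothetical numbers chosen only to show the logic of selecting a source that vanishes from a bluer filter.

```python
# Illustrative sketch of dropout (Lyman-break) selection -- NOT the
# team's actual pipeline. The detection limit and color cut are
# hypothetical values chosen for demonstration.

LYMAN_BREAK_NM = 91.2  # rest-frame Lyman limit, in nanometers

def observed_break_nm(redshift):
    """Wavelength the Lyman break is redshifted to for a given redshift."""
    return LYMAN_BREAK_NM * (1 + redshift)

def is_dropout(mag_blue, mag_red, detection_limit=26.0, color_cut=1.5):
    """A source 'drops out' of the bluer filter: it is at or below the
    detection limit there, but clearly detected -- and much brighter --
    in the redder filter (larger magnitude = fainter)."""
    undetected_blue = mag_blue >= detection_limit
    detected_red = mag_red < detection_limit
    very_red_color = (mag_blue - mag_red) > color_cut
    return undetected_blue and detected_red and very_red_color

# A galaxy at redshift z ~ 7 has its break shifted to ~730 nm, so it
# disappears from any filter bluer than that.
print(round(observed_break_nm(7.0), 1))        # 729.6
print(is_dropout(mag_blue=27.2, mag_red=24.9))  # True
```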
Ouchi’s team was able to conduct such a large survey because they used a custom-made, super-red filter and other unique technological advancements in red sensitivity on the wide-field camera of the 8.3-meter Subaru Telescope. They made their observations from 2006 to 2009 in the Subaru Deep Field and Great Observatories Origins Deep Survey North field. They then compared their observations with data gathered in other studies.
Astronomers have wondered whether the universe underwent reionization instantaneously or gradually over time, but more importantly, they have tried to isolate when the universe began reionization. Galaxy density and brightness measurements are key to calculating star-formation rates, which tell a lot about what happened when. The astronomers looked at star-formation rates and the rate at which hydrogen was ionized.
Using data from their study and others, they determined that star-formation rates were dramatically lower from 800 million to about one billion years after the Big Bang than thereafter. Accordingly, they calculated that, because of this low star-formation rate, ionization would have proceeded very slowly during this early period.
“We were really surprised that the rate of ionization seems so low, which would constitute a contradiction with the claim of NASA’s WMAP satellite. It concluded that reionization started no later than 600 million years after the Big Bang,” remarked Ouchi. “We think this riddle might be explained by more efficient ionizing photon production rates in early galaxies. The formation of massive stars may have been much more vigorous then than in today’s galaxies. Fewer, more massive stars produce more ionizing photons than many smaller stars,” he explained.
Carnegie Institution of Washington
“If we had found evidence of photo tampering, then it would have suggested a broader plot to kill JFK,” said Farid, who is also the director of the Neukom Institute for Computational Science at Dartmouth. “Those who believe that there was a broader conspiracy can no longer point to this photo as possible evidence.” Farid added that federal officials long ago said that this image had not been tampered with, but a surprising number of skeptics still assert that there was a conspiracy.
The study will appear in a forthcoming issue of the journal Perception.
Farid and his team have developed a number of digital forensic tools used to determine whether digital photos have been manipulated, and his research is often used by law enforcement officials and in legal proceedings. The tools can measure statistical inconsistencies in the underlying image pixels, improbable lighting and shadow, physically impossible perspective distortion, and other artifacts introduced by photo manipulators. The play of light and shadow was fundamental in the Oswald photo analysis.
“The human brain, while remarkable in many aspects, also has its weaknesses,” says Farid. “The visual system can be quite inept at making judgments regarding 3-D geometry, lighting, and shadows.”
At a casual glance, the lighting and shadows in the Oswald photo appear to many to be incongruous with the outdoor lighting. To determine if this was the case, Farid constructed a 3-D model of Oswald’s head and portions of the backyard scene, from which he was able to determine that a single light source, the sun, could explain all of the shadows in the photo.
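A simplified, toy version of the geometric idea behind that check can be sketched as follows. Farid's actual analysis used a full 3-D model of the head and scene; this sketch only demonstrates the underlying constraint that a single distant light source (the sun) casts shadows along parallel rays, so the directions from shadow points back to the object points that cast them should all agree. The coordinates below are made up.

```python
# Toy single-light-source consistency check -- a simplified stand-in
# for Farid's full 3-D analysis. All coordinates are hypothetical.
import math

def direction(obj_tip, shadow_tip):
    """Unit vector from a shadow point back toward the object point that
    cast it. For one distant source these vectors should be parallel."""
    d = [o - s for o, s in zip(obj_tip, shadow_tip)]
    norm = math.sqrt(sum(c * c for c in d))
    return [c / norm for c in d]

def consistent_with_one_sun(pairs, tol_deg=5.0):
    """True if every (object, shadow) pair implies roughly the same
    light direction, within an angular tolerance."""
    dirs = [direction(o, s) for o, s in pairs]
    ref = dirs[0]
    for d in dirs[1:]:
        cos = sum(a * b for a, b in zip(ref, d))
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos))))
        if angle > tol_deg:
            return False
    return True

# Two objects whose shadows fall the same way: consistent with one sun.
pairs = [((0.0, 0.0, 2.0), (1.0, 0.0, 0.0)),
         ((5.0, 3.0, 1.0), (5.5, 3.0, 0.0))]
print(consistent_with_one_sun(pairs))  # True
```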
“It is highly improbable that anyone could have created such a perfect forgery with the technology available in 1963,” said Farid. With no evidence of tampering, he concluded that the incriminating photo was authentic.
“As our digital forensic tools become more sophisticated, we increasingly have the ability to apply them to historic photos in an attempt to resolve some long-standing mysteries,” said Farid.
(Photo: Dartmouth C.)
In this study, the scientists compared recordings of 30 French and 30 German infants aged between two and five days. While the French newborns more frequently produced rising crying tones, German babies cried with falling intonation. The reason for this is presumably the differing intonation patterns of the two languages, which are already perceived in the uterus and are later reproduced.
In the last trimester of the pregnancy, human fetuses become active listeners. "The sense of hearing is the first sensory system that develops", says Angela Friederici, one of the Directors at the Max Planck Institute. "The mother’s voice, in particular, is sensed early on." However, the fetus’ hearing in the uterus is restricted due to the amniotic fluid. "What gets through are primarily the melodies and intonation of the respective language". In previous work, Professor Friederici’s research team found evidence that the intonation patterns of the respective mother tongue are already ingrained in the brains of four-month-old babies.
Especially large differences exist between spoken German and French. "In French, a lot of words are stressed at the end, so that the intonation rises, while in German it is mostly the opposite", Friederici explains. The word "papa", for example, is pronounced "paPA" in French and "PApa" in German. Until now it was considered unlikely that this had any influence on newborns' cries, as their "crying melody" was assumed to be shaped by the build-up and fall of breath pressure, as in baby chimpanzees, and not by mental representations in the brain. A misassumption, as the analysis of more than 20 hours of babies' crying from German and French maternity wards shows. The analysis, conducted under the supervision of the psychologist Kathleen Wermke of the ZWES, showed that newborns tended to produce the intonation pattern most typical of their respective mother tongue. The crying of the German infants mostly began loud and high and followed a falling curve, while the French infants more often cried with a rising tone. This early sensitivity to features of intonation may later help the infants learn their mother tongue, the researchers say. "When they begin to form their first sounds, they can build on melodic patterns that are already familiar and, in this way, don’t have to start from scratch", says the neuropsychologist. The evolutionary roots of this behaviour are older than the emergence of spoken language, the researchers believe. "The imitation of melodic patterns developed over millions of years and contributes to the mother-child bond", says Friederici.
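The rising-versus-falling classification at the heart of the analysis can be illustrated with a minimal sketch: fit a line to a sequence of pitch samples and read off the sign of the slope. This is not Wermke's actual method, and the pitch values below are invented for demonstration.

```python
# Minimal sketch of rising/falling contour classification -- not the
# study's actual analysis. Pitch values are invented for illustration.

def contour_slope(pitches_hz):
    """Least-squares slope of a pitch contour sampled at equal time
    steps (plain linear regression, no libraries)."""
    n = len(pitches_hz)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(pitches_hz) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, pitches_hz))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

def classify_contour(pitches_hz):
    """Positive slope -> rising (French-like); otherwise falling
    (German-like)."""
    return "rising" if contour_slope(pitches_hz) > 0 else "falling"

print(classify_contour([400, 420, 450, 480]))  # rising  (French-like)
print(classify_contour([480, 460, 430, 400]))  # falling (German-like)
```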
(Photo: MPI für Kognitions- und Neurowissenschaften)
Neal Iverson opened his laboratory's walk-in freezer and said the one-of-a-kind machine inside could help scientists understand how glaciers slide across their beds. And that could help researchers predict how glaciers will react to climate change and contribute to rising sea levels.
Iverson is an Iowa State University professor of geological and atmospheric sciences. He's worked for three years on his big new machine, over nine feet tall, which he calls a glacier sliding simulator.
At the center of the machine is a ring of ice about eight inches thick and about three feet across. Below the ice is a hydraulic press that can put as much as 170 tons of force on the ice, creating pressures equal to those beneath a glacier 1,300 feet thick. Above are motors that can rotate the ice ring at its centerline at speeds of 100 to 7,000 feet per year. Either the speed of the ice or the stress dragging it forward can be controlled. Around the ice is circulating fluid - its temperature controlled to 1/100th of a degree Celsius - that keeps the ice at its melting point so it slides on a thin film of water.
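A quick back-of-envelope check shows the quoted press force and equivalent glacier thickness hang together. The constants below (ice density, g, the US-ton conversion) are my assumptions, not figures from the article.

```python
# Back-of-envelope check of the quoted figures. Assumed constants:
# ice density 917 kg/m^3, g = 9.81 m/s^2, 1 US ton = 907.185 kg.
RHO_ICE = 917.0   # kg/m^3
G = 9.81          # m/s^2
FT_TO_M = 0.3048

# Pressure at the base of a 1,300-foot column of ice.
depth_m = 1300 * FT_TO_M                # ~396 m
pressure_pa = RHO_ICE * G * depth_m
print(round(pressure_pa / 1e6, 2))      # 3.56 (MPa)

# Loaded area implied by a 170-ton press producing that pressure.
force_n = 170 * 907.185 * G
print(round(force_n / pressure_pa, 2))  # 0.42 (m^2) -- plausible for a
                                        # ring roughly a meter across
```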
As Iverson starts running experiments with the simulator this month, he'll be looking for data that help explain glacier movement.
"For a particular stress, which depends on a glacier's size and shape, we'd like to know how fast a glacier will slide," Iverson said.
Glacier sliding is something that matters far from the ice fields. As the climate warms, Iverson said glaciers slide faster. When they hit coasts, they dump ice into the ocean. And when those icebergs melt they contribute to rising sea levels.
But there's a lot about the process researchers still don't know.
"We can't predict how fast glaciers slide - even to a factor of 10," Iverson said. "We don't know enough about how they slide to do that."
And so Iverson came up with the idea of a glacier in a freezer that allows him to isolate effects of stress, temperature and melt-water on speeds of glacier sliding.
The project is supported by a $529,922 grant from the National Science Foundation. While Iverson had a rough design for the simulator, he said a team of three engineers from the U.S. Department of Energy's Ames Laboratory - Terry Herrman, Dan Jones and Jerry Musselman - improved the design and turned it into a working machine.
Iverson said the machine won't simulate everything about glacier sliding.
"The fact is we can't simulate the real process," he said. "We can only simulate key elements of the process. The purpose of these experiments will be to idealize how the system works and thereby learn fundamentals of the sliding process that can't be learned in the field because of the complexity there."
Iverson, who also does field studies at glaciers in Sweden and Norway, said glaciology needs work on the ground and in the lab. But it's been decades since anybody has attempted the kind of laboratory simulations he'll be doing.
"There hasn't been a device to do this," Iverson said. "And so there haven't been any experiments."
To change that, Iverson is pulling on a coat, hat and gloves and working in his lab's freezer. He has ice rings to build. Equipment to calibrate. And experiments to run.
(Photo: Bob Elbert)
Iowa State University
Scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory have refined a technique to manufacture solar cells by creating tubes of semiconducting material and then "growing" polymers directly inside them. The method has the potential to be significantly cheaper than the process used to make today’s commercial solar cells.
Funding for this research was provided by the Department of Energy’s Office of Basic Energy Sciences and by the NSF-Materials Research Science and Engineering Center at the University of Chicago.
Because the production costs of today's generation of solar cells prevent them from competing economically with fossil fuels, Argonne researchers are working to re-imagine the solar cell's basic design. Most current solar cells use crystalline silicon or cadmium telluride, but growing a high-purity crystal is energy- and labor-intensive, making the cells expensive.
The next generation, called hybrid solar cells, uses a blend of cheaper organic and inorganic materials. To combine these materials effectively, Argonne researchers created a new technique to grow organic polymers directly inside inorganic nanotubes.
At its most basic level, solar cell technology relies on a series of processes initiated when photons, or particles of light, strike semiconducting material. When a photon hits the cell, it excites one electron out of its initial state, leaving behind a "hole" of positive charge.
Hybrid solar cells contain two separate types of semiconducting material: one conducts electrons, the other holes. At the junction between the two semiconductors, the electron-hole pair gets pulled apart, creating a current.
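The photoexcitation step can be put in numbers: a photon creates an electron-hole pair only if its energy, hc divided by its wavelength, meets the semiconductor's band gap. The band-gap value for TiO2 used below is an approximate textbook figure, not one taken from this study.

```python
# Photon energy vs. band gap -- a sketch of the excitation condition.
# The TiO2 band gap (~3.2 eV) is an approximate textbook value.
HC_EV_NM = 1239.84  # Planck constant x speed of light, in eV*nm

def photon_energy_ev(wavelength_nm):
    return HC_EV_NM / wavelength_nm

def can_excite(wavelength_nm, band_gap_ev):
    """A photon creates an electron-hole pair only if its energy is at
    least the band-gap energy."""
    return photon_energy_ev(wavelength_nm) >= band_gap_ev

# TiO2's wide gap means it absorbs only ultraviolet light; pairing it
# with a lower-gap polymer extends absorption into the visible.
print(can_excite(350, 3.2))  # True:  UV photon, ~3.54 eV
print(can_excite(550, 3.2))  # False: green photon, ~2.25 eV
```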
In the study, Argonne nanoscientist Seth Darling and colleagues at Argonne and the University of Chicago had to rethink the geometry of the two materials. If the two semiconductors are placed too far apart, the electron-hole pair will die in transit. However, if they're packed too closely, the separated charges won’t make it out of the cell.
In designing an alternative, scientists paired an electron-donating conjugated polymer with the electron acceptor titanium dioxide (TiO2).
Titanium dioxide readily forms minuscule tubes just tens of nanometers across, about 10,000 times smaller than the width of a human hair. Rows of tiny, uniform nanotubes sprout across a film of titanium that has been submerged in an electrochemical bath.
The next step required the researchers to fill the nanotubes with the organic polymer—a frustrating process.
"Filling nanotubes with polymer is like trying to stuff wet spaghetti into a table full of tiny holes," Darling said. "The polymer ends up bending and twisting, which leads to inefficiencies both because it traps pockets of air as it goes and because twisted polymers don’t conduct charges as well.
"In addition, this polymer doesn’t like titanium dioxide," Darling added. "So it pulls away from the interface whenever it can."
Trying to sidestep this problem, the team hit on the idea of growing the polymer directly inside the tubes. They filled the tubes with a polymer precursor, turned on ultraviolet light, and let the polymers grow within the tubes.
Grown this way, the polymer doesn’t shy away from the TiO2. In fact, tests suggest the two materials actually mingle at the molecular level; together they are able to capture light at wavelengths inaccessible to either of the two materials alone. This "homegrown" method is potentially much less expensive than the energy-intensive process that produces the silicon crystals used in today’s solar cells.
These devices dramatically outperform those fabricated by filling the nanotubes with pre-grown polymer, producing about 10 times more electricity from absorbed sunlight. The solar cells produced by this technique, however, do not currently harness as much of the available energy from sunlight as silicon cells can. Darling hopes that further experiments will improve the cells' efficiency.
The impact of airborne nitrogen released from the burning of fossil fuels and widespread use of fertilizers in agriculture is much greater than previously recognized and even extends to remote alpine lakes, according to a study published Nov. 6 in the journal Science.
Examining nitrogen deposition in alpine and subalpine lakes in Colorado, Sweden and Norway, James Elser, a limnologist in the School of Life Sciences at Arizona State University, and his colleagues found that, on average, nitrogen levels in lakes were elevated, even those isolated from urban and agricultural centers.
The article “Shifts in lake N:P stoichiometry and nutrient limitation driven by atmospheric nitrogen deposition” presents experimental data from more than 90 lakes. The researchers’ collaboration also revealed that nitrogen-rich air pollution has already altered the lakes’ fundamental ecology.
“This is because plant plankton or phytoplankton, like all plants, need nitrogen and phosphorus for growth,” Elser says. “Inputs from pollution in the atmosphere appear to shift the supplies of nitrogen relative to other elements, like phosphorus.”
The increase in the availability of nitrogen means that growing phytoplankton in lakes receiving elevated nitrogen deposition are now limited by how much phosphorus they can acquire. Elser says that this is important because “we know that phosphorus-limited phytoplankton are poor food – basically ‘junk food’ for animal plankton, which in turn are food for fish.”
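The shift Elser describes can be illustrated with a crude classifier built on the Redfield ratio, the canonical ~16:1 molar balance of nitrogen to phosphorus in plankton. This is illustrative only; the thresholds and sample concentrations are invented, not the study's statistical analysis.

```python
# Crude nutrient-limitation classifier based on the Redfield ratio
# (~16:1 N:P by moles) -- illustrative only, not the study's analysis.
# The 1.5x thresholds and the sample concentrations are invented.
REDFIELD_N_TO_P = 16.0

def likely_limit(n_umol, p_umol):
    """N:P ratios well above Redfield suggest phosphorus limitation;
    well below, nitrogen limitation."""
    ratio = n_umol / p_umol
    if ratio > REDFIELD_N_TO_P * 1.5:
        return "phosphorus-limited"
    if ratio < REDFIELD_N_TO_P / 1.5:
        return "nitrogen-limited"
    return "co-limited"

# Elevated atmospheric N deposition pushes lakes toward P limitation.
print(likely_limit(n_umol=48.0, p_umol=1.0))  # phosphorus-limited
print(likely_limit(n_umol=8.0, p_umol=1.0))   # nitrogen-limited
```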
“Such a shift could potentially affect biodiversity,” he adds. “However, we don’t really know because unlike in terrestrial systems, the impacts of nitrogen deposition on aquatic systems have not been widely studied.”
Elser’s collaborators include researchers Tom Andersen and Dag Hessen from the University of Oslo; Jill Baron of the United States Geological Survey and Natural Resource Ecology Laboratory at Colorado State University; Ann-Kristin Bergström and Mats Jansson with Umeå University, Sweden; and Koren Nydick of the Mountain Studies Institute in Colorado, in addition to Marcia Kyle and Laura Steger, who are members of his own group in ASU’s College of Liberal Arts and Sciences.
Hessen, a well-known limnologist, and Elser have had a long-standing collaborative relationship, looking not only at nitrogen deposition but also zooplankton nutrition and a broad range of stoichiometric studies. Elser met Bergström at a conference at Umeå University and discovered that she had performed similar experiments in Sweden.
“By combining these studies we were able to achieve a more global picture of how nitrogen was impacting a broad range of lakes and come to firmer conclusions about effects of deposition,” Elser says.
Elser and Hessen hope to expand on these findings and have a pending grant proposal with the Norwegian government. In addition, Elser says he hopes to perform similar studies in China “where atmospheric nitrogen pollution is extremely high,” but, as yet, unstudied.
Elser has built a career around asking questions about energy and material flows in ecosystems, and traveling all over the world to find answers. Understanding the balance of phosphorus, carbon and nitrogen in systems forms the backbone of Elser’s worldview, known as “stoichiometric theory.” His pioneering studies have jumpstarted new research approaches and yielded insights into nutrient limitation, trophic dynamics, biogeochemical cycling, and linkages between evolutionary and ecosystem processes. This study was supported by the National Science Foundation.
(Photo: James Elser)
Arizona State University
Scientists have shown in the laboratory that metal nanoparticles can damage the DNA of cells on the other side of a cellular barrier. The nanoparticles did not cause the damage by passing through the barrier; instead, they generated signalling molecules within the barrier cells that were then transmitted to cause damage in cells on the other side of the barrier.
The research was carried out by a team at the University of Bristol and colleagues, and is published online this week in Nature Nanotechnology.
The team grew a layer of human cells (about 3 cells in thickness) in the lab. They then used this barrier to examine the indirect effects of cobalt-chromium nanoparticles on cells that were lying behind this barrier.
The amount of DNA damage seen in the cells behind the protective barrier was similar to the DNA damage caused by direct exposure of the cells to the nanoparticles.
Dr Patrick Case, senior author on the study, said: “We need to be clear that our experimental set-up is not a model of the human body. The cells receiving the exposure were bathed in culture media, whilst in the body they might be separated from the barrier by connective tissue and blood vessels. The barrier cells were a malignant cell line and three cells in thickness, whilst barriers in the body are thinner and composed of non-malignant cells.”
Gevdeep Bhabra, lead author on the paper, said: “Even though this work was done in the laboratory, our results suggest the existence of a mechanism by which biological effects can be signalled through a cellular barrier, thus it gives us insights into how barriers in the body such as the skin, the placenta and the blood-brain barrier might work.”
Professor Ashley Blom, Head of Orthopaedic Surgery at the University of Bristol, added: “If barriers in the body do act in this way, then it gives us insight into how small particles such as metal debris or viruses may exert an influence in the body. It also highlights a potential mechanism whereby we might be able to deliver novel drug therapies in the future.”
These findings suggest that the indirect, as well as the direct, effects of nanoparticles on cells might be important when evaluating their safety.
(Photo: Bristol U.)
University of Bristol
The research, by Justin Lehmiller, assistant professor of Applied Social Psychology, is the first to look at the health issues surrounding secret relationships – information that could someday help the psychology profession with couples counseling.
The studies appeared in the November issue of Personality and Social Psychology Bulletin.
Lehmiller examined online responses of two different groups totaling more than 700 people. A large number of respondents indicated that they were keeping their relationships secret from other persons. Those relationships that were being kept secret ranged from interracial and same-sex partnerships to workplace romances. All participants were asked to report on their feelings about their relationship, as well as their personal physical and psychological health.
“We found people who keep relationships secret tend to be less committed – secrecy seems to limit how close you can get to a partner and whether they can become a central part of your life,” Lehmiller said.
“Such people also reported worse physical health outcomes and lower self-esteem. The data suggests that one of the reasons for this seems to be that keeping a relationship secret is stressful. It makes people nervous and anxious and scared. We suspect that when people chronically experience those negative emotions, that’s what undermines your health.”
People in secret relationships reported more frequent symptoms of poor health, such as headaches, loss of sexual interest/pleasure, low energy and poor appetite. They also had worse self-esteem, or more negative feelings about who they are as people.
Lehmiller cautioned that the studies only reveal general trends and should not be taken to mean that secret relationships are inherently bad. People with particularly strong social support networks may be less likely to suffer even if they keep their relationships secret, he said.
Therapists could potentially use the results to help individuals or couples seeking treatment, Lehmiller said: “We know that being in a secret relationship is challenging and may have negative effects on both the relationship and the partners’ health. This means therapists need to evaluate these situations carefully and ask, ‘Is it worthwhile to disclose the relationship?’ It’s possible that, for some, disclosure might improve the health of the individuals as well as their partnerships because it reduces stress and burden.”
Lehmiller was most surprised by the variety of secret relationships that people disclosed in the surveys. He plans to conduct long-term follow-up studies with the same individuals to see how they’re coping with their relationships and whether the effects of secrecy change over time.
(Photo: Colorado SU)
Colorado State University
It has long been known that the so-called p53 gene suppresses tumors -- when it mutates, cancer cells take hold and multiply. New research at Cornell's College of Veterinary Medicine, however, shows that inhibiting a second gene (Hus1) is lethal to cells with p53 mutations, knowledge that has scientists investigating whether the same combination may kill cancerous cells.
Using a mouse model, senior author Robert Weiss, associate professor of molecular genetics, first author and graduate student Stephanie Yazinski and colleagues explored how cells respond when both genes are inhibited. When they inactivated the Hus1 gene in healthy mammary gland tissues, the researchers report, it caused genome damage and cell death. And when they studied the effects of Hus1 inactivation in p53-deficient cells, which are highly resistant to cell death, they discovered that the ability of Hus1 inactivation to kill cells was even greater.
The study is published in the Nov. 9 issue of the Proceedings of the National Academy of Sciences.
"Our work contributes to an important new understanding of cancer cells and their weaknesses," Weiss said. "The mutations that allow cancer cells to divide uncontrollably also make the cancer cells more dependent on certain cellular processes. We were able to exploit one such dependency of p53-deficient cells and could efficiently kill these cells by inhibiting Hus1."
Weiss and his team have new experiments under way.
"We've proven the power of inhibiting both pathways in normal tissue," said Weiss. "Now we want to extend our knowledge to cancerous tissue and determine if the loss of Hus1 will impact the ability of cancers with p53 mutations to take hold and grow."
(Photo: Cornell U.)