Friday, June 11, 2010

STUDY FINDS POKER PLAYERS USING DRUGS TO ENHANCE PERFORMANCE

A Nova Southeastern University study recently presented at a national conference found that 80 percent of the poker players surveyed worldwide reported using drugs and other substances to enhance their poker performance.

Poker players are using drugs such as marijuana, cocaine, amphetamines, Valium, and other prescription medications, as well as substances including caffeine, energy drinks and guarana to get an edge over their opponents.

"The use of these substances could allow poker players to stay awake longer, as well as focus and concentrate better, which would be a competitive advantage," said Kevin Clauson, Pharm.D., an associate professor at NSU's College of Pharmacy, who was the principal investigator in the study. "Stamina is important for any poker player because tournament poker and cash games can go on for many hours."

Using these substances can be harmful for poker players, Clauson said. Depending on the type of substance, he noted, there will likely be both short-term and long-term side effects.

The NSU researchers initially interviewed players in Las Vegas during the World Series of Poker and then surveyed players online from across the globe, including North America, Europe, and Asia, with the majority of respondents coming from the US and Canada.

Respondents included professional, semi-professional, amateur, and recreational players. Regardless of a player's status, the overwhelming majority of poker is played for money, Clauson said. Those surveyed played poker, largely no-limit Texas hold 'em, both in person and on the Internet. Most were males in their mid-20s.

About 73 percent of the respondents said they used drugs and other substances to focus and concentrate better. The rest used these products to calm their nerves, stay awake, and improve memory.

The results suggest that the use of substances to improve poker performance is widespread, especially at higher stakes, Clauson said. "Most people we surveyed are using some kind of a boost in order to play one of the most popular games in the world," he said.

Nova Southeastern University

'LITTLE BROWN BALLS' TIE MALARIA AND ALGAE TO COMMON ANCESTOR


Inconspicuous "little brown balls" in the ocean have helped settle a long-standing debate about the origin of malaria and the algae responsible for toxic red tides, according to a new study by University of British Columbia researchers.

In an article published in the Proceedings of the National Academy of Sciences Early Edition, UBC Botany Prof. Patrick Keeling describes the genome of Chromera and its role in definitively linking the evolutionary histories of malaria and dinoflagellate algae.

"Under the microscope, Chromera looks like boring little brown balls," says Keeling. "In fact, the ocean is full of little brown and green balls and they're often overlooked in favour of more glamorous organisms, but this one has proved to be more interesting than its flashier cousins."

First described in the journal Nature in 2008, Chromera is found as a symbiont inside corals. Although it has a compartment – called a plastid – that carries out photosynthesis like other algae and plants, Chromera is closely related to apicomplexan parasites – including malaria. This discovery raised the possibility that Chromera may be a "missing link" between the two.

Now Keeling, along with PhD candidate Jan Janouskovec, postdoctoral fellow Ales Horak and collaborators from the Czech Republic, has sequenced the plastid genome of Chromera and found features that were passed down to both apicomplexan and dinoflagellate plastids, linking the two lineages.

"These tiny organisms have a huge impact on humanity in very different ways," says Keeling. "The tool used by dinoflagellates and Chromera to do good – symbiosis with corals – at some point became an infection mechanism for apicomplexans like malaria to infect healthy cells.

"Resolving their evolutionary origins not only settles a long-standing scientific debate but could ultimately provide crucial information for tackling diseases and environmental concerns."

(Photo: Patrick Keeling)

University of British Columbia

WARMER CLIMATE MAKES BALTIC MORE SALTY


Scientists have long believed that a warmer climate would increase river runoff to the Baltic Sea, making the inland sea less salty. However, a new extensive study from the University of Gothenburg suggests that the effect will probably be the opposite: climate change will reduce river runoff and increase salinity in the Baltic Sea.

"There could be major consequences for the Baltic's sensitive ecosystem," says researcher Daniel Hansson.

The Baltic is a young, brackish sea with a unique and sensitive ecosystem containing both marine and freshwater species. Researchers have been warning for many years that tiny changes in the salt content could have a major impact on the ecosystem. The basis for the argument has been that a warmer climate will increase river runoff and make the Baltic Sea less salty.

University of Gothenburg researchers, who have analysed 500 years' worth of climate data, now say that the effect could instead be the opposite.

Researchers at the Department of Geosciences have been able to reconstruct the flow of freshwater to the Baltic Sea since the 16th century by analysing atmospheric observations from the past 500 years. The study, which has been published in the International Journal of Climatology, shows that in the past, warm periods have coincided with less freshwater in the Baltic Sea. If the climate becomes warmer in future, river runoff may also fall, leading to an increase in salinity.

However, there may be major regional differences:

"More freshwater runs off in the northern Baltic and Gulf of Finland when it's warmer, while the opposite occurs in the southern Baltic. The reason for this is that a warmer climate leads to increased rainfall in the north and east and less rainfall in the south. Our study shows that the decrease in the south is greater than the increase in the north, which means that overall the water will be saltier," says Daniel Hansson, researcher at the Department of Geosciences.

The increase in salinity in the Baltic Sea may have a major impact on the sensitive ecosystem, which is dependent on a delicate balance between salt and fresh water.

"A saltier sea will benefit certain animal and plant species while being problematic for others, which could upset the entire ecosystem," says Daniel Hansson, who emphasises that there is still a considerable degree of uncertainty:

"We've studied changes over the past 500 years, which is not the same thing as predicting what will happen over the next 500 years. But there is reason to believe that warm periods in the future will behave similarly to the way they have done in the past," says Daniel Hansson.

(Photo: U. Gothenburg)

University of Gothenburg

THE DEEP VOICE OF ALPHA MALE

Men with a deep, masculine voice are seen as more dominant by other men but a man’s own dominance – perceived or actual – does not affect how attentive he is to his rivals’ voices. His own dominance does however influence how he rates his competitors’ dominance: the more dominant he thinks he is, the less dominant he rates his rival’s voice.

These findings by Sarah Wolff and David Puts, from the Department of Anthropology at Pennsylvania State University in the US, are published online in Springer’s journal Behavioral Ecology and Sociobiology.

This is the first study to look at why men differ in the way they perceive indicators of dominance in others, and what causes variation in the degree to which a man's masculinity affects judgments of his dominance. Specifically, the authors investigated for the first time whether men's own dominance affects their attentiveness to vocal masculinity, a dominance signal, when they assess their competitors. They carried out two studies asking men to rate male vocal recordings that differed in level of masculinity, i.e., from lower, more masculine voices to higher, less masculine ones.

The first study looked at how participating men rated others’ dominance in relation to their self-rated physical dominance in a dating game scenario, based on their competitor’s voice recordings. As predicted, more masculine voices were perceived as more dominant. On the whole, men who rated themselves higher in fighting ability, i.e. more dominant, rated other men lower on dominance and reported more sexual partners in the past year. However, men’s self-rated physical dominance was not linked to how attentive they were to vocal masculinity when assessing other men’s dominance.

The second study examined how objective measures of men’s physical dominance including size, strength, testosterone levels and physical aggressiveness influenced dominance ratings. Of these, only testosterone had an effect. Men with either high or low levels perceived other men as more dominant, based on their voice recordings, whereas men with intermediate testosterone levels rated other men lower in dominance.

The authors conclude: “Our findings show that vocal masculinity has large effects on the appearance of dominance that are not modulated by the dominance of the perceiver. Variables related to a man’s own dominance predict his assessments of other men’s dominance, even though they do not predict his attentiveness to vocal masculinity when making these assessments. Future research should examine whether dominance influences assessment of other potential dominance cues, such as facial hair, facial masculinity, muscularity, and stature.”

Springer

UBC RESEARCHER DECODES REMBRANDT'S 'MAGIC'


A University of British Columbia researcher has uncovered what makes Rembrandt's masterful portraits so appealing.

In the study, published in the current issue of the Massachusetts Institute of Technology's arts and sciences journal Leonardo, UBC researcher Steve DiPaola argues that Rembrandt may have pioneered a technique that guides the viewer's gaze around a portrait, creating a special narrative and "calmer" viewing experience.

Renaissance artists used various techniques to engage viewers, many incorporating new scientific knowledge on lighting, spatial layout and perspectives. To isolate and pinpoint factors that contribute to the "magic" of Rembrandt's portraits, DiPaola used computer-rendering programs to recreate four of the artist's most famous portraits from photographs of himself and other models. Replicating Rembrandt's techniques, he placed a sharper focus on specific areas of the model's face, such as the eyes.

Working with a team from the Vision Lab in UBC's Dept. of Psychology, DiPaola then tracked the viewers' eye movements while they examined the original photographs and the Rembrandt-like portraits.

"When viewing the Rembrandt-like portraits, viewers fixated on the detailed eye faster and stayed there for longer periods of time, resulting in calmer eye movements," says DiPaola, who is also an associate professor at Simon Fraser University and adjunct faculty at UBC's Media and Graphics Interdisciplinary Centre. "The transition from sharp to blurry edges, known as 'lost and found edges,' also directed the viewers’ eyes around the portrait in a sort of narrative."

The study is the first to scientifically verify the impact of these "eye guiding" techniques on viewers and to attribute their origin to Rembrandt.

The viewers also preferred portraits with this "eye guiding narrative" to the original photographs with uniform details across the tableau.

"Through these techniques, Rembrandt is essentially playing tour guide to his viewers hundreds of years after his death, creating a unique narrative by guiding the viewers' eye," says DiPaola. "This may explain why people appreciate portraiture as an art form.

"Whether he observed how his own eyes behaved while viewing a painting or if he did it by intuition, Rembrandt incorporated an understanding of how the human eye works that has since been proven accurate," says DiPaola.

(Photo: Steve DiPaola)

University of British Columbia

A STONE SAYS MORE THAN A THOUSAND RUNES

It was not necessary to be literate to be able to access rune carvings in the 11th century. At the same time those who could read were able to glean much more information from a rune stone than merely what was written in runes. This is shown in new research from Uppsala University in Sweden.

Rune stones are an important part of the Swedish cultural environment. Many of them still stand in their original places and bear witness to the inhabitants of the area from a thousand years ago. They thereby represent a unique source of knowledge about the Viking Age, providing us with glimpses of a period we would otherwise know very little about. Among other themes, they tell us about family relations, travels, and matters of faith, all in a language that scholars can understand fairly readily.

"The language and factual information of runic inscriptions are fairly well researched, but we know little about how Viking Age people read a rune stone," says Marco Bianchi at the Department of Scandinavian Languages, whose dissertation investigates Viking Age written culture in the provinces of Uppland and Södermanland.

There are a number of inscriptions with runes that do not convey any linguistic meaning. In Uppland they are found both in areas that are rich in rune stones and in those that have very few.

"But the fewer rune stones there were in the vicinity, the poorer writers the carvers of these non-verbal inscriptions were. What was important was thus not to convey a linguistic message, but to create a rune carving that was perceived by the local people as credible," claims Marco Bianchi.

However, rune stones entirely lacking in linguistic content are rather rare. On most rune stones you can read a little narrative in the form of a memorial inscription that often winds back and forth across a large stone surface. At first glance these runic inscriptions seem chaotic, but they are in fact very well structured. Usually they are meant to be read starting in the lower left-hand corner. Another observation Marco Bianchi makes is that many rune stones do not have any given reading order. Different parts of the inscription are in such cases visually separated from each other and can be read in any order the reader wishes.

"You can compare a rune stone text with a newspaper spread or a Web page, where the reader is attracted by headings and pictures," says Marco Bianchi.

The visual design not only structures the linguistic message but complements and nuances it as well.

"On many rune stones the interplay between ornamentation and the runes is striking. To people of the Viking Age, the actual runes were only part of the message of the rune stone," he says.

Uppsala University

"OUT OF WHACK" PLANETARY SYSTEM OFFERS CLUES TO A DISTURBING PAST


The discovery of a planetary system "out of whack," where the orbits of two planets are at a steep angle to each other, was reported (May 24) by a team of astronomers led by Barbara McArthur of The University of Texas at Austin's McDonald Observatory.

This surprising finding will affect theories of how multi-planet systems evolve and shows that some violent events can happen to disrupt planets' orbits after a planetary system forms, say researchers.

"The findings mean that future studies of exoplanetary systems will be more complicated. Astronomers can no longer assume all planets orbit their parent star in a single plane," McArthur says.

McArthur and her team used data from the Hubble Space Telescope (HST), the giant Hobby-Eberly Telescope, and other ground-based telescopes, combined with extensive modeling, to unearth a landslide of information about the planetary system surrounding the nearby star Upsilon Andromedae.

McArthur reported these findings in a press conference at the 216th meeting of the American Astronomical Society in Miami, along with her collaborator Fritz Benedict, also of McDonald Observatory, and team member Rory Barnes of the University of Washington. The work also will be published in the June 1 edition of the Astrophysical Journal.

For just over a decade, astronomers have known that three Jupiter-type planets orbit the yellow-white dwarf star Upsilon Andromedae. The star, which lies about 44 light-years away, is similar to our Sun: a bit younger, a bit more massive, and a bit brighter.

Combining fundamentally different, yet complementary, types of data from HST and ground-based telescopes, McArthur's team has determined the exact masses of two of the three known planets, Ups And c and d. Much more startling, though, is their finding that not all planets orbit this star in the same plane. The orbits of planets c and d are inclined by 30 degrees with respect to each other. This research marks the first time that the "mutual inclination" of two planets orbiting another star has been measured. And, the team has uncovered hints that a fourth planet, e, orbits the star much farther out.

"Most probably Upsilon Andromedae had the same formation process as our own solar system, although there could have been differences in the late formation that seeded this divergent evolution," McArthur said. "The premise of planetary evolution so far has been that planetary systems form in the disk and remain relatively co-planar, like our own system, but now we have measured a significant angle between these planets that indicates this isn't always the case."

Until now, the conventional wisdom has been that a big cloud of gas collapses down to form a star, and planets are a natural byproduct: leftover material forms a disk. In our solar system, there's a fossil of that creation event, because all eight major planets orbit in nearly the same plane.

Several different gravitational scenarios could be responsible for the surprisingly inclined orbits in Upsilon Andromedae.

"Possibilities include interactions occurring from the inward migration of planets, the ejection of other planets from the system through planet-planet scattering, or disruption from the parent star's binary companion star, Upsilon Andromedae B," McArthur said.

Barnes, an expert in the dynamics of extrasolar planetary systems added, "Our dynamical analysis shows that the inclined orbits probably resulted from the ejection of an original member of the planetary system. However, we don't know if the distant stellar companion forced that ejection, or if the planetary system itself formed such that some original planets were ejected. Furthermore, we find the revised configuration still lies right on the precipice of stability: The planets pull on each other so strongly that they are almost able to throw each other out of the system."

The two different types of data combined in this research were "astrometry" from Hubble Space Telescope and "radial velocity" from ground-based telescopes.

Astrometry is the measurement of the positions and motions of celestial bodies. McArthur's group used one of the Fine Guidance Sensors (FGS) on the Hubble Space Telescope for the task. The FGS are so precise that they can measure the width of a quarter in Denver from the vantage point of Miami. It was this precision that was used to trace the star's motion on the sky caused by its surrounding, and unseen, planets.
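The quarter-in-Denver analogy can be turned into a rough number with the small-angle approximation. This is an illustrative back-of-the-envelope sketch, not a figure from the study: the coin diameter (about 24 mm for a US quarter) and the straight-line Denver-to-Miami distance (roughly 2,700 km) are assumed values.

```python
import math

# Assumed figures: a US quarter is about 24.26 mm across,
# and Denver to Miami is roughly 2,700 km in a straight line.
quarter_m = 24.26e-3
distance_m = 2.7e6

# Small-angle approximation: angle (radians) ~ size / distance.
angle_rad = quarter_m / distance_m

# Convert radians to milliarcseconds (1 rad = 206,265 arcseconds).
angle_mas = angle_rad * 206265 * 1000

print(f"apparent size: {angle_mas:.1f} milliarcseconds")
```

The result is on the order of a couple of milliarcseconds, which gives a feel for the angular scale at which the FGS must track the star's wobble.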

Radial velocity measures the star's motion along the line of sight, toward and away from Earth. These measurements were made over 14 years using ground-based telescopes, including two at McDonald Observatory and others at Lick, Haute-Provence and Whipple Observatories. The radial velocity data provide a long baseline of foundation observations, which enabled the shorter-duration, but more precise and complete, HST observations to better define the orbital motions.

The fact that the team determined the orbital inclinations of planets c and d allowed them to calculate the exact masses of the two planets. The new information changed which planet is heavier. Previous minimum masses for the planets given by radial velocity studies put the minimum mass for planet c at 2 Jupiters and for planet d at 4 Jupiters. The new, exact, masses found by astrometry are 14 Jupiters for planet c and 10 Jupiters for planet d.
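The jump from minimum to exact masses follows from a standard relation: radial velocity alone measures only the product m·sin(i), so once astrometry pins down the inclination i, the true mass is the minimum mass divided by sin(i). The sketch below is illustrative only; it works backwards from the masses quoted above to the inclinations they imply, under the simplifying assumption that this single-planet relation applies to each planet.

```python
import math

def true_mass(min_mass_mj, inclination_deg):
    """Recover a planet's true mass (Jupiter masses) from the
    radial-velocity minimum mass, given the orbital inclination:
    radial velocity alone constrains only m * sin(i)."""
    return min_mass_mj / math.sin(math.radians(inclination_deg))

# Implied inclinations from the reported values:
# planet c: 2 MJ minimum vs 14 MJ true  -> sin(i) = 2/14
# planet d: 4 MJ minimum vs 10 MJ true  -> sin(i) = 4/10
i_c = math.degrees(math.asin(2 / 14))
i_d = math.degrees(math.asin(4 / 10))

print(f"implied inclinations: c ~ {i_c:.1f} deg, d ~ {i_d:.1f} deg")
```

Small implied inclinations (orbits seen nearly face-on) are exactly the case where radial velocity underestimates mass the most, which is why planet c's mass grew sevenfold and the two planets swapped rank.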

"The HST data show radial velocity isn't the whole story," Benedict said. "The fact that the planets actually flipped in mass was really cute."

The 14 years of radial velocity information compiled by the team uncovered hints that a fourth, long-period planet may orbit beyond the three now known. There are only hints about that planet because it's so far out that the signal it creates does not yet reveal the curvature of an orbit. Another missing piece of the puzzle is the inclination of the innermost planet b, which would require precision astrometry 1,000 times greater than Hubble's, a goal NASA's planned Space Interferometry Mission (SIM) could attain.

The team's Hubble data also confirmed Upsilon Andromedae's status as a binary star. The companion star is a red dwarf less massive and much dimmer than the Sun.

"We don't have any idea what its orbit is," Benedict said. "It could be very eccentric. Maybe it comes in very close every once in a while. It may take 10,000 years."

Such a close pass by the primary star could gravitationally perturb the orbits of its planets.

(Photo: NASA/ESA/A. Feild/STScI)

University of Texas

SUPERCONDUCTIVITY BREAKTHROUGH COULD LEAD TO MORE COST EFFECTIVE TECHNOLOGIES


Researchers from the University of Liverpool and Durham University have fitted another piece into the superconductivity puzzle that could help in the quest to bring down the cost of technologies such as MRI scanners and some energy storage applications that rely on superconductors. The result is published in the Nature online journal (19th May 2010).

Using the ISIS and Diamond facilities at STFC's Rutherford Appleton Laboratory (RAL) and the European Synchrotron Radiation Facility (ESRF) in Grenoble, scientists have demonstrated how a new material made from metal atoms and buckyballs (tiny carbon-60 molecules shaped like a football) becomes a high temperature superconductor when it is squashed. The applied pressure shrinks the structure and overcomes the repulsion between the electrons, allowing them to pair up and travel through the material without resistance.

The Liverpool and Durham researchers made the new material with funding from the Engineering and Physical Sciences Research Council (EPSRC) for a program investigating ways of creating higher temperature superconductors, to reduce some of the costs involved in keeping them at their optimum temperature and to broaden their applications. An MRI scanner, for example, contains a person-sized superconducting magnet that must be kept inside a bath of liquid helium to hold the superconductor's temperature at -270°C. The ultimate aim is for a superconductor to operate at room temperature, eliminating the need for large and expensive cooling systems.

Dr Peter Baker, muon instrument scientist at STFC's ISIS Facility, said: “This research suggests that there is a universal trend in high temperature superconducting materials, which is a great step forward in understanding the fundamental nature of superconductivity. Once we know how superconductivity works it will be easier to develop high temperature superconducting materials with specific properties, opening the door to new applications and ultra efficient energy transmission.”

The advantage of investigating carbon-based superconducting materials is that they can be made with different structures that alter their properties, whereas the active components of other high-temperature superconductors, such as copper oxide materials, are always arranged in one way. This structural flexibility offers a new way of looking at the mechanisms that drive high-temperature superconductivity, offering more insight into how to make higher temperature superconductors. It has also established a universal pattern in the superconductivity of carbon-based materials which can now be used to help guide future theoretical models of superconductivity.

Matthew Rosseinsky, Professor of Inorganic Chemistry, University of Liverpool, said: “We’ve shown for the first time how controlling the arrangement of molecules in a high temperature superconductor controls its properties. This is possible because we have found two arrangements of the same basic molecular unit which have both superconducting and magnetic properties.”

Kosmas Prassides, Professor of Chemistry, Durham University, said: “This is important in the context of high-temperature superconductivity as it allowed us to see at which point superconductivity emerges out of the competing insulating state irrespective of the exact atomic structure - something that has not been possible before for any other known material.”

(Photo: STFC)

Science and Technology Facilities Council

MACHINES THAT LEARN BETTER


In the last 20 years or so, many of the key advances in artificial-intelligence research have come courtesy of machine learning, in which computers learn how to make predictions by looking for patterns in large collections of training data. A new approach called probabilistic programming makes it much easier to build machine-learning systems, but it’s useful for a relatively narrow set of problems. Now, MIT researchers have discovered how to extend the approach to a much larger class of problems, with implications for subjects as diverse as cognitive science, financial analysis and epidemiology.

Historically, building a machine-learning system capable of learning a new task would take a graduate student somewhere between a few weeks and several months, says Daniel Roy, a PhD student in the Department of Electrical Engineering and Computer Science who along with Cameron Freer, an instructor in pure mathematics, led the new research. A handful of new, experimental, probabilistic programming languages — one of which, Church, was developed at MIT — promise to cut that time down to a matter of hours.

At the heart of each of these new languages is a so-called inference algorithm, which instructs a machine-learning system how to draw conclusions from the data it’s presented. The generality of the inference algorithm is what confers the languages’ power: The same algorithm has to be able to guide a system that’s learning how to recognize objects in digital images, or filter spam, or recommend DVDs based on past rentals, or whatever else an artificial-intelligence program may be called upon to do.

The inference algorithms currently used in probabilistic programming are great at handling discrete data but struggle with continuous data. For an idea of what that distinction means, consider three people of different heights. Their rank ordering, from tallest to shortest, is discrete: Each of them must be first, second, or third on the list. But their absolute heights are continuous. If the tallest person is 5 feet 10 inches tall, and the shortest is 5 feet 8 inches, you can’t conclude that the third person is 5 feet 9 inches: He or she could be 5 feet 8.5 inches, or 5 feet 9.6302 inches or an infinite number of other possibilities.
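The heights example can be made concrete in a few lines. This is an illustrative sketch, not code from the research: the names and heights are hypothetical, chosen only to show why the discrete outcome space is enumerable while the continuous one is not.

```python
import itertools
import random

# Hypothetical people and heights in metres (assumed values).
heights = {"Ana": 1.78, "Ben": 1.73, "Cam": 1.75}

# Discrete: the rank ordering is one of a small, enumerable set of
# outcomes (3! = 6 permutations), so an algorithm can check them all.
orderings = list(itertools.permutations(heights))
ranking = sorted(heights, key=heights.get, reverse=True)  # tallest first

# Continuous: between the tallest and shortest, the middle person's
# height could take any of infinitely many values. No enumeration is
# possible; an inference algorithm can only sample or integrate over
# the interval.
middle = random.uniform(1.73, 1.78)

print(ranking, len(orderings), round(middle, 4))
```

A general-purpose inference algorithm that only needs to sum over cases like `orderings` is far easier to build than one that must reason correctly about the uncountably many values `middle` could take, which is the gap the Freer-Roy work addresses.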

Designers of probabilistic programming languages are thus avidly interested in whether it’s possible to design a general-purpose inference algorithm that can handle continuous data. Unfortunately, the answer appears to be no: In an as-yet-unpublished paper, Freer, Roy, and Nate Ackerman of the University of California, Berkeley, mathematically demonstrate that there are certain types of statistical problems involving continuous data that no general-purpose algorithm could solve.

But there’s good news as well: Recently, at the International Conference on Artificial Intelligence and Statistics, Roy presented a paper in which he and Freer not only demonstrate that there are large classes of problems involving continuous data that are susceptible to a general solution but also describe an inference algorithm that can handle them. A probabilistic programming language that implemented the algorithm would enable the rapid development of a much larger variety of machine-learning systems. It would, for instance, enable systems to better employ an analytic tool called the Pólya tree, which has been used to model stock prices, disease outbreaks, medical diagnoses, census data, and weather systems, among other things.

“The field of probabilistic programming is fairly new, and people have started coming up with probabilistic programs, but Dan and Cameron are really filling the theoretical gaps,” says Zoubin Ghahramani, professor of information engineering at the University of Cambridge. The hope, Ghahramani says, “is that their theoretical underpinnings will make the effort to come up with probabilistic programming languages much more solidly grounded.”

Chung-chieh Shan, a computer scientist at Rutgers who specializes in models of linguistic behavior, says that the MIT researchers’ work could be especially useful for artificial-intelligence systems whose future behavior depends on their past behavior. For instance, a system designed to understand spoken language might have to determine words’ parts of speech. If, in some context, it notices that a word tends to be used in an uncommon way — for instance, “man” is frequently used as a verb instead of a noun — then, going forward, it should have greater confidence in assigning that word its unusual interpretation.

Often, Shan explains, treating problems as having such “serial dependency” makes them easier to describe. But it also makes their solutions harder to calculate, because it requires keeping track of an ever-growing catalogue of past behaviors and revising future behaviors accordingly. Freer and Roy’s algorithm, he says, provides a way to convert problems that have serial dependency into problems that don’t, which makes them easier to solve. “A lot of models would call for this kind of picture,” Shan says. Roy and Freer’s work “is narrowing this gap between the intuitive description and the efficient implementation.”

While Freer and Roy’s algorithm is guaranteed to provide an answer to a range of previously intractable problems, Shan says, “there’s a difference between coming up with the right algorithm and implementing it so that it runs fast enough on an actual computer.” Roy and Freer agree, however, which is why they haven’t yet incorporated their algorithm into Church. “It’s fairly clear that within the set of models that our algorithm can handle, there are some that could be arbitrarily slow,” Roy says. “So now we have to study additional structure. We know that it’s possible. But when is it efficient?”

(Photo: Jason Dorfman/CSAIL)

MIT


Selected Science News. Copyright 2008 All Rights Reserved