Wednesday, October 13, 2010

LESS PAIN FOR LEARNING GAIN

Scientists long have recognized that many perceptual skills important for language comprehension and reading can be enhanced through practice. Now research from Northwestern University suggests a new way of training that could reduce by at least half the effort previously thought necessary to make learning gains.

The research also may be the first behavioral demonstration of metaplasticity -- the idea that experiences that on their own do not generate learning can influence how effective later experiences are at generating learning.

“Prior to our work much of the research into perceptual learning could be summed up as ‘no pain, no gain,’” says Beverly Wright, first author of a study in the Sept. 22 Journal of Neuroscience and professor of communication sciences and disorders at Northwestern. “Our work suggests that you can have the same gain in learning with substantially less pain.”

The findings could lead to less effortful therapies for children who suffer from language learning impairments involving perceptual skills. And they hold potential for members of the general population with an interest in enhancing perceptual abilities -- for musicians seeking to sharpen their sensitivity to sound, people studying a second language or physicians learning to tell the difference between regular and irregular heartbeats.

Previous research showed that individuals become better at many perceptual tasks by performing them again and again, which typically makes the training tedious and lengthy. It also showed that mere exposure to the perceptual stimuli used during practice on these tasks does not generate learning.

But the Northwestern researchers found that robust learning occurred when they combined periods of practice that alone were too brief to cause learning with periods of mere exposure to perceptual stimuli. “To our surprise, we found that two ‘wrongs’ actually can make a right when it comes to perceptual learning,” says Wright.

What’s more, they found that the combination led to perceptual learning gains that were equal to the learning gains made by participants who performed twice as much continuous task training (training which by nature of its repetition and length often is onerous).

“It’s as though once you get your system revved up by practicing a particular skill, the brain acts as though you are still engaged in the task when you are not and learning still takes place,” says Wright, who teaches in Northwestern’s School of Communication.

Wright and Northwestern researchers Andrew Sabin, Yuxuan Zhang, Nicole Marrone and Matthew Fitzgerald worked with four groups of adult participants aged 18 to 30 years with normal hearing and no previous experience with psychoacoustic tasks. Their goal was to improve participants’ ability to discriminate between the pitches of different tones.

The researchers initially determined the smallest difference in pitch that participants could discriminate from a 1,000 Hertz standard tone. They then divided the participants into four groups, each of which went through a different training regimen.
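
The release does not spell out how that smallest detectable difference was measured, but thresholds like this are commonly estimated with an adaptive staircase: the pitch difference shrinks after correct answers and grows after errors until performance settles near a fixed accuracy level. The sketch below illustrates the general idea in Python with a simulated listener; the numbers and procedure are illustrative assumptions, not the study's actual protocol.

```python
# Minimal sketch of a 2-down/1-up adaptive staircase for estimating a
# pitch-discrimination threshold against a 1,000 Hz standard tone.
# Illustration only: the study's actual psychophysical procedure is not
# described in the press release, and JND_HZ is a made-up stand-in for
# a real participant's sensitivity.

import random

STANDARD_HZ = 1000.0   # standard tone
JND_HZ = 6.0           # hypothetical "true" threshold of the simulated listener


def listener_correct(delta_hz: float) -> bool:
    """Simulate one trial: reliable well above threshold, guessing below it."""
    p_correct = 0.5 + 0.5 * min(delta_hz / (2 * JND_HZ), 1.0)
    return random.random() < p_correct


def run_staircase(trials: int = 60, start_delta: float = 50.0, step: float = 0.8) -> float:
    """2-down/1-up rule: shrink the pitch difference after two correct
    responses in a row, enlarge it after any error. The track converges
    near the 70.7%-correct point; averaging the last reversals gives the
    threshold estimate."""
    delta, correct_streak, reversals, going_down = start_delta, 0, [], None
    for _ in range(trials):
        if listener_correct(delta):
            correct_streak += 1
            if correct_streak == 2:                  # two correct -> harder
                if going_down is False:
                    reversals.append(delta)          # direction change = reversal
                going_down, correct_streak, delta = True, 0, delta * step
        else:                                        # one error -> easier
            if going_down is True:
                reversals.append(delta)
            going_down, correct_streak, delta = False, 0, delta / step
    tail = reversals[-6:] or [delta]
    return sum(tail) / len(tail)


if __name__ == "__main__":
    print(f"Estimated threshold: {run_staircase():.1f} Hz below {STANDARD_HZ:.0f} Hz")
```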

Participants in one group were trained for 20 minutes per day for a week on the pitch-discrimination task. Over and over again, they were asked to tell the difference between the 1,000 Hertz tone and a lower tone, but they showed no improvement.

Of greatest importance for the study, participants in a second group showed significant learning gains when the same amount of target task training (20 minutes) was combined with 20 minutes of work on an unrelated puzzle during which a 1,000 Hertz tone was repeatedly played through headphones.

Impressively, the learning of the second group also was comparable to that of a third group that for a week practiced the pitch-discrimination target task for 40 minutes per day.

A fourth group of participants, repeatedly exposed to a 1,000 Hertz tone for 40 minutes per day while performing an unrelated task, showed no learning gains.

Further experiments revealed that the order of presentation -- whether the 20 minutes of target task training occurred before or after the 20 minutes of stimulus exposure during the unrelated task -- did not affect learning. Each scenario yielded equal pitch discrimination learning gains.

In addition, the researchers discovered that the combination of target task training and stimulus exposure during the unrelated task became less effective when the two sessions were separated by more than 15 minutes. Pitch discrimination learning -- or evidence of metaplasticity -- disappeared completely when the sessions were separated by four hours.

Northwestern University

LET YOUR FINGERS DO THE DRIVING


If drivers are yakking on cell phones and don't hear spoken instructions to turn left or right from a passenger or navigation system, they can still get directions from devices mounted on the steering wheel that pull the skin of the driver's index fingertips left or right, a University of Utah study found.

The researchers say they don't want their results to encourage dangerous and distracted driving by cell phone users. Instead, they hope the study will point to new touch-based directional devices to help motorists and hearing-impaired people drive more safely. The same technology also could help blind pedestrians with a cane that provides directional cues to the person's thumb.

"It has the potential of being a safer way of doing what's already being done - delivering information that people are already getting with in-car GPS navigation systems," says the study's lead author, William Provancher, an assistant professor of mechanical engineering at the University of Utah.

In addition, Provancher says he is "starting to meet with the Utah Division of Services for the Blind and Visually Impaired to better understand how our technology could help those with vision impairments. It could be used in a walking cane for the blind," with a moving button on the handle providing tactile navigation cues to help the person walk to the corner market, for example.

The system also could help hearing-impaired people get navigation information through their fingertips if they cannot hear a system's computerized voice, says University of Utah psychology Ph.D. student Nate Medeiros-Ward, the study's first author. "We are not saying people should drive and talk on a cell phone and that tactile [touch] navigation cues will keep you out of trouble."

The study "doesn't mean it's safe to drive and talk on the cell phone," says co-author David Strayer, a professor of psychology at the University of Utah. "It was a test to show that even in situations where you are distracted by a cell phone, we can still communicate directional information to the driver via the fingertips even though they are 'blind' to everything else."

Provancher, Medeiros-Ward and Strayer conducted the study with Joel Cooper, who earned his psychology Ph.D. at the University of Utah and now works in Texas, and Andrew Doxon, a Utah doctoral student in mechanical engineering. The research was funded by the National Science Foundation and the University of Utah.

Provancher says the study was based on a "multiple resource model" of how people process information, in which resources are senses such as vision, hearing and touch that provide information to the brain.

"You can only process so much," he says. "The theory is that if you provide information through different channels, you can provide more total information. Our sense of touch is currently an unexplored means of communication in the car."

But does humanity really need yet another way to provide information to drivers who already are blabbering on cell phones, texting, changing CDs or radio stations, looking at or listening to navigation devices and screaming kids - not to mention trying to watch and listen to road conditions?

"The point is, it will help everybody," Provancher says. "We all have visual and audio distractions when driving. Having the steering wheel communicate with you through your fingertips provides more reliable navigation information to the driver."

Provancher says motorists already get some feedback through touch: vibration from missing a gear while shifting or a shimmying steering wheel due to tire problems.

"You can't look at two things at the same time," says Strayer. "You can't look at graphic display of where you should go and look out the windshield. It [touch-based information] is a nicer way to communicate with the driver without interfering with the basic information they typically need to drive safely. They need to look out the window to drive safely. They need to listen to the noise of traffic - sirens, horns and other vehicles. This tactile device provides information to the driver without taking their attention away from seeing and hearing information they need to be a safe driver."

The new study says automakers already use some tactile systems to warn of lane departures by drowsy drivers and monitor blind spots. But these devices generally twist the steering wheel (assisted steering), rather than simply prompting the driver to do so.

The study was conducted on a driving simulator that Strayer has used to demonstrate the hazards of driving while talking or texting on a cell phone. Two of Provancher's devices to convey information by touch were attached to the simulator's steering wheel so one came in contact with the index finger on each of the driver's hands.

During driving, each index fingertip rested on a red TrackPoint cap from an IBM ThinkPad computer - those little things that look like the eraser on the end of a pencil. When the drivers were supposed to turn left, the two touch devices gently stretched the skin of the fingertips to the left (counterclockwise); when a right turn was directed, the TrackPoint tugged the skin of the fingertips to the right (clockwise).
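
As a rough illustration of that mapping, the sketch below translates a turn instruction into a skin-stretch command for both fingertip devices. Every name in it (HapticCue, send_to_actuators, the stretch size and duration) is invented for this example and does not describe the Utah team's actual hardware interface.

```python
# Toy sketch of the left/right mapping described above: a navigation cue is
# translated into a skin-stretch direction on both index fingertips.
# All names and values here are assumptions made for illustration, not the
# real device's API.

from dataclasses import dataclass
from enum import Enum


class Turn(Enum):
    LEFT = "left"
    RIGHT = "right"


@dataclass
class HapticCue:
    direction: Turn
    stretch_mm: float      # lateral skin displacement on each fingertip
    duration_s: float


STRETCH_MM = 1.0           # roughly 1/25 inch, the larger movement size reported
                           # in Provancher's Transactions on Haptics work
CUE_DURATION_S = 0.5       # assumed cue length for this example


def cue_for_turn(direction: Turn) -> HapticCue:
    """Map a turn instruction to a skin-stretch cue: left turn -> stretch the
    fingertip skin to the left (counterclockwise), right turn -> to the right
    (clockwise), on both steering-wheel-mounted devices at once."""
    return HapticCue(direction=direction, stretch_mm=STRETCH_MM, duration_s=CUE_DURATION_S)


def send_to_actuators(cue: HapticCue) -> None:
    """Placeholder for driving the two TrackPoint-style actuators."""
    sign = -1 if cue.direction is Turn.LEFT else 1
    print(f"Stretch both fingertips {cue.direction.value}: "
          f"{sign * cue.stretch_mm:.1f} mm for {cue.duration_s:.1f} s")


if __name__ == "__main__":
    send_to_actuators(cue_for_turn(Turn.LEFT))
    send_to_actuators(cue_for_turn(Turn.RIGHT))
```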

Nineteen University of Utah undergraduate students - six women and 13 men - participated in the study by driving the simulator. The screens that surround the driver's seat on three sides displayed a scene in which the driver was in the center lane of three straight freeway lanes, with no other traffic.

Four driving scenarios were used, each lasting six minutes and including, in random order, 12 cues to the driver to move to the right lane and 12 more to move left.

In two scenarios, the simulator drivers did not talk on cell phones and received direction instructions either from the simulator's computer voice or via the fingertip devices on the steering wheel. In the two other scenarios, the drivers talked on cell phones with a person in the laboratory and also received direction instructions, either from the computer voice or from the touch devices on the steering wheel.

Each participant did all four of the scenarios. The results:

* In the two scenarios without cell phones, the drivers' accuracy in correctly moving left or right was nearly identical for those who received tactile directions through their fingertips (97.2 percent) or by computerized voice (97.6 percent).

* That changed when the drivers talked on cell phones while operating the simulator. When drivers received fingertip navigation directions while talking, they were accurate 98 percent of the time, but when they received audio cues to turn right or left while talking on a cell phone, they changed lanes correctly only 74 percent of the time.

Strayer says the findings shouldn't be used to encourage cell phone use while driving because even if giving drivers directional information by touch works, "it's not going to help you with the other things you need to do while driving - watching out for pedestrians, noticing traffic lights, all the things you need to pay attention to."

Provancher has patents and wants to commercialize his tactile feedback devices for steering wheels and other potential uses.

"If we were approached by an interested automaker, it could be in their production cars in three to five years," he says, noting he already has had preliminary talks with three automakers and a European original equipment manufacturer.

In addition to possible devices for the vision- and hearing-impaired, Provancher says the technology could be used in a handheld device to let people feel fingertip-stretch pulses - rather than hear clicks - as they scroll through an iPod music playlist. He also says it might be used as a new way to interact with an MP3 music player in a vehicle, or to control games.

Provancher set the stage for the tactile navigation devices in two research papers this year in the journal Transactions on Haptics, published by the Institute of Electrical and Electronics Engineers. Haptics is to the sense of touch what optics is to vision.

In one of those studies, Provancher tested a haptic device that stretched the fingertip skin in four horizontal directions (right, left, front, back) and found that relatively faster and larger (one twenty-fifth of an inch) movements conveyed direction information most accurately.

In that study, Provancher also mentioned other possible uses for such devices, including allowing command centers to direct emergency responders and urban soldiers to incident locations, or directing air traffic controllers' attention to important information on a computer screen.

(Photo: Nate Medeiros-Ward, University of Utah)

University of Utah

BREAKTHROUGH IN QUANTUM COMPUTING


A team led by UNSW engineers and physicists has achieved a breakthrough that brings a super-fast quantum computer a step closer to reality by developing a "single electron reader" – a key building block in creating a silicon-based quantum machine.

Quantum computers promise exponential increases in processing speed over today’s computers through their use of the "spin", or magnetic orientation, of individual electrons to represent data in their calculations.

In order to employ electron spin, the quantum computer needs both a way of changing the spin state (the "write" function) and of measuring that change (the "read" function) to form a qubit – the equivalent of the bits in a conventional computer.
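
For readers who want a concrete picture of what "write" and "read" mean here, the toy sketch below models a spin qubit as two complex amplitudes: writing rotates the spin, and reading is a single-shot measurement that returns "up" or "down" with a probability set by those amplitudes. This is a generic textbook illustration, not a model of the UNSW device or its readout circuit.

```python
# Textbook-style sketch of "write" and "read" for a spin qubit, represented
# as two complex amplitudes (spin-down, spin-up). An abstract illustration
# only; it does not model the UNSW single-electron reader.

import math
import random


class SpinQubit:
    def __init__(self):
        # Start in the spin-down state: |psi> = 1*|down> + 0*|up>
        self.down, self.up = complex(1.0), complex(0.0)

    def write(self, theta: float) -> None:
        """'Write': rotate the spin by angle theta, changing the balance of
        the down/up amplitudes (theta = pi flips the spin completely)."""
        d, u = self.down, self.up
        self.down = math.cos(theta / 2) * d - 1j * math.sin(theta / 2) * u
        self.up = -1j * math.sin(theta / 2) * d + math.cos(theta / 2) * u

    def read(self) -> str:
        """'Read': a single-shot projective measurement. The probability of
        finding spin-up is the squared magnitude of the up amplitude; the
        state collapses to whichever outcome is observed."""
        p_up = abs(self.up) ** 2
        if random.random() < p_up:
            self.down, self.up = complex(0.0), complex(1.0)
            return "up"
        self.down, self.up = complex(1.0), complex(0.0)
        return "down"


if __name__ == "__main__":
    counts = {"down": 0, "up": 0}
    for _ in range(1000):
        q = SpinQubit()
        q.write(math.pi / 2)   # prepare an equal superposition of down and up
        counts[q.read()] += 1
    print(counts)              # roughly 500 / 500
```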

In creating the single electron reader, the team led by Dr Andrea Morello and Professor Andrew Dzurak, of the School of Electrical Engineering and Telecommunications at UNSW, has made possible the measurement of the spin of one electron in silicon in a single shot experiment. The team also includes researchers from the University of Melbourne and Aalto University in Finland.

"Our device detects the spin state of a single electron in a single phosphorus atom implanted in a block of silicon. The spin state of the electron controls the flow of electrons in a nearby circuit," said Dr Morello, the lead author of the paper, Single-shot readout of an electron spin in silicon, which has been published in the journal Nature.

"Until this experiment, no-one had actually measured the spin of a single electron in silicon in a single-shot experiment."

Professor Dzurak said quantum computers will be able to perform certain tasks much faster than normal computers.

"These are tasks such as searching databases, modelling complex molecules or developing new drugs. They could also crack most modern forms of encryption," he said.

"After a decade of work trying to build this type of single atom qubit device, this is a very special moment."

(Photo: UNSW)

University of New South Wales
