This article was published 28/6/2013; information in it may no longer be current.
LOS ANGELES--A quadriplegic has used thought to make a robotic hand feed her chocolate. A monkey moved a computer cursor using brain waves. But how the brain "learns" to control something without sending the signal through a spine and nerves remains a mystery.
It turns out we learn to move a robotic arm or computer cursor with the same neurons we use to learn to ride a bicycle or catch a ball. On a neurobiological level, that deceptively simple finding could profoundly shape how future devices help those who have suffered a stroke or paralysis.
Researchers at the University of Washington sought help from epileptics who had electrodes temporarily implanted in their brains as part of a medical treatment aimed at curing their seizures. They asked volunteers to play a simple computer game from the age of Pong: Raise or lower a cursor that is cruising slowly from left to right on a computer screen and make it touch an object above or below it. But the participants could not move. They had to raise it by imagining themselves executing some kind of motion and lower it by "resting" from that act of imagining.
The seven subjects were able to change the trajectory of the cursor. From an engineering perspective, they modulated power in a narrow frequency band of electrical activity recorded over the primary motor cortex.
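The control scheme described above can be sketched in code: estimate the power in a narrow frequency band of a recorded signal window, then move the cursor up when that power exceeds a baseline threshold and down when it falls below. This is a minimal illustration, not the study's actual pipeline; the sampling rate, the 70-100 Hz band, and the thresholding rule here are all assumptions made for the example.

```python
import numpy as np

FS = 1000            # sampling rate in Hz (assumed for illustration)
BAND = (70, 100)     # high-gamma band in Hz; illustrative, not the study's exact band

def band_power(window, fs=FS, band=BAND):
    """Mean spectral power within a narrow frequency band of one signal window."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].mean()

def cursor_step(window, threshold, gain=1.0):
    """Map band power to a vertical cursor step: above threshold -> up, below -> down."""
    return gain if band_power(window) > threshold else -gain
```

In this toy version, imagining movement would raise band power and drive the cursor up; "resting" would let the power fall back below threshold and drive it down.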
"From their perspective, what they're doing is they're either imagining the movement of their hand or their tongue if they want the cursor to move up, or they're relaxing when they want the cursor to move down," said Jeremiah Wander, a bioengineer at the University of Washington, whose study was published Monday in Proceedings of the National Academy of Sciences.
As the subjects saw more success on the screen, however, activity waned in the areas associated with learning a new task, the researchers found.
That change in brain activity corresponds to a familiar moment for anyone who has played a video game or learned to touch-type: You're no longer concentrating on all the motions that come between intention and execution. You shift from cognitive to automatic. The brain reorganizes.
"What's really neat about it is, as people get better and better at this, they tend to stop thinking about: 'OK, I'm imagining moving my hand,' or 'I'm imagining resting,'" said Wander. "They just think about making the cursor move up and down. Oftentimes, when you ask people, 'So, what were you thinking about?' they say: 'Well I don't know; I was just thinking about the cursor going up.' "
The experience, as Wander can testify, can be freaky. "Honestly, people think it's pretty cool," he said. "I've done it myself, using non-invasive signals, and it's really intriguing. It's different from the way you interact with the world."
Connecting brain waves with computerized robotics has long captured the public imagination.
Perhaps the most famous device, used by renowned physicist Stephen Hawking to communicate, is not really a computer-brain interface. It reads and interprets facial movements from the scientist, who suffers from amyotrophic lateral sclerosis, commonly known as Lou Gehrig's disease.
A bioengineering company in San Diego last year fitted Hawking with a type of headband, called the iBrain, that reads his brain's electrical signals in an attempt to create a more direct link between his thoughts and words.
Many scientists working on the brain-computer conundrum have relied on controls based on scalp-mounted, temporary electrodes such as those used in electroencephalograms, or EEGs. Those devices, which tend to focus on motor-control areas of the brain, may not pick up the subtle changes in signalling that Wander's team detected.
And a lot goes on south of the brain, where the spine helps convert electrical impulses into complex motion. Brain-computer interfaces will have to replace that function and learn to adapt as the computer in our skull makes the most efficient use of its 1.5 kilograms of organic matter.
-- Los Angeles Times