15 MAY 2008
How can you help a person who cannot move after a stroke or a spinal-cord injury? For years, medical researchers and average folks alike have dreamed of bypassing the damage by reading the brain's intention and translating it into action in the real world. In lab tests, such systems have enabled the brain to control a computer without using the spinal cord or muscles.
Courtesy Andrew Schwartz, University of Pittsburgh
Now a University of Pittsburgh group has gone one step further, by training two monkeys to reach for food with a prosthetic arm, again using just the power of thought. It's the first transition from virtual reality to the physical world, says group leader Andrew Schwartz, in the university's department of neurobiology.
The researchers implanted a tiny electrode array in the primary motor cortex, a key movement-control center in the brain, where it could monitor the activity of individual neurons. Then they introduced the monkeys to the robot arm, which they initially controlled with a joystick. "Monkeys have not used a tool before," Schwartz explains. "If they were humans, we would say 'Imagine this is your own arm,' but that's hard to explain to a monkey."
Learning to learn
Next, the monkeys were offered morsels of food, and they did the obvious thing -- reached for them. But since their natural-born forelimbs were trapped inside large tubes, they had to use the prosthetic arm instead. For the first few hours, as the monkeys learned the mental control needed to move the arm, the experimenters helped out by accentuating the movement signals that went in the correct direction.
Although this help was eventually stopped, the assist period was crucial for adapting the control software to the brain, Schwartz says. "We have to get a way of taking these signals from the brain and decoding what they mean." Individual neurons, he adds, are "tuned" to a particular movement; one may signal a downward movement, another a movement to the left. "These neurons have a preferred direction ... and if you listen to them, you aim [the arm] in the direction they are telling you."
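The "preferred direction" idea Schwartz describes is the basis of what neuroscientists call population-vector decoding. The sketch below is my own illustration in Python, not the lab's actual decoder: it assumes a population of neurons with random preferred directions and cosine-like tuning, and treats each neuron's firing above its baseline rate as a weighted "vote" along its preferred direction.

```python
import numpy as np

# Hypothetical population-vector sketch -- an illustration of the idea,
# not the published algorithm. Each neuron gets a random unit-length
# preferred direction in 3-D space.
rng = np.random.default_rng(0)
n_neurons = 200
preferred = rng.normal(size=(n_neurons, 3))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

def decode_direction(rates, baseline):
    """Weight each neuron's preferred direction by its firing rate above
    baseline, sum the votes, and normalize to a unit movement vector."""
    weights = rates - baseline        # firing above baseline = a "vote"
    vector = weights @ preferred      # sum of votes along preferred directions
    norm = np.linalg.norm(vector)
    return vector / norm if norm > 0 else vector

# Simulate cosine-tuned firing for an intended upward reach:
# each neuron fires more the closer its preferred direction is to "up".
intended = np.array([0.0, 0.0, 1.0])
baseline = np.full(n_neurons, 10.0)
rates = baseline + 8.0 * (preferred @ intended)

decoded = decode_direction(rates, baseline)
print(decoded)  # points close to the intended upward direction
```

With enough neurons, the off-target votes cancel and the summed vector converges on the intended direction -- which is why "listening" to many tuned neurons at once works better than listening to any one of them.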
During training, if the computer detects a weak "raise the arm" signal, it responds by sending a strong "raise the arm" signal to the arm. As the monkey observes the movement, the association between the mental activity that created the weak signal and the resulting movement grows stronger, and so does the next "raise the arm" signal.
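One way to picture that training assist -- again a hypothetical sketch, not the published procedure -- is as a gain applied to the part of the decoded command that points toward the target, leaving any off-target part alone:

```python
import numpy as np

def assisted_command(decoded, target_dir, assist_gain):
    """Split the decoded command into its along-target and off-target
    components, then amplify only the along-target part.
    assist_gain = 1.0 means no help; the gain is ramped down as the
    animal improves (an assumption of this sketch)."""
    target_dir = target_dir / np.linalg.norm(target_dir)
    along = np.dot(decoded, target_dir) * target_dir
    off = decoded - along
    return assist_gain * along + off

# A weak "raise the arm" signal that mostly drifts sideways by mistake.
decoded = np.array([0.3, 0.0, 0.1])
target = np.array([0.0, 0.0, 1.0])

helped = assisted_command(decoded, target, assist_gain=4.0)
print(helped)  # the upward component is quadrupled; sideways drift unchanged
```

The monkey sees the arm actually rise, the mental pattern that produced the weak signal is reinforced, and the gain can be withdrawn once the unassisted signal is strong enough -- the loop the paragraph above describes.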
Both visual and gustatory feedback make the movement smoother and more confident, Schwartz says. "The animal is learning to make the neurons fire in a way that produces the action they expect."
Success = a tidbit of food!
Within a few hours of training, the monkeys developed quite a knack for working the arm. One achieved a success rate of 61 percent, while the other, performing an easier task, hit 78 percent.
Granted, the monkeys took three to five seconds to grab the food, longer than the second or two they needed with an original-equipment arm, but the experiment was a real-world proof that thought alone can control physical motion.
And because the action was occurring in the real world, not on a computer screen, the monkeys did more than just the most basic grab-'n-gobble movement. They learned to open the food-bearing claw as it neared the mouth, after realizing that this would speed up the all-important delivery of grapes and marshmallows. If they did not quite place the food in the mouth, they learned to nudge it home with the claw. And when food got smeared on the claw, they paused to lick it off.
"Before, we were operating in virtual reality, computer simulation," says Schwartz. "This was a real-world physical device, so ... all sorts of things happen. The grapes are slipping, the marshmallows are sticking. The animal learns these work-arounds, and that is part of making this feel real."
The monkeys, in other words, were adapting to the situation, reducing their effort while increasing their intake. They were learning to use the mechanical arm as a new limb.
All video stills courtesy © Andrew Schwartz, University of Pittsburgh
The warning label
And what is the downside? In a comment in Nature, John Kalaska of the University of Montreal cautioned that signals from brain electrodes often get noisy and hard to understand after a few weeks or months. Furthermore, the computer technology and human tweaking needed to adapt the system to its user are all more suited to the laboratory than to the real life of patients -- at least for now.
Nonetheless, Schwartz says, the first human trial of a similar system has already begun, with a patient with advanced ALS. Also called Lou Gehrig's disease, this cruel killer can deprive a fully conscious person of all voluntary movement.
The name says it all: These patients are "locked in." Maybe some electrodes, computers and expert help can help them break free.
- David Tenenbaum
• Cortical control of a prosthetic arm for self-feeding, Meel Velliste et al., Nature, published online 28 May 2008.
• Brain control of a helping hand, John F. Kalaska, Nature, published online 28 May 2008.