Sunday, June 23, 2013

Technological Telekinesis: The Science of Using Thought to Control Objects

 

It’s ok. You can admit it.

 

You've thought about moving stuff with your mind at least once in your life. If not recently, then definitely as a kid. Maybe you'd just watched X-Men or Star Wars for the first time and thought about how cool it would be to make objects fly around just by thinking about it. Unfortunately, as with all psychic fantasies, it probably ended with you staring at something for way too long with that classic telekinetic look of intense constipation.

Although we may have failed, a few scientists with some ingenious ideas and noble goals have started projects that are bringing us closer to that dream. Some of these projects are giving people the ability to control robot hands and other tools with their thoughts. But more importantly, this research is pushing the boundaries of what's possible and improving the quality of life for people living with paralysis.

The video below is just a peek at some of the work in this field. It shows Matt, a quadriplegic patient, hooked up to a machine that allows him to use signals from his brain to control a robot hand.

[Video: Matt using signals from his brain to control a robot hand]

As amazing as that is, this was just the early work, from over seven years ago. More recently, a few research groups (including the one responsible for the hand movement you just saw) have made some improvements to these neural prosthetics. The new additions include a full robotic arm, more life-like movements and the ability to lift objects.

Cathy, a quadriplegic patient who worked with the same research group, was able to use signals from her brain to control the upgraded prosthetic and use it to drink coffee.

[Video: Cathy using the robotic arm to drink coffee]

It seems unreal that someone outside of a movie screen can actually do this. So you might be wondering, "How is this possible?"

It's because scientists have taken advantage of the way our brain cells, called neurons, communicate. In our brains, there are groups of neurons with different jobs. When the neurons responsible for body movement, known as motor neurons, "talk" to your body, they can make muscles move.

A big part of this neuronal communication is electrical. It seems weird to think that any part of you has electricity in it, but it's true. Very tiny amounts of electricity are generated in individual neurons when they "talk". Which of your motor neurons are talking, and how often they talk, determines which of your muscles move and how.

So just as the lights on a marquee are arranged in a different pattern for every word they spell, the same is true for every body movement you make: a different combination of motor neurons talking drives each of your movements. These combinations are known as patterns of motor neuron activity.

It's these patterns of motor neuron activity in the patients' brains that scientists are using to control the robot arms.
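If you like to see things concretely, here's a toy sketch in Python (with completely made-up firing rates, not real recordings) of how a computer might represent these patterns. Each movement gets its own combination of neuron activity, just like each word on the marquee gets its own combination of lights:

```python
import numpy as np

# Hypothetical firing rates (spikes per second) for five recorded neurons.
# Like letters on a marquee, a different combination "lights up" per movement.
patterns = {
    "open hand":  np.array([42.0,  5.0, 30.0,  8.0, 19.0]),
    "close hand": np.array([ 6.0, 38.0,  9.0, 27.0, 11.0]),
    "flex arm":   np.array([15.0, 12.0, 44.0,  3.0, 33.0]),
}

for movement, rates in patterns.items():
    print(f"{movement:>10}: {rates}")
```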

Unfortunately, unlike Jedi, X-Men and little girls named Matilda, if you wanted to use these patterns of motor neuron activity to directly control things other than your body, you'd run into a problem.

Although one pattern of motor neuron activity may make your arms flex and another might get your legs to extend, these patterns mean as much to a robot arm as you yelling gibberish at it with your mouth full. The many years of your development and all your experience driving your own movements have tuned this neuron talk into a special language that only your body really understands.

In order to let the neurons in your head talk to objects outside the body, there needs to be something to bridge the gap and teach the language of the motor neurons to the robot arm.

Understanding this issue, scientists developed Brain-Machine Interfaces (BMIs), which act like the robot arm's version of Google Translate™. It's the job of the BMI to take the language of the motor neurons and, in real time, tell the robot arm what the patterns mean in a language it can understand. When the robot arm receives the translation, it can perform the same motions that the motor neuron activity would normally drive in the person's own arm. These BMIs, which seem so complicated and almost defy logic by allowing thought to control objects, are made up of two basic parts (with a toy code sketch after the list):

1) A Sensor. This is what listens to what the neurons of the human or animal are saying. It's usually composed of tiny metal electrodes that are literally plugged into the exposed brain, where they detect electrical changes in areas where individual neurons or groups of neurons live. This is connected to…

2) A Computer. The backbone of the operation. It records and analyzes the neuron talk sent to it by the sensor, then converts that information into a command matching the desired action. That command drives the connected thing you're trying to control: a robot arm, a mouse cursor, and sometimes even another animal.
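Put those two parts together and you get the basic loop of a BMI. Here's a heavily simplified sketch; the read_sensor function below is a stand-in I made up, since a real sensor streams amplified electrical recordings from implanted electrodes:

```python
import numpy as np

rng = np.random.default_rng(0)

def read_sensor():
    """Stand-in for the electrode array: returns one firing-rate vector.
    A real sensor reads amplified electrical signals from the brain."""
    return rng.normal(loc=20.0, scale=5.0, size=5)

def decode(rates, templates):
    """The 'computer' half: find the known pattern closest to what the
    sensor just heard and return the matching command."""
    return min(templates, key=lambda cmd: np.linalg.norm(rates - templates[cmd]))

# Hypothetical command templates learned during calibration (see below).
templates = {
    "open hand":  np.full(5, 25.0),
    "close hand": np.full(5, 15.0),
}

print("command:", decode(read_sensor(), templates))
```

Everything interesting happens in decode: it's the "translation" step that turns neuron talk into a command the machine understands.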

One of the first scientists to use this technology to successfully assist people was Dr. John P. Donoghue. He and his team of researchers wanted to help Matt, a man who had been paralyzed from the neck down a few years earlier by a knife attack. By creating a BMI, which the researchers named BrainGate™, the Donoghue team was able to help Matt, Cathy and a few other paralyzed patients use their thoughts to command physical robot arms and digital mouse cursors, giving them the ability to interact with the world around them.

They were able to do this because they had literally plugged sensors directly into Matt’s brain that could essentially read his mind. The sensors were in Matt’s motor cortex, which is the part of the brain where motor neurons live. The BMI was able to look at the different patterns of activity of Matt’s motor neurons when he imagined performing different tasks.

The researchers would put commands on a screen, like 'open your hand', for Matt to imagine doing. Matt, being paralyzed, couldn't actually do these things, but his brain (because it sat above the site of injury) could still respond just as if he had control of his body.

So Matt would focus and imagine trying to open his hand. The BMI then recorded the pattern of motor neuron activity that happened while he was imagining opening his hand.

Then they asked him to do this over. And over. And over. And over again.

They did this so many times that, when they combined all of the 'open hand' patterns they had collected, they could be really confident about what the ideal 'open hand' pattern looked like.

They had to do this to be confident that they had captured 'open hand' motor neuron talk and not something like 'pinky toe flexing' talk.
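In code, the payoff of all that repetition looks something like this toy example (made-up numbers again): average enough noisy repeats and the reliable part of the pattern emerges.

```python
import numpy as np

rng = np.random.default_rng(42)

# Pretend we recorded 50 'open hand' imagination trials from five neurons.
# Each trial is the true pattern plus trial-to-trial noise.
true_open_hand = np.array([42.0, 5.0, 30.0, 8.0, 19.0])  # made-up rates
trials = true_open_hand + rng.normal(scale=6.0, size=(50, 5))

# Averaging across many noisy repeats recovers a reliable template,
# which is why Matt was asked to repeat the task so many times.
template = trials.mean(axis=0)
print("estimated 'open hand' template:", np.round(template, 1))
```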

Finally, they programmed the robot hand to open only when the BMI detected the same or a very similar 'open hand' pattern from Matt's motor cortex. Luckily, the robot hand was able to respond to Matt's thoughts in real time, allowing Matt to actively turn his imagination into reality as soon as he imagined it.
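A bare-bones version of that final detection step might look like the sketch below, where cosine similarity against the calibrated template (with a made-up threshold) stands in for BrainGate's real decoding, which is considerably more sophisticated:

```python
import numpy as np

def matches(rates, template, threshold=0.98):
    """Fire the command only when the live pattern is similar enough
    to the calibrated template (cosine similarity, a toy stand-in)."""
    sim = rates @ template / (np.linalg.norm(rates) * np.linalg.norm(template))
    return sim >= threshold

template = np.array([42.0, 5.0, 30.0, 8.0, 19.0])  # from the calibration step
live = np.array([40.0, 7.0, 28.0, 9.0, 21.0])      # one moment of brain activity

print("open the robot hand" if matches(live, template) else "do nothing")
```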

They used the same basic procedure to give the patients mental control over full robot arms and even mouse cursors.

However, instead of trying to invent a whole new set of imagined commands for mouse cursor movement, they linked the movements of the mouse to other imagined hand movements.

This time, Matt was told to imagine moving his hand to the right. His motor neuron activity associated with rightward hand movement was detected by the sensor, and the BMI would move the cursor on screen to the right.
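One classic way to turn neuron talk into a direction, and a simplified cousin of the linear decoders used in this line of research, is to give each neuron a 'preferred direction' and let the most active neurons vote. A toy sketch (made-up preferred directions and rates):

```python
import numpy as np

# Give each neuron a made-up 'preferred direction'; the cursor then moves
# in the direction the most active neurons vote for.
preferred = np.array([
    [ 1.0,  0.0],   # neuron 0 fires most for rightward movement
    [-1.0,  0.0],   # leftward
    [ 0.0,  1.0],   # upward
    [ 0.0, -1.0],   # downward
    [ 0.7,  0.7],   # up-and-right
])

def cursor_velocity(rates, baseline=20.0):
    """Weight each neuron's direction by how far its rate rises above baseline."""
    return (rates - baseline) @ preferred

rates = np.array([35.0, 10.0, 22.0, 18.0, 20.0])  # 'imagine moving right'
print("cursor velocity (x, y):", cursor_velocity(rates))
```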

Eventually, the BMI was good enough to allow him to play computer games, open up emails and even draw.


After these experiments were published, it wasn't long before improvements were made by this research team and others. As you can see from the second video, the new additions have come a long way. The field started with a basic thought-controlled hand and moved to the development of a fully functioning arm that someone could use to drink coffee. Although we can't use the technology to become Jedi just yet, the application of BMIs to help the paralyzed is developing rapidly and seems very promising.

 

TL;DR People with paralysis are given a 'helping hand' by computers that use the activity of their brain cells to control robotic limbs and mouse cursors.

 

References:

Hochberg, L. R., Serruya, M. D., Friehs, G. M., Mukand, J. A., Saleh, M., Caplan, A. H., ... & Donoghue, J. P. (2006). Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature, 442(7099), 164-171.

Collinger, J. L., Wodlinger, B., Downey, J. E., Wang, W., Tyler-Kabara, E. C., Weber, D. J., ... & Schwartz, A. B. (2013). High-performance neuroprosthetic control by an individual with tetraplegia. The Lancet, 381(9866), 557-564.

Pais-Vieira, M., Lebedev, M., Kunicki, C., Wang, J., & Nicolelis, M. A. (2013). A brain-to-brain interface for real-time sharing of sensorimotor information. Scientific Reports, 3.

 

Author’s note: If you want more information on this research you can check out this video report on the work done by the Donoghue group: https://www.youtube.com/watch?v=ogBX18maUiM

If you thought this was impressive, there is another group of researchers using the same BMI technology with an M. Night Shyamalan-like twist that allows animals to use their thoughts to alter the behavior of other animals. And if you're lucky, I might tell you about it next time.
