Brain-Computer Interface: "Look Ma! No Hands!"

I have long been confident that within my lifetime, there would come a day when we will no longer need keyboards to control our computers. Our minds will be coupled directly to computer systems to control everything from prosthetic limbs, memory and sensory augmentation devices, and vehicles to the more pedestrian computer programs such as word processors and spreadsheets. Perhaps a couple of generations down the line, we will be implanting cellular phone transceivers. We will eventually reach out and touch someone with a mere thought.

What I failed to appreciate, however, was that “within my lifetime,” which I had optimistically hoped would extend to somewhere past 2040 or so, is starting to look more like “within the next couple of decades.”

Eric C. Leuthardt, M.D., an assistant professor of neurological surgery at the WUSTL School of Medicine, and Daniel Moran, Ph.D., assistant professor of biomedical engineering, were able to decode signals from an electrode grid implanted on the surface of a teenager’s brain, and train the teen to control (what else?) a video game using only his imagination. (See a video of the truly wired teen here.)

With the increasing use of functional MRI as a tool to understand cognitive processes, brain-computer interface technologies are advancing at a staggering rate alongside dramatic improvements in neurosurgery. This latest effort was able to leverage some of the latest neurosurgical techniques used to treat epilepsy, wherein a thin grid of electrodes is laminated to the actual surface of the brain in order to triangulate the source location from which seizure-inducing brain activity originates.

From the original paper, Figure 1: Examples of electrode placement and ECoG signals. (a) Intra-operative placement of a 64-electrode subdural array. (b) Post-operative lateral skull radiograph showing grid placement. (c) Raw ECoG signals during control of cursor movement. Black and red traces are from one of the electrodes that controlled cursor movement and are examples for the patient resting and imagining saying the word ‘move’, respectively. (d) Spectra for the corresponding conditions for the final run of online performance.

Leuthardt et al., with their patient’s permission, collected data from the implanted grid and analyzed it to decode the motor control signals the brain was sending to move his fingers and tongue.
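To give a flavor of what "decoding" means here (this is my own toy sketch, not the authors' analysis, and the signal model is entirely made up): the idea is to find electrodes whose activity in a particular frequency band, such as the µ rhythm around 8-12 Hz, correlates with what the patient is imagining. Below, synthetic one-second "trials" stand in for ECoG epochs, with µ-band amplitude dropping during imagined movement, and we check the correlation between band power and the trial label.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000  # assumed sampling rate in Hz (illustrative, not from the paper)

def band_power(signal, fs, lo=8.0, hi=12.0):
    """Total spectral power in a frequency band (default: the mu rhythm)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].sum()

# Toy stand-in for ECoG epochs: 40 one-second trials,
# labelled 0 = rest, 1 = imagined movement.
labels = rng.integers(0, 2, size=40)
trials = []
for y in labels:
    t = np.arange(fs) / fs
    noise = rng.normal(0, 1, fs)
    # Mu-rhythm amplitude drops during imagined movement (desynchronization).
    mu = (2.0 if y == 0 else 0.5) * np.sin(2 * np.pi * 10 * t)
    trials.append(mu + noise)

powers = np.array([band_power(tr, fs) for tr in trials])
r = np.corrcoef(powers, labels)[0, 1]
print(f"correlation between mu-band power and movement label: {r:+.2f}")
```

An electrode showing a strongly negative correlation like this one would be a good candidate for driving the cursor; the real study did this kind of screening across the whole grid and many frequency bands.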

Figure 4 from the original paper shows: ECoG correlations with joystick movement direction before and during movement. (a) Left and center panels: time courses for left and right movements, respectively. Right panel: the absolute value of the difference between left and right time courses. Movement direction is reflected in ECoG across a wide frequency range, including frequencies far above the EEG frequency range. (b) The correlation between the signal shown in (a) and movement direction over the period of movement execution. (c) Correlation for a single electrode location versus the remote reference electrode. The µ rhythm activity predicts movement direction. In (b) and (c), negative and positive correlations are shown with respect to the amplitude of left movement minus right movement. (d) Average final cursor positions predicted by a neural network from ECoG activity are close to the actual average final cursor positions.

After sorting out which signals controlled which movements, they then wired the live brain signals through an artificial neural network simulation that they had trained on the sampled data correlated with cursor movement. The result was a computer program that acts as a translator from the brain’s language into more standard electronic signals, which were then wired up to the famed Atari video game, Space Invaders. With a mere 20 minutes of training, the patient learned how to clear two levels using just his mind, which is better than I did the first several times I played the game in the seventies with my own two hands.
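That "translator" step can be sketched in miniature (again, my own illustrative code, with synthetic features standing in for real ECoG band powers; the network size, learning rate, and feature layout are all assumptions): a small feed-forward network maps per-electrode features to a left/right command probability.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for decoded ECoG features: band powers from four
# electrodes, where (by construction) channel 2 carries the movement intent.
n_trials, n_channels = 200, 4
X = rng.normal(0, 1, (n_trials, n_channels))
y = (X[:, 2] > 0).astype(float)  # 1 = "move right", 0 = "move left"

# One-hidden-layer network trained by plain full-batch gradient descent.
W1 = rng.normal(0, 0.5, (n_channels, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(500):
    h = np.tanh(X @ W1 + b1)             # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()     # P(command = "right")
    grad_out = (p - y)[:, None] / n_trials  # dLoss/dlogit for cross-entropy
    grad_h = grad_out @ W2.T * (1 - h ** 2) # backprop through tanh
    W2 -= lr * h.T @ grad_out
    b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * X.T @ grad_h
    b1 -= lr * grad_h.sum(axis=0)

accuracy = np.mean((p > 0.5) == (y == 1))
print(f"training accuracy: {accuracy:.2f}")
```

Once trained, each new brain-signal sample is pushed through the same forward pass and the output drives the game controls, which is conceptually all the "mind control" amounts to: pattern recognition on a live signal.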

Figure 2 from the original paper: ECoG control of vertical cursor movement using imagination of specific motor or speech actions to move the cursor up and rest to move it down. The electrodes used for online control are circled and the spectral correlations of their ECoG activity with target location (i.e., top or bottom of screen) are shown. Grids for patients B, C and D are green, blue and red, respectively. The substantial levels of control achieved with different types of imagery are evident. The three-dimensional brain model was derived from MRI data.

It is really interesting to start thinking about computing problems like wireless interfaces (Bluetooth?), power supplies (capacitive coupling of microwaves far from H2O resonant frequencies?), and cooling (blood?) when it has to be IN YOUR HEAD!

Who’s up for really getting wired?

Don’t miss the original paper entitled “A brain–computer interface using electrocorticographic signals in humans” and the WUSTL PR page with the live video.



Filed under Health, Science, Technology

2 responses to “Brain-Computer Interface: "Look Ma! No Hands!"”

  1. Anonymous

    Fantastic. I wish this gets developed quickly. I would like to be connected to the car navigation system. Now we depend on a lady’s voice that is kind of cryptic. Wouldn’t that be nice? Phil

  2. Wife at large

    I don’t know … maybe it’s just me, but this is kind of interesting with a liberal dose of creepy. I wonder what everyone will end up doing with their hands once the controls are hard-wired … knitting, anyone?
