Neural signals decoded to control robotic arm

Researchers at the Korea Advanced Institute of Science and Technology have developed a mind-reading system that decodes neural signals from the brain during arm movements.

The subjects were instructed to perform reaching and grasping movements to indicate target locations in three-dimensional space. (a) Subjects A and B were given the visual cue as a real tennis ball at one of four pseudo-randomized locations. (b) Subjects A and B were given the visual cue as a virtual reality clip showing a series of five phases of a reach-and-grasp movement (Image: KAIST)

Described in Applied Soft Computing, the method can be used by a person to control a robotic arm through a brain-machine interface (BMI), which converts nerve signals into commands to control a machine.

Two main techniques monitor neural signals in BMIs, namely electroencephalography (EEG) and electrocorticography (ECoG).

EEG records signals from electrodes on the surface of the scalp and is non-invasive, relatively inexpensive, safe and easy to use. However, EEG has low spatial resolution and picks up irrelevant neural signals, making it difficult to interpret an individual's intentions from the recordings.


ECoG is invasive and involves placing electrodes directly on the surface of the cerebral cortex below the scalp. Compared with EEG, ECoG can track neural signals with much higher spatial resolution and less background noise, but the technique also has shortcomings.

“The ECoG is mainly used to find possible sources of epileptic seizures, meaning the electrodes are placed in different locations for different patients and may not be in the optimal regions of the brain for detecting sensory and movement signals,” said Professor Jaeseung Jeong, a brain scientist at KAIST. “This inconsistency makes it difficult to decode brain signals to predict movement.”

Professor Jeong’s team developed a new method for decoding neural ECoG signals during arm movements. The system is based on an ‘echo-state network’, a machine learning system for analyzing and predicting neural signals, and the Gaussian distribution, a mathematical probability model.
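The paper's implementation details are not given here, but the general recipe of an echo-state network paired with a Gaussian readout can be sketched as follows. Everything in this sketch (the channel count, reservoir size, parameter values and the synthetic data) is an assumption for illustration, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN = 32        # hypothetical number of ECoG channels
N_RES = 100      # reservoir size (assumption)
SPECTRAL_RADIUS = 0.9
LEAK = 0.3       # leaky-integration rate

# Fixed random input and recurrent weights; the recurrent matrix is rescaled
# so its largest eigenvalue magnitude equals the chosen spectral radius,
# which keeps the reservoir dynamics stable (the "echo state" property).
W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))
W = rng.uniform(-0.5, 0.5, (N_RES, N_RES))
W *= SPECTRAL_RADIUS / np.max(np.abs(np.linalg.eigvals(W)))

def reservoir_states(u):
    """Drive the reservoir with a (T, N_IN) signal; return (T, N_RES) states."""
    x = np.zeros(N_RES)
    states = np.empty((len(u), N_RES))
    for t, u_t in enumerate(u):
        x = (1 - LEAK) * x + LEAK * np.tanh(W_in @ u_t + W @ x)
        states[t] = x
    return states

def fit_gaussians(features, labels, reg=1e-3):
    """Fit one multivariate Gaussian per movement class (regularized covariance)."""
    params = {}
    for c in np.unique(labels):
        f = features[labels == c]
        mu = f.mean(axis=0)
        cov = np.cov(f, rowvar=False) + reg * np.eye(f.shape[1])
        params[c] = (mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return params

def classify(feature, params):
    """Pick the class whose Gaussian assigns the feature the highest log-likelihood."""
    def loglik(mu, cov_inv, logdet):
        d = feature - mu
        return -0.5 * (d @ cov_inv @ d + logdet)
    return max(params, key=lambda c: loglik(*params[c]))

# Synthetic demo: two fake "directions" whose inputs differ only in mean level.
X, y = [], []
for label, offset in [(0, 0.0), (1, 0.4)]:
    for _ in range(30):
        trial = rng.normal(offset, 1.0, (100, N_IN))    # fake ECoG trial
        X.append(reservoir_states(trial).mean(axis=0))  # time-averaged state
        y.append(label)
X, y = np.array(X), np.array(y)

params = fit_gaussians(X[::2], y[::2])                    # train on even trials
preds = np.array([classify(f, params) for f in X[1::2]])  # test on odd trials
print("accuracy:", np.mean(preds == y[1::2]))
```

The appeal of this kind of combination is that the recurrent reservoir is never trained: its fixed random dynamics expand the multichannel signal into a rich temporal feature space, and only the lightweight Gaussian readout has to be fitted to each patient's data.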

In the study, the researchers recorded ECoG signals from four individuals with epilepsy as they performed a reach-and-grasp task. Because the ECoG electrodes were placed according to the possible sources of each patient’s seizures, only 22 to 44 percent of the electrodes were located in the brain regions responsible for controlling movement.

During the movement task, participants were given visual cues either by placing a real tennis ball in front of them or via a virtual reality headset with a clip of a human arm reaching forward in first-person view. According to KAIST, they were asked to reach forward, grab an object, then bring their hand back and let go of the object while wearing motion sensors on their wrists and fingers. In a second task, they were instructed to imagine reaching forward without moving their arms.

The researchers tracked the signals from the ECoG electrodes during real and imaginary arm movements and tested whether the new system could predict the direction of the movement from the neural signals. They found that the new decoder classified arm movements into 24 directions in three-dimensional space, in both the real and virtual tasks, and that the results were at least five times more accurate than chance. They also used a computer simulation to show that the new ECoG decoder could control the movements of a robotic arm.
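To put that figure in perspective: chance-level accuracy for a 24-way classification is 1/24, so "at least five times more accurate than chance" implies an accuracy of roughly 21 percent or better. A quick arithmetic check:

```python
n_directions = 24
chance = 1 / n_directions                 # guessing at random among 24 directions
print(f"chance level: {chance:.1%}")      # 4.2%
print(f"5x chance:    {5 * chance:.1%}")  # 20.8%
```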

The team said the next steps will be to improve the accuracy and efficiency of the decoder so that it can be used in a real-time BMI device to help people with movement or sensory impairments.

Abhishek Maheswari