Scientists from MIT have developed a new way for humans to train robots using brain signals and hand gestures. Developing robots to perform specific, precise tasks normally requires a huge amount of explicit, language-based programming.
But this new technique means robots can be controlled and trained using unconscious brain signals and intuitive hand gestures. The team behind the breakthrough developed a way to harness brain signals called “error-related potentials” (ErrPs), which occur unconsciously when a person observes a mistake.
System uses unconsciously generated brain signals
The system works by monitoring the brain activity of a person watching a robot at work. If an ErrP occurs because the robot has made a mistake, the robot is notified and pauses to wait for a correction from its human observer. The observer can then correct the mistake with simple hand gestures, which the robot understands through an interface that monitors muscle activity.
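The observe–detect–pause–correct loop described above can be sketched as a simple supervisory routine. This is a hypothetical illustration, not the CSAIL team's actual code: `SimRobot`, `detect_errp`, and `read_gesture` are assumed stand-ins for the real robot interface and EEG/EMG classifiers.

```python
class SimRobot:
    """Toy stand-in for the robot arm; targets are labeled 0, 1, 2."""

    def __init__(self, guess):
        self.guess = guess      # the robot's initial (possibly wrong) choice
        self.position = None

    def choose_target(self):
        return self.guess

    def move_toward(self, target):
        self.position = target

    def pause(self):
        pass                    # a real arm would halt mid-motion here

    def apply_correction(self, target, correction):
        # correction is (direction, steps), e.g. ("left", 1) from wrist EMG
        direction, steps = correction
        return target + steps if direction == "right" else target - steps


def supervise(robot, detect_errp, read_gesture):
    """Run one action under human supervision, as described in the article."""
    target = robot.choose_target()
    robot.move_toward(target)
    if detect_errp():                      # observer's brain flags a mistake
        robot.pause()                      # wait for a correction
        correction = read_gesture()        # decoded from muscle activity
        target = robot.apply_correction(target, correction)
        robot.move_toward(target)
    return target


# Example: the robot guesses target 2, the observer's ErrP fires,
# and a wrist gesture says "one target to the left".
robot = SimRobot(guess=2)
final = supervise(robot, detect_errp=lambda: True,
                  read_gesture=lambda: ("left", 1))
# final is 1: the corrected target
```

In a real system the two lambdas would be replaced by classifiers running on streaming EEG and EMG data; the control flow, however, stays this simple.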
In the accompanying video, a robot called ‘Baxter’ moves a power drill to one of three possible targets. When the robot moves toward the wrong target, the observer’s ErrP signal causes it to pause.
The human observer then moves their wrist to indicate in which direction, and how far, the robot should move the drill. With human supervision, Baxter’s accuracy rose from 70 percent to 97 percent.
“This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback,” says CSAIL Director Daniela Rus, who supervised the work. “By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”
The brain signals are picked up via electroencephalography (EEG) using an electrode-covered cap worn on the scalp. The muscle activity is read via electromyography (EMG) through a series of electrodes on the user’s forearm.
While each of these technologies has its own shortcomings, mainly in detection accuracy, combining them produces a far more robust system. “By looking at both muscle and brain signals, we can start to pick up on a person’s natural gestures along with their snap decisions about whether something is going wrong,” says the project’s lead author, Joseph DelPreto.
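The robustness claim above can be illustrated with a simple agreement-based fusion rule. This is a hedged sketch of the general idea, not the project's actual classifier: `fused_error_decision`, its threshold, and the "gesture confirms a borderline EEG signal" rule are all assumptions for illustration.

```python
def fused_error_decision(errp_prob, gesture_detected, errp_threshold=0.5):
    """Hypothetical fusion rule combining noisy EEG and EMG evidence.

    errp_prob:        EEG classifier's confidence that an ErrP occurred (0-1)
    gesture_detected: whether a corrective EMG gesture was observed
    """
    # Confident EEG evidence alone is enough to flag an error.
    if errp_prob >= errp_threshold:
        return True
    # A corrective gesture confirms a borderline EEG signal that would
    # otherwise be ignored, which is one way combining modalities can
    # recover detections each sensor would miss on its own.
    return gesture_detected and errp_prob >= errp_threshold / 2


fused_error_decision(0.8, False)   # strong EEG evidence alone -> True
fused_error_decision(0.3, True)    # weak EEG, but a gesture confirms -> True
fused_error_decision(0.3, False)   # weak EEG, no gesture -> False
```

The actual system presumably uses trained classifiers rather than fixed thresholds, but the design choice is the same: let one modality compensate for the other's uncertainty.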
“This helps make communicating with a robot more like communicating with another person.” Excitingly, the system is plug-and-play, meaning any user can be connected to the robot without extensive re-training.