Perhaps the best teams are those in which individuals are able to anticipate the next moves of their teammates. By anticipating that next move, team members can reduce delays by providing one another with just what they need, right when they need it. And if those team members happen to be human and robot, giving the robot the ability to read human intentions could also help to keep people safe in these collaborative partnerships.
A team at Loughborough University has taken up the challenge of helping robots to understand human intention in industrial settings, where robots sometimes work closely with humans to offer assistance. Recognizing that human movements are planned in the brain before they occur, they set out to measure and interpret these preparatory brain signals to give robots a heads-up about what a person is about to do.
To accomplish this feat, an electroencephalogram (EEG) was used to record brain activity from the frontal lobe of eight participants. These volunteers were asked to sit at a desk while wearing the EEG and to watch a computer screen. When a letter was displayed on the screen, they were instructed to press the corresponding key on a keyboard. Data from a motion sensor was paired with the EEG data to determine what action was taken, and when.
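Conceptually, that pairing step amounts to cutting out the stretch of EEG that precedes each motion-sensor-detected keypress and labeling it with the arm that moved. The sketch below illustrates the idea; the sampling rate, channel count, window length, and function name are all assumptions for illustration, not details from the study.

```python
import numpy as np

# Hypothetical parameters -- the article does not give the actual
# sampling rate, channel count, or window length the team used.
FS = 250            # EEG sampling rate in Hz (assumed)
N_CHANNELS = 4      # frontal-lobe electrodes (assumed)
WINDOW = 125        # 500 ms of samples preceding each keypress

def label_windows(eeg, press_samples, press_sides):
    """Cut a fixed window of EEG ending at each motion-sensor keypress
    and label it with the arm that moved (0 = left, 1 = right)."""
    X, y = [], []
    for t, side in zip(press_samples, press_sides):
        if t >= WINDOW:                     # skip presses too early to window
            X.append(eeg[:, t - WINDOW:t])  # channels x samples
            y.append(side)
    return np.stack(X), np.array(y)

# Toy demonstration on synthetic data
rng = np.random.default_rng(0)
eeg = rng.standard_normal((N_CHANNELS, 10 * FS))  # 10 s of fake EEG
presses = [300, 900, 1800]                        # keypress sample indices
sides = [0, 1, 0]                                 # left, right, left
X, y = label_windows(eeg, presses, sides)
print(X.shape, y.shape)  # → (3, 4, 125) (3,)
```

Each labeled window can then serve as one training example for the intention classifier.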
Classifying EEG data (📷: A. Buerkle et al.)
This data was used to train a Long Short-Term Memory Recurrent Neural Network to recognize an intention to move either the left or right arm. Validation of this model showed that it was able to signal a human’s intent to move up to 513 milliseconds before the action took place.
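To make the idea concrete, here is a minimal numpy sketch of what such a classifier's forward pass looks like: a single LSTM layer consumes one EEG window sample by sample, and a softmax readout produces left/right probabilities. The hidden size, layer count, and randomly initialized weights are stand-ins; the article does not publish the network's actual architecture or parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
H, D = 16, 4          # hidden size (assumed) and EEG channels per time step

# Random weights stand in for the trained parameters.
Wx = rng.standard_normal((4 * H, D)) * 0.1   # input weights for i, f, g, o gates
Wh = rng.standard_normal((4 * H, H)) * 0.1   # recurrent weights
b = np.zeros(4 * H)
Wout = rng.standard_normal((2, H)) * 0.1     # left/right readout

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_classify(window):
    """Run one EEG window (time x channels) through an LSTM and
    return softmax probabilities for [left, right] intent."""
    h = np.zeros(H)
    c = np.zeros(H)
    for x in window:                  # one recurrence step per EEG sample
        z = Wx @ x + Wh @ h + b
        i, f, g, o = np.split(z, 4)   # input, forget, cell, output gates
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
    logits = Wout @ h
    e = np.exp(logits - logits.max())
    return e / e.sum()

probs = lstm_classify(rng.standard_normal((125, D)))  # one 125-sample window
print(probs.shape)  # → (2,)
```

Run continuously over a sliding window of live EEG, an output like this is what would give a robot its few-hundred-millisecond head start.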
At present, collaborative robots are infrequently used in industrial settings due to safety concerns. And where these robots are in use, they are typically slowed down far below their optimal speed to prevent mishaps. With a method such as the one developed by the Loughborough University team, the potential of industrial robot collaboration may be unlocked.
Before we get ahead of ourselves, there is still some work to be done. The study reported in the paper involved only eight individuals and was conducted in a controlled lab setting. Larger studies are needed to confirm these findings, and in more realistic, industrial-like scenarios where the pressures and distractions of a job may cloud the EEG signal. These concerns aside, this work presents an important proof of concept that may help robots to better understand us in our interactions with them in the future.