IM2.BMI (Brain Machine Interfaces) - lay summary


Brain Machine Interfaces

IM2.BMI
IP Head: José del R. Millán (IDIAP)
Partners: IDIAP, ASL/ETHZ.
The idea of controlling machines not manually but by mere “thinking” (i.e., by the brain activity of human subjects) has long fascinated humankind, and over the last decade or so researchers working at the crossroads of computer science, neuroscience, and biomedical engineering have developed the first prototypes of brain-machine interfaces (BMI). A BMI monitors the user’s brain activity and translates their intentions into actions, such as moving a wheelchair or selecting a letter from a virtual keyboard, without relying on the activity of any muscle or peripheral nerve. The central tenet of a BMI is the capability to distinguish different patterns of brain activity, each associated with a particular intention or mental task. Our pioneering work on brain-controlled robots and wheelchairs shows that human subjects can operate complex devices using non-invasive electroencephalogram (EEG) signals recorded from electrodes placed on the scalp.
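
To make the pattern-recognition idea concrete, here is a minimal sketch, not the project's actual pipeline: spectral features are extracted from short EEG windows and a statistical classifier is trained to tell two imagined mental tasks apart. The channel count, sampling rate, frequency band, the synthetic Gaussian trials, and helper names such as bandpower_features are illustrative assumptions, not details taken from the project.

# Minimal sketch of EEG pattern recognition for a BMI (illustrative only).
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 512          # sampling rate in Hz (assumed)
N_CHANNELS = 8    # number of scalp electrodes (assumed)
WINDOW = FS       # one-second analysis window

def bandpower_features(eeg_window, fs=FS, band=(8, 30)):
    """Log band power per channel in the mu/beta range (8-30 Hz, assumed)."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs // 2, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.log(psd[:, mask].mean(axis=-1))

# Synthetic stand-in for labelled training trials: class 1 gets slightly more
# power on half of the channels, mimicking a task-specific EEG pattern.
rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):
    for _ in range(100):
        trial = rng.normal(size=(N_CHANNELS, WINDOW))
        if label == 1:
            trial[:4] *= 1.5
        X.append(bandpower_features(trial))
        y.append(label)
X, y = np.array(X), np.array(y)

# A simple linear discriminant stands in for the statistical pattern
# recognition that maps EEG patterns to intended mental tasks.
clf = LinearDiscriminantAnalysis()
print("Cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())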

In this IP we will continue our work on brain-controlled robots and neuroprostheses, where fast decision-making is critical. In addition, we will pursue a complementary avenue aimed at improving interaction by incorporating EEG correlates of high-level cognitive and affective states into the decision-making module of intelligent devices. Examples of such cognitive and affective states are errors, alarms, attention, and fatigue; examples of intelligent devices are wheelchairs and browsers. The ability to detect and adapt to these states would enable the BMI to interact with the user in a much more meaningful way. For instance, we have found that when a BMI fails to recognize the user’s intention, there is an EEG potential associated with the subject’s awareness of the erroneous response. Detecting these error-related potentials will greatly increase the reliability and performance of any brain-coupled interactive device, which could even learn from its own mistakes to better fit the subject’s preferences and expert knowledge.
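
As a hedged sketch of how such error-related potentials could be exploited, the toy code below classifies the EEG epoch that follows the device's feedback as "error perceived" or "correct" and vetoes the decoded command when an error is detected. The epoch length, channel count, the synthetic error deflection, and names such as make_epoch and execute are assumptions for illustration, not the project's recorded signals or software.

# Illustrative sketch: veto a decoded command when an error potential is detected.
import numpy as np
from sklearn.linear_model import LogisticRegression

FS = 512
N_CHANNELS = 8
EPOCH = int(0.6 * FS)   # 600 ms window after the device's feedback (assumed)

rng = np.random.default_rng(1)

def make_epoch(is_error):
    """Synthetic post-feedback epoch; error trials carry an extra deflection."""
    epoch = rng.normal(size=(N_CHANNELS, EPOCH))
    if is_error:
        t = np.arange(EPOCH) / FS
        epoch += 2.0 * np.exp(-((t - 0.3) ** 2) / 0.002)   # bump near 300 ms
    return epoch.ravel()   # simple feature vector: concatenated samples

X = np.array([make_epoch(err) for err in ([False] * 150 + [True] * 150)])
y = np.array([0] * 150 + [1] * 150)

detector = LogisticRegression(max_iter=1000).fit(X, y)

def execute(command, post_feedback_eeg):
    """Send the decoded command only if no error potential is detected."""
    if detector.predict(post_feedback_eeg.reshape(1, -1))[0] == 1:
        return f"vetoed '{command}': user perceived an error"
    return f"executing '{command}'"

print(execute("turn left", make_epoch(True)))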

Our work will focus on brain-controlled robots because robot control requires fast and accurate commands and is therefore the most challenging application for a BMI. In particular, we will put a strong emphasis on architectures that incorporate the user’s cognitive state into the decision-making module of the intelligent device. However, it is worth noting that any result achieved in such a demanding experimental setup could easily be transferred to other interaction domains where the pace at which human operators must deliver commands is less demanding.

Keywords: Brain-Machine Interfaces, non-invasive electroencephalogram (EEG), statistical pattern recognition, cognitive states, interaction devices, brain-controlled robots.

