
Project

Towards natural brain-computer interfaces: representation learning for identifying the temporal encoding of realistic video footage in EEG

Brain-computer interfaces (BCIs) facilitate interaction between the brain and a computer or machine. Electroencephalography (EEG) is by far the most popular non-invasive BCI modality, because it is cheap, mobile, and has the excellent temporal resolution needed to track neural responses that are time-locked to a sensory stimulus. However, traditional BCI paradigms rely heavily on synthetic, controlled sensory stimuli and on active participation of the user, which makes it very hard to integrate such paradigms into practical 'everyday-life' use cases. If BCI technology were able to cope with uncontrolled, natural sensory stimuli, it could blend in naturally with the user's normal behavior and activities.

Inspired by recent breakthroughs in decoding EEG responses to speech, this project pursues the challenging goal of designing a new data-driven methodology to identify and quantify the temporal coupling between natural video footage and its EEG responses. To this end, we will leverage representation learning techniques from the field of computer vision and rethink them in combination with recent insights in EEG decoding and visual BCI. We will also investigate whether and how this framework can be used to track visual and spatial attention to natural stimuli. Not only would this be a game-changing tool for various experiment-driven research fields in neuroscience and the medical sciences, but it would also pave the way towards many new BCI applications in a range of domains.
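
Purely as an illustration of what quantifying such temporal coupling could look like, the sketch below borrows the linear forward-model (temporal response function) approach commonly used in EEG speech decoding: a time-lagged video feature is regressed onto the EEG, and the correlation between predicted and measured EEG serves as a coupling score. All names, parameter values, and the synthetic data are assumptions made for this sketch only; the project itself aims to replace the hand-crafted feature with learned video representations.

```python
import numpy as np

# --- Hypothetical setup: synthetic data standing in for real recordings ---
fs = 64                      # EEG sampling rate (Hz) after downsampling
n_samples = fs * 120         # two minutes of data
n_channels = 16              # number of EEG channels

rng = np.random.default_rng(0)

# Frame-wise "motion energy" of a video, resampled to the EEG rate.
# In the project, this hand-crafted feature would be replaced by a
# learned video representation.
video_feature = rng.standard_normal(n_samples)

# Simulated EEG: a delayed copy of the stimulus plus noise, so that
# some temporal coupling actually exists in this toy example.
true_lag = int(0.1 * fs)     # ~100 ms visual response latency
eeg = 0.5 * np.roll(video_feature, true_lag)[:, None] \
      + rng.standard_normal((n_samples, n_channels))

# --- Forward (encoding) model: predict EEG from lagged stimulus ---
lags = np.arange(0, int(0.4 * fs))                 # 0-400 ms of lags
X = np.stack([np.roll(video_feature, L) for L in lags], axis=1)
X[:lags.max()] = 0                                 # discard wrap-around samples

ridge = 1e2 * np.eye(X.shape[1])                   # ridge regularization
W = np.linalg.solve(X.T @ X + ridge, X.T @ eeg)    # lags x channels
eeg_pred = X @ W

# Per-channel correlation between predicted and measured EEG quantifies
# how strongly the EEG is temporally coupled to the video feature
# (in practice this would be evaluated with cross-validation).
coupling = [np.corrcoef(eeg_pred[:, c], eeg[:, c])[0, 1]
            for c in range(n_channels)]
print("mean stimulus-EEG coupling:", np.mean(coupling))
```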

Date: 1 Jan 2022 → Today
Keywords: natural brain-computer interfaces
Disciplines: Computer vision, Pattern recognition and neural networks, Biomedical signal processing