Title Promoter Affiliations Abstract
"Signal processing algorithms for attention decoding of brain responses to natural stimuli in brain-computer interfaces" "Alexander Bertrand" "Dynamical Systems, Signal Processing and Data Analytics (STADIUS), Research Group Experimental Oto-rhino-laryngology" "Brain-computer interfaces (BCI) enable the human brain to interact with machines, opening doors to various high-impact applications. However, most experimental BCI paradigms require the user to concentrate on synthetic and repeated stimuli, inducing fatigue and interfering with natural behavior. This unnatural interaction hampers widespread usage of BCIs in daily-life situations beyond a few niche clinical applications. In this project, we envisage ‘passive’ electroencephalography (EEG)-based BCI applications that track the user’s attention to natural audio-visual stimuli, allowing seamless integration with daily-life activities. However, this shift comes with several fundamental signal processing challenges, such as (1) the low signal-to-noise ratio of neural responses to natural speech or video footage, (2) the strong user-specificity of these responses, and (3) the multi-modal integration of audio-visual stimuli. We will tackle these challenges by designing novel algorithms that are inherently unsupervised (avoiding the need for a dedicated training session for each end-user) and that exploit side information such as knowledge of the stimuli and data from other users. Although we target generic algorithmic tools for EEG-based BCIs with natural stimuli, we envisage specific breakthroughs in the context of (1) neuro-steered hearing devices, (2) educational neuroscience, and (3) objective hearing screening in daily-life environments, which act as the driver applications."
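The abstract above does not commit to a specific decoding method. For context only, a common baseline for this kind of attention decoding is backward (stimulus-reconstruction) decoding: a linear decoder maps time-lagged multi-channel EEG to an estimate of the attended speech envelope, and the attended source is the candidate whose envelope correlates best with that reconstruction. A minimal sketch under these assumptions (Python/NumPy; all function and variable names are hypothetical, not the project's actual algorithms):

import numpy as np

def lagged(eeg, n_lags):
    # Stack time-lagged copies of each EEG channel: (T x C) -> (T x C*n_lags).
    T, C = eeg.shape
    X = np.zeros((T, C * n_lags))
    for lag in range(n_lags):
        X[lag:, lag * C:(lag + 1) * C] = eeg[:T - lag, :]
    return X

def train_decoder(eeg, attended_envelope, n_lags=32, ridge=1e3):
    # Ridge-regularized least-squares backward model: EEG lags -> attended envelope.
    X = lagged(eeg, n_lags)
    R = X.T @ X + ridge * np.eye(X.shape[1])
    return np.linalg.solve(R, X.T @ attended_envelope)

def decode_attention(eeg, candidate_envelopes, decoder, n_lags=32):
    # Reconstruct the envelope from the EEG and pick the candidate source whose
    # envelope correlates best with the reconstruction.
    est = lagged(eeg, n_lags) @ decoder
    corrs = [np.corrcoef(est, env)[0, 1] for env in candidate_envelopes]
    return int(np.argmax(corrs)), corrs

In the project's terms, the supervised training step in this sketch is exactly the per-user calibration that the proposed unsupervised algorithms aim to avoid.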
"Distributed signal processing algorithms for wireless EEG sensor networks with applications in auditory-based brain-computer interfaces" "Alexander Bertrand" "Dynamical Systems, Signal Processing and Data Analytics (STADIUS)" "Electroencephalography (EEG) is a cheap and non-invasive neuromonitoring technique to measure electrical potentials generated by the brain. Recently, the concept of a wireless EEG sensor network (WESN) has been proposed, in which the head is covered with a multitude of EEG nodes with facilities for local signal processing (SP) and wireless communication. Since they are amenable to extreme miniaturization and low-power system design, it is believed that such WESNs are an enabling technology for long-term wearable EEG monitoring. Since the wireless transmission is the most power-hungry component, it is crucial to minimize the amount of data that is to be transmitted. To this end, we will develop novel distributed algorithms to solve multi-channel EEG SP tasks, while avoiding energy-inefficient data centralization. We will focus on algorithms for artifact removal and extraction of brain responses, in particular for two different auditory-based brain-computer interfaces (BCIs)." "Distributed signal processing algorithms for wireless EEG sensor networks with applications in auditory-based brain-computer interfaces." "Alexander Bertrand" "ESAT - STADIUS, Stadius Centre for Dynamical Systems, Signal Processing and Data Analytics" "Electroencephalography (EEG) is a cheap and non-invasive neuromonitoring technique to measure electrical potentials generated by the brain. Recently, the concept of a wireless EEG sensor network (WESN) has been proposed, in which the head is covered with a multitude of EEG nodes with facilities for local signal processing (SP) and wireless communication. Since they are amenable to extreme miniaturization and low-power system design, it is believed that such WESNs are an enabling technology for long-term wearable EEG monitoring. Since the wireless transmission is the most power-hungry component, it is crucial to minimize the amount of data that is to be transmitted. To this end, we will develop novel distributed algorithms to solve multi-channel EEG SP tasks, while avoiding energy-inefficient data centralization. We will focus on algorithms for artifact removal and extraction of brain responses, in particular for two different auditory-based brain-computer interfaces (BCIs)." "Signal processing for EEG-based brain-computer interfaces" "Alexander Bertrand" "Processing Speech and Images (PSI), Dynamical Systems, Signal Processing and Data Analytics (STADIUS)" "Electroencephalography (EEG)-based brain-computer interface (BCI) offers advantages in terms of temporal resolution, cost, and mobility, and is therefore the most popular non-invasive BCI modality to date. However, the current paradigm has a narrow application scope as it relies heavily on synthetic stimuli, multi-trial averaging techniques and the active participation of subjects. This project takes a step forward to real-world settings, aiming to identify and quantify the temporal coupling between natural video clips and elicited EEG responses. We will explore multi-set canonical correlation analysis and its nonlinear extensions to enhance the EEG signals and link them with (generic) stimulus features. In collaboration with researchers in the PSI group, we aim to find a good deep joint video-EEG embedding and perform visual attention decoding." 
"Representation learning for visual brain-computer interfaces" "Tinne Tuytelaars" "Dynamical Systems, Signal Processing and Data Analytics (STADIUS), Processing Speech and Images (PSI)" "Brain-computer interfaces (BCI) facilitate high-bandwidth human-computer and, ultimately, human-human communications. For that, brain activity to present or past sensory experiences are decoded. Inspired by recent breakthroughs toward decoding responses to speech, this project aims to design a new data-driven methodology to identify and quantify the temporal coupling between natural video footage and its elicited responses. The most popular non-invasive method of capturing postsynaptic potentials is electroencephalography (EEG) because it is inexpensive, portable, and provides excellent temporal resolution to track neural responses that are time-locked to a sensory stimulus. However, data scarcity and a low signal-to-noise ratio are major challenges. In addition, traditional BCI paradigms rely on controlled environments and the active user participation, making the integration of such paradigms into real-world use cases very difficult." "Brain-Computer Interfaces With Machine Learning" "Joni Dambre" "Department of Electronics and information systems" "Brain-Computer Interfaces are used for communication through brain signals, created on application of a stimulus. The usability of these systems is highly dependend on the speed and the need for calibration. The use of Machine Learning techniques is investigated to eliminate the need for calibration. Also the combination of these techniques with the stimuluspattern is investigated to speed up the system." "Exploring the neural coding in behaving animals by novel optogenetic, high-density microrecordings and computational approaches: Towards cognitive Brain-Computer Interfaces (ENLIGHTENMENT)." "Michele Giugliano" "Theoretical neurobiology" "In this project we aim to investigate the mechanisms involved in memory storage in the brain by a combination of advanced multisite, single unit neural activity monitoring, closed-loop patterned and cell specific activations, and computational techniques, that would allow developing ways to stimulate brain networks in an activity-driven fashion. Combining neuroscience, neuroengineering and computational methods, we intend to create a technological platform for directly interacting with cell assemblies in a two-way dialogue." "Towards natural brain-computer interfaces: representation learning for identifying the temporal encoding of realistic video footage in EEG" "Alexander Bertrand" "Dynamical Systems, Signal Processing and Data Analytics (STADIUS), Processing Speech and Images (PSI)" "Brain-computer interfaces (BCIs) facilitate interaction between the brain and a computer or machine. Electroencephalography (EEG) is by far the most popular non-invasive BCI modality, because it is cheap, mobile, and it has an excellent temporal resolution to track neural responses that are time-locked to a sensory stimulus. However, traditional BCI paradigms heavily rely on synthetic and controlled sensory stimuli and an active participation of the user, making it very hard to integrate such paradigms in practical ‘everyday-life’ use cases. If BCI technology would be able to cope with uncontrolled and natural sensory stimuli, it would be able to naturally blend in with normal behavior and activities of the user. 
Inspired by recent breakthroughs towards decoding EEG responses to speech, this project has the challenging goal to design a new data-driven methodology to identify and quantify the temporal coupling between natural video footage and its EEG responses. To this end, we will leverage representation learning techniques from the field of computer vision, and rethink them in combination with recent insights in EEG decoding and visual BCI. We will also investigate whether and how this framework can be used to track visual and spatial attention to natural stimuli. Not only would this be a game-changing tool for various experiment-driven research fields in neuroscience and medical sciences, it would also pave the way towards many new BCI applications in various domains." "Brain-Computer Interface for real-life applications" "Kevin De Pauw" "University of Warwick, Physiotherapy, Human Physiology and Anatomy" "Throughout the years MFYS has built expertise around the evaluation process of different types of robotic prototypes and commercially available devices, i.e. industrial exoskeletons, motorized lower-extremity prostheses, orthoses and cobots, in terms of (psycho-)(electro-) physiological and biomechanical data gathering. Motorized robotic devices are being launched on the market, because of the significant impact on reducing physical and cognitive load, and improving the comfort of the user and in general the quality of life. However, a smooth human-robot interaction is still an issue and, consequently, motorized robots are scarce on the market, e.g. PowerKnee (Össur, Iceland) and the Cray X (German Bionics, Germany) are the sole motorized knee prosthesis and industrial exoskeleton, respectively. To obtain an optimal human-robot interaction both the design of software and hardware should be improved. The missing link for an optimal human-robot interaction at the level of software is the integration of neural information, coming from the muscle and/or the brain, into the design of a robotic device. The current project Brain-Computer Interface for real-life applications focusses on the level of the brain, since human movement is initiated at the motor cortex. This means that electrical brain activity using electro-encephalography precedes before the onset of human movement. When successfully extracting pre-movement onset indicators and integrating this information into the design of a robotic device, the device will immediately respond to the needs of the user. The major goal of the project proposal is to develop software for brain-computer interface (BCI) integration into robotic devices for daily use. To achieve this goal the project is divided into several work packages. In the first year, a thorough literature search on state-of-the-art neural networks within the BCI domain will be conducted and first single-subject experiments to set up a first software environment. The second year will be allocated to data acquisition via open sources and well-designed experiments. The plan for the third year is to develop, train and validate the model, and integrate the model into the design of a robotic device (industrial exoskeleton or prosthetic device). Eventually, the robotic prototype with integrated AI will be evaluated in laboratory conditions and during real-life activities in the fourth year. The challenging task to control a robotic device with the brain requires a multidisciplinary approach. 
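The abstract above leaves open how the video-EEG coupling would be quantified. One evaluation that is common in related EEG-decoding work, and that could serve as a reference point here, is a match-mismatch test: given an EEG segment and two candidate stimulus-feature segments (the truly time-aligned one and a temporally mismatched one), a model must identify the aligned one, with a chance level of 50%. A minimal sketch (Python/NumPy; hypothetical names), reusing projection matrices such as those returned by the cca() sketch earlier:

import numpy as np

def match_mismatch_accuracy(eeg_segments, matched_feats, mismatched_feats, A, B):
    # For each EEG segment, decide which of two candidate video-feature segments
    # (time-aligned vs. temporally shifted) it belongs to, by comparing the
    # correlations of their first canonical components (columns of A and B).
    correct = 0
    for eeg, f_match, f_mismatch in zip(eeg_segments, matched_feats, mismatched_feats):
        x = (eeg - eeg.mean(axis=0)) @ A[:, 0]
        r_match = np.corrcoef(x, (f_match - f_match.mean(axis=0)) @ B[:, 0])[0, 1]
        r_mismatch = np.corrcoef(x, (f_mismatch - f_mismatch.mean(axis=0)) @ B[:, 0])[0, 1]
        correct += int(r_match > r_mismatch)
    return correct / len(eeg_segments)

In the project itself, the learned video embeddings mentioned in the abstract would replace the hand-crafted feature segments assumed here.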
"Brain-Computer Interface for real-life applications" "Kevin De Pauw" "University of Warwick, Physiotherapy, Human Physiology and Anatomy" "Throughout the years, MFYS has built expertise in the evaluation of different types of robotic prototypes and commercially available devices, i.e. industrial exoskeletons, motorized lower-extremity prostheses, orthoses and cobots, in terms of (psycho-)(electro-)physiological and biomechanical data gathering. Motorized robotic devices are being launched on the market because of their significant impact on reducing physical and cognitive load and on improving user comfort and, in general, quality of life. However, smooth human-robot interaction is still an issue and, consequently, motorized robots are scarce on the market: e.g. the PowerKnee (Össur, Iceland) and the Cray X (German Bionics, Germany) are the sole motorized knee prosthesis and industrial exoskeleton, respectively. To obtain an optimal human-robot interaction, both the software and the hardware design should be improved. The missing link for an optimal human-robot interaction at the level of software is the integration of neural information, coming from the muscles and/or the brain, into the design of a robotic device. The current project, Brain-Computer Interface for real-life applications, focuses on the level of the brain, since human movement is initiated at the motor cortex. This means that electrical brain activity, measured using electroencephalography, precedes the onset of human movement. When pre-movement-onset indicators are successfully extracted and integrated into the design of a robotic device, the device will immediately respond to the needs of the user. The major goal of the project proposal is to develop software for brain-computer interface (BCI) integration into robotic devices for daily use. To achieve this goal, the project is divided into several work packages. In the first year, a thorough literature search on state-of-the-art neural networks within the BCI domain will be conducted, together with first single-subject experiments to set up an initial software environment. The second year will be allocated to data acquisition via open sources and well-designed experiments. The plan for the third year is to develop, train and validate the model, and to integrate it into the design of a robotic device (industrial exoskeleton or prosthetic device). Eventually, the robotic prototype with integrated AI will be evaluated in laboratory conditions and during real-life activities in the fourth year. The challenging task of controlling a robotic device with the brain requires a multidisciplinary approach. The collaboration with ETIS is an added value, since the ETIS Lab has years of experience with BCI research and systems. Our candidate, Arnau Dillen, has worked in this field of research at the AI Lab of VUB for his master's thesis (in collaboration with MFYS). As a computer scientist and software engineer, Arnau has a profile that fits the project proposal and will allow him to successfully complete the predetermined deliverables. To conclude, the project proposal involves translational research in which multidisciplinary collaboration is crucial. Software development will mainly be conducted at ETIS, whereas the more practical side of the project and the BCI integration in the robotic device will be performed at MFYS."