
Project

Acoustic beamforming based on auditory attention decoding

Signal processing algorithms in hearing aids and cochlear implants can suppress background noise to improve speech intelligibility for the hearing impaired. With multiple microphones, beamforming techniques can filter out sound from a target direction and suppress noise from other directions. Traditional binaural beamforming algorithms for hearing devices often assume that the target talker is known or can be derived from the listener's look direction. In practice, however, this assumption is frequently violated, leading to high distortion and sub-optimal noise suppression. Fortunately, recent advances in electroencephalography (EEG) and its application to auditory attention decoding offer a potential solution for tracking the listener's auditory attention in a multi-talker environment. This information can be used to steer the hearing aid's signal processing towards suppressing the unattended sound sources and emphasizing only the attended speaker. In this doctoral project, different beamforming techniques coupled with EEG-informed auditory attention decoding will be explored to optimize noise suppression in hearing aids.
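To illustrate the idea of steering a beamformer with a decoded attention signal, the sketch below implements a simple far-field delay-and-sum beamformer in the STFT domain. This is a minimal toy example, not the project's actual algorithm: the function names, the two-microphone geometry, and the idea of mapping a decoded attended-speaker index to a known talker direction are assumptions made here for illustration; a real EEG-based decoder would supply that index.

```python
import numpy as np

def delay_and_sum_weights(mic_positions, direction, fs=16000, c=343.0, n_fft=512):
    """Far-field delay-and-sum beamformer weights per frequency bin.

    mic_positions: (n_mics, 2) array of microphone coordinates in metres.
    direction: target direction of arrival in radians (0 = along x-axis).
    Returns weights of shape (n_fft//2 + 1, n_mics).
    """
    # Unit vector pointing towards the source; relative time delays per mic.
    u = np.array([np.cos(direction), np.sin(direction)])
    delays = mic_positions @ u / c                      # seconds
    freqs = np.fft.rfftfreq(n_fft, d=1.0 / fs)         # (n_bins,)
    # Steering vectors: phase-align each mic, then average (unit gain on target).
    steer = np.exp(-2j * np.pi * np.outer(freqs, delays))
    return steer / mic_positions.shape[0]

def beamform(stft_frame, weights):
    """Apply beamformer weights to one multichannel STFT frame.

    stft_frame, weights: (n_bins, n_mics). Returns the (n_bins,) output spectrum.
    """
    return np.einsum('bm,bm->b', np.conj(weights), stft_frame)

# Hypothetical link to auditory attention decoding: the decoder outputs an
# index of the attended talker, which selects a (here hard-coded) direction.
talker_directions = {0: 0.0, 1: np.pi / 2}             # radians, assumed known
attended_idx = 0                                        # would come from the EEG decoder
mic_pos = np.array([[0.0, 0.0], [0.04, 0.0]])           # 4 cm two-mic array
w = delay_and_sum_weights(mic_pos, talker_directions[attended_idx])
```

A plane wave arriving from the steered direction is passed with unit gain (the weights undo the inter-microphone delays before averaging), while sources from other directions are attenuated by the resulting phase mismatch; the project would instead explore more powerful binaural beamformers informed by the decoded attention.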

Date: 19 Oct 2020 → Today
Keywords: EEG, Beamforming, Auditory Attention Decoding, Speech Enhancement, Noise Suppression
Disciplines: Audio and speech computing, Signal processing
Project type: PhD project