More than 5% of the world’s population suffers from disabling hearing loss. Hearing aids and cochlear implants are crucial for improving their quality of life. However, current hearing technology does not work well in cocktail party scenarios, where several people talk simultaneously. This is mainly because the hearing device does not know which speaker the user is attending to, and hence which speaker should be amplified relative to the background noise. In this project, we have developed novel signal processing algorithms for electroencephalography (EEG)-based auditory attention decoding, which identify the attended speaker from the user’s brain signals and steer the hearing device accordingly. We propose algorithms that are fast, accurate, and able to adapt automatically to (changes in) the EEG data of individual users. These are crucial ingredients towards the realization of practically viable neuro-steered hearing devices.
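A common baseline for EEG-based auditory attention decoding, which the algorithms above build on, is stimulus reconstruction: a linear decoder is trained to reconstruct the attended speech envelope from time-lagged EEG, and at test time the reconstruction is correlated with each speaker's envelope to decide who is attended. The sketch below illustrates this idea on synthetic data; it is a minimal illustration of the general technique, not the chapter's specific algorithms, and all function names and parameters (lag count, ridge strength) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def lagged(eeg, n_lags):
    """Stack time-lagged copies of each EEG channel: (samples, channels*lags)."""
    T, C = eeg.shape
    X = np.zeros((T, C * n_lags))
    for lag in range(n_lags):
        X[lag:, lag * C:(lag + 1) * C] = eeg[:T - lag]
    return X

def train_decoder(eeg, envelope, n_lags=8, ridge=1e-2):
    """Ridge regression mapping lagged EEG to the attended speech envelope."""
    X = lagged(eeg, n_lags)
    G = X.T @ X + ridge * np.eye(X.shape[1])
    return np.linalg.solve(G, X.T @ envelope)

def decode_attention(eeg, env_a, env_b, w, n_lags=8):
    """Reconstruct the envelope, correlate with both speakers, pick the larger."""
    rec = lagged(eeg, n_lags) @ w
    corr = lambda a, b: np.corrcoef(a, b)[0, 1]
    return 0 if corr(rec, env_a) > corr(rec, env_b) else 1

# Toy data: 16-channel EEG driven by speaker A's envelope plus noise.
T, C = 2000, 16
env_a = rng.standard_normal(T)
env_b = rng.standard_normal(T)
eeg = np.outer(env_a, rng.standard_normal(C)) + 0.5 * rng.standard_normal((T, C))

w = train_decoder(eeg, env_a)
print(decode_attention(eeg, env_a, env_b, w))  # attended speaker A -> 0
```

Adapting such a decoder automatically to individual users, without a separate supervised calibration session, is one of the key challenges the project addresses.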
This book chapter is published in Brain-Computer Interface Research, which collects the twelve best BCI projects of the year 2022.