The COCOHA project aims to develop a cognitively controlled hearing aid whose sophisticated acoustic processing is steered by the user via signals measured from the brain. Such a device could be of crucial benefit to the growing population of hearing-impaired and aging people. Millions of people, particularly the elderly, struggle to communicate in noisy environments: 7% of the European population are classified as hearing impaired. Hearing aids can effectively compensate for a simple loss in sensitivity, but they do not restore the ability of a healthy pair of young ears to pick out a weak voice among many, an ability needed for effective social communication.

Decisive technological progress has been made in acoustic scene analysis, using arrays of microphones and beamforming algorithms, or distributed networks of handheld devices such as smartphones. However, uptake of this technology is limited because there is no easy way to steer the device, no way to tell it to direct its processing to the one source among many that the user wishes to attend to. The COCOHA project proposes to use brain signals (EEG) to help steer the acoustic scene analysis, in effect extending the efferent neural pathways that control all stages of auditory processing, from cortex down to the cochlea, to govern the external device as well. To succeed we must overcome major technical hurdles, drawing on methods from acoustic signal processing and on machine-learning techniques borrowed from the field of brain-computer interfaces. Along the way we will probe interesting scientific problems related to attention, to the electrophysiological correlates of sensory input and brain state, and to the structure of sound and brain signals.
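For context, the kind of microphone-array beamforming mentioned above can be illustrated with a minimal delay-and-sum beamformer, which time-aligns the channels toward a chosen look direction and averages them. The Python/NumPy sketch below assumes a far-field (plane-wave) source; it is purely illustrative, not COCOHA's actual processing, and the function name and parameters are assumptions.

```python
import numpy as np

def delay_and_sum(signals, mic_positions, direction, fs, c=343.0):
    """Illustrative far-field delay-and-sum beamformer.

    signals:       (n_mics, n_samples) microphone recordings
    mic_positions: (n_mics, 3) microphone coordinates in metres
    direction:     (3,) unit vector pointing from the array toward the source
    fs:            sampling rate in Hz; c: speed of sound in m/s
    """
    n_mics, n_samples = signals.shape
    # Relative arrival times for a plane wave: mics farther along the look
    # direction receive the wavefront earlier, hence the minus sign.
    delays = -(mic_positions @ direction) / c   # seconds, per mic
    delays -= delays.min()                      # lateness relative to earliest
    # Advance each channel by its lateness using a frequency-domain phase
    # shift (handles fractional-sample delays).
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)      # (n_freqs,)
    spectra = np.fft.rfft(signals, axis=1)              # (n_mics, n_freqs)
    shifts = np.exp(2j * np.pi * freqs[None, :] * delays[:, None])
    aligned = np.fft.irfft(spectra * shifts, n=n_samples, axis=1)
    return aligned.mean(axis=0)                 # beamformed mono signal
```

Averaging the aligned channels reinforces the source in the look direction while off-axis sources add incoherently; the open problem COCOHA addresses is how to choose that look direction from the user's brain signals.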
• Data has been gathered from behavioural and EEG experiments with normal listeners to better understand auditory attentional processes and maximize their measurable correlates.
• New computational methods have been developed to decode a subject’s attention from brain signals and to derive reliable control signals using recent machine-learning techniques (one common decoding approach is sketched after this list).
• Hardware/software solutions have been investigated to handle the potentially wide range of device configurations (on ear, wearable, distributed and collaborative networks of devices and microphones) in a uniform way.
• Prototypes have been implemented by the industrial partner using hearing aid technology, to investigate feasibility and verify usability.
• An international scientific workshop was organized to bring in the latest expertise, publicize our project, and sensitize the community to its goals.
• Behavioral and EEG experimental paradigms have been complemented with additional techniques such as pupillometry and extended to better probe and understand basic attentional processes.
• Substantial progress has been made on EEG signal processing and classification algorithms, improving the reliability of decoding decisions and reducing the duration of data that must be recorded to make them.
• New EEG-based control paradigms have been piloted that potentially allow additional control information to be exploited by decoding algorithms.
• Our industrial partner has made progress in developing (and in some cases patenting) techniques likely to be implemented in marketable hearing aid devices.
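To illustrate the attention decoding referred to in the list above: a widely used technique in the auditory attention decoding literature is stimulus reconstruction, in which a linear (ridge-regression) decoder is trained to reconstruct the attended speech envelope from time-lagged EEG, and attention is attributed to whichever competing speaker's envelope correlates best with the reconstruction. The Python sketch below is a minimal version of that generic technique, not the project's actual algorithms; all names and parameters (e.g. `reg`, `lags`) are assumptions.

```python
import numpy as np

def lagged(eeg, lags):
    """Stack time-lagged copies of the EEG (n_samples, n_channels).

    np.roll wraps around at the edges; a real implementation would trim
    or zero-pad, but we ignore that here for simplicity.
    """
    return np.hstack([np.roll(eeg, -lag, axis=0) for lag in lags])

def fit_decoder(eeg, attended_env, lags, reg=1e3):
    """Train a ridge-regression decoder mapping lagged EEG to the
    attended speech envelope (n_samples,)."""
    X = lagged(eeg, lags)
    return np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]),
                           X.T @ attended_env)

def decode_attention(eeg, env_a, env_b, w, lags):
    """Reconstruct the envelope from EEG, then pick the speaker whose
    actual envelope correlates best with the reconstruction."""
    recon = lagged(eeg, lags) @ w
    r_a = np.corrcoef(recon, env_a)[0, 1]
    r_b = np.corrcoef(recon, env_b)[0, 1]
    return ("A" if r_a > r_b else "B"), (r_a, r_b)
```

With EEG downsampled to, say, 64 Hz, `lags = range(0, 17)` spans roughly 0 to 250 ms, a typical window for cortical tracking of speech. The trade-off the project's bullet points allude to is that shorter decision windows make control more responsive but lower the correlation estimates' reliability.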
More info: https://cocoha.org.