22/3-2019
“The COCOHA project started as a dream.”
Professor Thomas Lunner, the manager of the Cognitive Hearing Science group at Eriksholm Research Centre, sounds confident as he says the words. In collaboration with several international partners, his group has just completed the major COCOHA project (Cognitive Control of a Hearing Aid) with a positive review from the European Commission.
The dream was, and is, to create hearing aids that are able to ‘read your mind’ and understand what you intend to listen to – and then amplify only that, just as the brain of a normal-hearing person does.
“We have shown that it is possible to steer the sound by reading brain waves or by using the direction of the eyes as a proxy for attention. There was already an assumption that this could work, but showing that it works in real time is something completely new,” explains postdoc and COCOHA researcher Carina Graversen. She continues:
“This is important because we are already considering the next concrete steps to make the dream come true. Controlling hearing aids with your brain is the future.”
EEG signals pick up where your attention is
From the beginning, the key word for COCOHA has been ‘intent’.
“When people with hearing impairment go to a clinic, the first thing they say is: ‘I want a hearing aid that only amplifies the sound I’m interested in.’ To make intelligent hearing aids that do that, we need a way to decode the intent of the user,” says Professor Thomas Lunner.
Research suggesting that it is possible to detect where a person’s attention is directed by reading their brain signals became the basis for the project. Four years later, this has been applied in a hearing aid prototype.
“COCOHA has shown that it is possible, with a traditional EEG cap, to decode attention and steer the sound between two competing talkers. You can now amplify the one you want to listen to with a delay of about 8 seconds and an accuracy of about 90 percent,” explains Thomas Lunner.
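To make the idea concrete: attention decoding of this kind is typically done in the research literature by stimulus reconstruction, where a pre-trained linear decoder maps the multichannel EEG back to an estimate of the attended speech envelope, and that estimate is correlated with the envelopes of the competing talkers over a sliding window, roughly matching the 8-second decision delay quoted above. The sketch below is a minimal illustration of that general technique, not the project’s actual pipeline; all function names and parameters are hypothetical.

```python
import numpy as np

def decode_attended_talker(eeg, env_a, env_b, decoder, fs=64, window_s=8.0):
    """Pick the attended talker from the most recent 8 s EEG window.

    eeg:      (n_samples, n_channels) band-passed, downsampled EEG
    env_a/b:  (n_samples,) speech envelopes of the two competing talkers
    decoder:  (n_channels,) linear backward model trained beforehand
              (hypothetical; real decoders also use a range of time lags)
    """
    n = int(window_s * fs)              # samples per decision window
    eeg_win = eeg[-n:]                  # most recent window
    recon = eeg_win @ decoder           # reconstructed attended envelope

    # Correlate the reconstruction with each candidate envelope;
    # the higher correlation is taken to mark the attended talker.
    r_a = np.corrcoef(recon, env_a[-n:])[0, 1]
    r_b = np.corrcoef(recon, env_b[-n:])[0, 1]
    return ("A", r_a) if r_a > r_b else ("B", r_b)

def apply_gain(audio_a, audio_b, attended, boost=4.0):
    """Amplify the attended stream relative to the competing one."""
    g_a, g_b = (boost, 1.0) if attended == "A" else (1.0, boost)
    return g_a * audio_a + g_b * audio_b
```

In a real system the decoder would be trained per user, and the window length trades off exactly the delay and accuracy figures quoted above: longer windows give more reliable decisions but react more slowly.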
Since hearing aid users find themselves in many different kinds of environments – often with several talkers – the results are of course not yet good enough to be deployed in real life. It is also necessary to achieve equally good results using EarEEG electrodes, since “a full skull is not a nice hearing aid,” as Thomas Lunner puts it.
The results are nevertheless a big step towards steering sound by reading brain waves, which will at some point be precise enough to significantly improve hearing aids.
“Eye gaze steering is already really promising”
While working on EEG signals, the project also expanded to explore another way of decoding the user’s intent: eye gaze. Tracking where the eyes point is another way of reading the listener’s attention, since gaze direction is closely tied to attention.
“It is possible to accurately steer the sound according to where the eyes are pointing, which is a proxy for attention. Eye gaze steering is already really promising. The system reacts really fast. We are still working on the accuracy, but we believe that we will soon get close to 100 percent. And that can be built in-ear in the future,” explains Thomas Lunner.
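As a rough illustration of the gaze-steering idea, here is a minimal sketch assuming an estimated horizontal gaze angle (for example from in-ear electro-oculography) and a set of localized talker directions; the amplification is then steered toward the talker nearest the gaze. This is not the project’s implementation, and the names, parameters, and tolerance threshold are all hypothetical.

```python
import numpy as np

def steer_by_gaze(gaze_deg, talker_angles_deg, tolerance_deg=15.0):
    """Map a gaze-angle estimate to the nearest known talker direction.

    gaze_deg:          horizontal gaze angle relative to the head
    talker_angles_deg: directions of localized talkers, same reference
    Returns the index of the talker to amplify, or None if the gaze
    does not point near any talker.
    """
    diffs = np.abs(np.asarray(talker_angles_deg) - gaze_deg)
    idx = int(np.argmin(diffs))
    return idx if diffs[idx] <= tolerance_deg else None

# Usage: talkers localized at -30° and +20°; the listener looks right.
print(steer_by_gaze(18.0, [-30.0, 20.0]))  # -> 1 (the +20° talker)
```

The appeal of this approach, and a plausible reason the quoted system reacts so fast, is that a gaze estimate updates almost instantly, whereas EEG-based decoding needs seconds of data to accumulate a reliable decision.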