determined object and simply filter out other sound sources?
KOLLMEIER: In the laboratory, yes, but commercial hearing aids are not entirely able to do this yet. We're working on it. The next step with hearing devices is that they must be able to adapt to the listening situation at hand. They must be able to pick out the desired information and to do it in an "intelligent" way, so that the desired information isn't masked. In addition, there should be no need for continuously switching between channels, as happens when a television has a poor selection of available programs – and in the end all you hear is the advertising.
EINBLICKE: But how can a future hearing device "identify"
what I want to listen to?
KOLLMEIER: This may start with gestural control, in other words I point to or look in the direction I want to listen to – that's what we're working on in the laboratory. Of course we haven't got as far as being able to read a person's mind or being able to control a device with our thoughts yet. That would be wonderful! The ideal case would be: I'm standing in a crowd and want to focus on what a certain person in the crowd is saying. The hearing device should be able to tune in to that person – without me having to fiddle around too much with a remote control.
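One way such directional "tuning in" can be approximated is with a steerable microphone beamformer: the device delays and sums its microphone signals so that sound arriving from the indicated direction adds up coherently while other directions cancel. The sketch below illustrates the principle for two microphones; the spacing, sample rate, angle convention and function names are illustrative assumptions, not the actual processing used in the Oldenburg laboratory devices.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
MIC_SPACING = 0.015      # m, assumed distance between the two hearing-aid microphones
SAMPLE_RATE = 16000      # Hz, assumed audio sampling rate

def steer_delay_and_sum(front_mic, rear_mic, look_angle_deg):
    """Delay-and-sum two microphone signals toward a user-indicated direction.

    look_angle_deg: 0 = straight ahead, 90 = to the side (illustrative convention).
    Returns the beamformed signal, which emphasises sound from that direction.
    """
    # Time difference of arrival for a plane wave coming from the look direction
    tau = MIC_SPACING * np.cos(np.deg2rad(look_angle_deg)) / SPEED_OF_SOUND

    # Time-advance the rear microphone in the frequency domain so that a
    # wavefront from the look direction lines up with the front microphone
    n = len(front_mic)
    freqs = np.fft.rfftfreq(n, d=1.0 / SAMPLE_RATE)
    rear_aligned = np.fft.irfft(np.fft.rfft(rear_mic) * np.exp(2j * np.pi * freqs * tau), n)

    # Summing the aligned signals reinforces the target direction,
    # while sources from other directions add up incoherently.
    return 0.5 * (front_mic + rear_aligned)
```

A gesture or gaze sensor would only have to supply the look angle; the same principle extends to larger microphone arrays with finer spatial selectivity.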
EINBLICKE: Does the technology for this already exist?
KOLLMEIER: In its early stages, yes. We call them brain-computer interfaces. By using EEG electrodes we can read a few bits of information from the brain. There's Brainball, for instance, where the players' brainwaves control the ball in a game of table football; there are feedback systems in which a person can steer something by concentrating on it or thinking about it. These are techniques that we would like to apply to hearing devices. However, that's all still a long way in the future. At present we use a machine-learning approach for pattern recognition and ask: which of the identified objects does the patient want to listen to? First attempts have been made here, but in a very simplified form. Or we experiment with a combination of sounds and noises that are modulated at different modulation rates – and can deduce from the rhythm of the brainwaves which sounds the patient is concentrating on. All this, as I say, is still at a very elementary level, but it can be built on and adapted for reality.
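The modulation-rate experiment rests on a simple principle: each competing sound is given its own modulation rhythm, and the EEG follows the rhythm of the attended sound more strongly. A minimal sketch of such a detector is shown below, assuming two streams tagged at illustrative modulation rates; the stream names, rates, EEG sampling rate and the simple spectral-peak comparison are assumptions for illustration, not the group's actual analysis pipeline.

```python
import numpy as np

EEG_SAMPLE_RATE = 250                                    # Hz, assumed EEG sampling rate
STREAM_MOD_RATES = {"speech_A": 4.0, "speech_B": 7.0}    # Hz, illustrative tagging rates

def attended_stream(eeg, sample_rate=EEG_SAMPLE_RATE, mod_rates=STREAM_MOD_RATES):
    """Guess which modulated sound stream the listener is attending to.

    Each competing sound is amplitude-modulated at its own rate; attention
    strengthens the EEG response at the attended rate, so the spectral peak
    at that modulation frequency should be the largest.
    """
    spectrum = np.abs(np.fft.rfft(eeg - np.mean(eeg)))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / sample_rate)

    # EEG magnitude at each stream's modulation rate (nearest FFT bin)
    power_at_rate = {
        name: spectrum[np.argmin(np.abs(freqs - rate))]
        for name, rate in mod_rates.items()
    }
    # The stream whose rhythm dominates the EEG is taken as the attended one
    return max(power_at_rate, key=power_at_rate.get)
```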
EINBLICKE: Do such methods play a role in the Cluster of Excellence application "Hearing4all"?
KOLLMEIER: Yes, it's primarily a matter of basic research here: for example, showing that we can develop such acoustic man-machine interfaces at all, and discovering clever signal processing procedures that integrate models of both the acoustic setting and human hearing. This requires research, research and more research. We want to solve the basic problems that stand in the way of good hearing for everyone. This involves three areas of research: First, we want to develop the diagnostics for hearing disorders. Our modelling framework needs to be improved and we need to find out precisely how the disorders "function", and how to quantify them as precisely and efficiently as possible in individual patients. Secondly, we are trying to optimise hearing systems, above all by combining knowledge from "intelligent" hearing devices and new auditory implants – this is where our colleagues at the Medical University of Hanover play an important role. And thirdly, we want to develop assistive listening technologies in order to enable individuals to participate in social and working life for as long as possible. This area is called "Assistive Listening Devices". This, too, is an extremely important area: after all, every second person over the age of 65 has a hearing impairment that requires treatment.
EINBLICKE: That many?
KOLLMEIER: The fundamental problem nowadays is that too much time is allowed to pass before people start using a hearing device. Men start ten years later than women on average. They are less
A test person at Oldenburg's Haus des Hörens.
"We want to solve the basic
problems that stand in the way of
good hearing for everyone."