
Eye Gaze Behavior when Following Conversations

Martha Shiell

Senior Scientist

Eriksholm Research Centre

Sergi Rotger Griful

Senior Research Manager

Eriksholm Research Centre


Introduction

This work was inspired by the observation that most research on eye gaze and speech has focused on behavior in unnatural situations (e.g., faces speaking single words). Furthermore, the measures of eye gaze behavior applied in previous research (e.g., fixation duration, saccade rate) describe only behavior averaged over listeners and over time. To use eye gaze for audiological applications, we need methods that provide a more nuanced and detailed description.

This work has been partially supported by the Swedish Research Council (Vetenskapsrådet, VR 2017-460 06092 418, Mechanisms and treatment of age-related hearing loss).

Aims

This project aims to generate knowledge that can guide how eye gaze is used in audiological applications. Potential applications include eye gaze as a control signal in hearing aids and as a measure of listeners' experience in real-world settings.

The purpose of this project is (1) to generate a basic understanding of how listeners use their gaze in realistic conversations, and (2) to gain experience generating and analyzing this type of data.

Methodology

We collected data from hearing-impaired participants while they followed a pre-recorded audio-visual conversation with two talkers. 

We studied eye gaze behavior under different conversational turn-taking categories using multilevel logistic regression.
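To illustrate the kind of quantity such a model estimates, the sketch below computes, on synthetic data, the log odds of gazing at the active talker for each turn-taking category. The category names and probabilities are hypothetical (chosen to fall in the 65–80% range reported under Results), and the listener-level random effects of the actual multilevel analysis are omitted, so this is a simplified single-level illustration, not the published method.

```python
import math
import random

random.seed(42)

# Hypothetical turn-taking categories and gaze probabilities (illustrative only).
categories = ["smooth_turn", "pause", "interruption"]
true_p = {"smooth_turn": 0.80, "pause": 0.70, "interruption": 0.65}

# Each sample: (category, 1 if the listener gazes at the active talker, else 0).
data = [(c, 1 if random.random() < true_p[c] else 0)
        for _ in range(2000) for c in categories]

# Per-category proportion of gaze-at-talker samples and its log odds --
# the quantity that a logistic regression models.
results = {}
for c in categories:
    ys = [y for cat, y in data if cat == c]
    p = sum(ys) / len(ys)
    results[c] = (p, math.log(p / (1 - p)))
    print(f"{c}: P(gaze at talker) = {p:.2f}, log odds = {results[c][1]:+.2f}")
```

A multilevel version of this model additionally lets the intercept (and possibly the category effects) vary per listener, which controls for individual gaze habits.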

Results

We found that when listeners followed a pre-recorded conversation between two talkers, different listeners showed similar eye gaze behavior at the same parts of the conversation, while behavior varied considerably across different parts of the conversation. This suggests an advantage for future research on gaze in realistic conversation in combining pre-recorded stimuli with multilevel modelling, because this combination controls for the idiosyncratic nature of realistic conversation.

We also calculated that listeners looked at the talker who held the speaking turn between 65 and 80% of the time. This percentage did not change systematically during pauses, interruptions, or turn changes between the talkers.

Finally, we observed that when the talkers took turns in the conversation, listeners tended to begin looking at the new talker about 200 ms after the onset of that talker's speech, and the highest probability of looking at the new talker occurred 1.4 seconds after this onset.

This figure illustrates the probability (log odds) of the listener's gaze being on the new talker in a conversational turn (floor transfer) over time. We can observe that listeners start looking at a new talker about 200 ms after that talker starts speaking.
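Since the figure reports log odds rather than raw probabilities, a minimal helper (our illustration, not part of the published analysis) shows how to map between the two scales:

```python
import math

def log_odds_to_probability(log_odds: float) -> float:
    """Invert the logit: p = 1 / (1 + exp(-log_odds))."""
    return 1.0 / (1.0 + math.exp(-log_odds))

# Log odds of 0 correspond to a 50% chance that gaze is on the new talker;
# positive log odds mean the listener is more likely than not to be looking.
print(log_odds_to_probability(0.0))  # 0.5
```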


Publications

Shiell, M. M., Høy-Christensen, J., Skoglund, M. A., Keidser, G., Zaar, J., & Rotger-Griful, S. (2023). Multilevel modeling of gaze from listeners with hearing loss following a realistic conversation. Journal of Speech, Language, and Hearing Research, 66(11), 4575–4589. https://doi.org/10.1044/2023_jslhr-22-00641
With, S. B. L. (2021). Word recognition compared to conversation understanding in relation to visual benefit and reading span for normal hearing. Master's thesis, University of Southern Denmark.
Skoglund, M. A., Andersen, M. R., Shiell, M. M., Keidser, G., Rank, M. L., & Rotger-Griful, S. (2022). Comparing in-ear EOG for eye-movement estimation with eye-tracking: Accuracy, calibration, and speech comprehension. Frontiers in Neuroscience, 16. https://doi.org/10.3389/fnins.2022.873201

Team

Martha Shiell

Senior Scientist

Eriksholm Research Centre

Sergi Rotger Griful

Senior Research Manager

Eriksholm Research Centre

Martin Skoglund

Senior Scientist

Eriksholm Research Centre

Gitte Keidser 

Senior Researcher

Eriksholm Research Centre

Johannes Zaar

Senior Scientist

Eriksholm Research Centre

Simon With

Research Audiologist

Eriksholm Research Centre

Jeppe Høy Christensen

Principal Scientist

Eriksholm Research Centre




