Donders Institute for Brain, Cognition and Behaviour
Theme 1: Language and Communication

Speech Perception in Audiovisual Communication (SPEAC)

How is it possible that we can easily have a conversation with someone even if that someone is shouting over several other talkers, speaks in busy street noise, wears a face mask, talks very fast, has a strange accent, or produces uhm’s all the time?

The human brain is uniquely equipped to successfully perceive the speech of those around us – even in quite challenging listening environments. At the SPEAC lab, we investigate the psychological and neurobiological mechanisms that underlie this exceptional human capacity for spoken communication. We specifically focus on how humans integrate input from multiple modalities, including visual cues such as lip movements, facial expressions, and hand gestures.


The work we do contributes to a better understanding of how multimodal spoken communication can usually take place so smoothly. For instance, how do seemingly meaningless up-and-down hand movements, known as beat gestures, influence what words we hear? How do listeners manage to understand talkers in challenging listening conditions, such as in loud background noise or when there are competing talkers around? How do listeners ‘tune in’ to a particular talker with his or her own peculiar pronunciation habits? What is the role of context (i.e., acoustic, semantic, and situational context) in speech processing? Finally, we also develop methodological tools to facilitate research in the speech sciences.

The kinds of behavioral experiments we run include (i) speech categorization tasks with artificially manipulated videos (what’s this word?); (ii) speech-in-noise intelligibility experiments (what’s this sentence?); and (iii) various psycholinguistic paradigms such as repetition priming (e.g., lexical decision). We use eye-tracking to study the time course of speech processing on a millisecond timescale (e.g., the visual world paradigm). We also apply neuroimaging techniques (EEG, MEG, tACS) to uncover the neurobiological mechanisms involved in the temporal decoding of speech, with a particular focus on oscillatory dynamics.
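
As a concrete illustration of how responses in such speech-in-noise intelligibility experiments can be scored, here is a minimal Python sketch of fuzzy string matching between a target sentence and a listener's typed transcript. It is illustrative only: the normalization rules, example sentences, and threshold-free scoring below are assumptions for this sketch and do not reproduce the published tool (Bosker, 2021, listed under Key publications).

    # Illustrative sketch: score a listener's typed response against the target
    # sentence with a standard fuzzy-matching ratio from Python's difflib.
    # The normalization rules below are assumptions, not the published method.
    import difflib
    import string

    def normalize(text: str) -> str:
        """Lowercase and strip punctuation so casing and stray punctuation do not count as errors."""
        return text.lower().translate(str.maketrans("", "", string.punctuation)).strip()

    def transcript_score(target: str, response: str) -> float:
        """Return a 0-1 similarity between the target sentence and the listener's response."""
        return difflib.SequenceMatcher(None, normalize(target), normalize(response)).ratio()

    # A near-miss response with a small typo still scores high:
    print(transcript_score("the cat sat on the mat", "The cat sat on teh mat!"))  # ~0.95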

Wanna know more? Check out our lab website at https://hrbosker.github.io, including some demos of the kinds of experiments we run...

Contact
Name: Hans Rutger Bosker
Email: hansrutger.bosker@donders.ru.nl
Visiting address: Donders Centre for Cognition
Thomas van Aquinostraat 4
6525 GD Nijmegen
The Netherlands
Postal address: Donders Centre for Cognition
P.O. Box 9104
6500 HE Nijmegen
The Netherlands

Lab website

Please see https://hrbosker.github.io for a complete list of publications, demos, research resources, lab documentation, and full contact details.

Key publications
  • Bosker, H. R. (2022). Evidence for selective adaptation and recalibration in the perception of lexical stress. Language and Speech, 65(2), 472-490. doi:10.1177/00238309211030307.
  • Bosker, H. R., & Peeters, D. (2021). Beat gestures influence which speech sounds you hear. Proceedings of the Royal Society B: Biological Sciences, 288: 20202419. doi:10.1098/rspb.2020.2419.
  • Bosker, H. R. (2021). Using fuzzy string matching for automated assessment of listener transcripts in speech intelligibility studies. Behavior Research Methods, 53(5), 1945-1953. doi:10.3758/s13428-021-01542-4.
  • Severijnen, G. G. A., Bosker, H. R., Piai, V., & McQueen, J. M. (2021). Listeners track talker-specific prosody to deal with talker-variability. Brain Research, 1769: 147605. doi:10.1016/j.brainres.2021.147605.
  • Bosker, H. R., & Cooke, M. (2020). Enhanced amplitude modulations contribute to the Lombard intelligibility benefit: Evidence from the Nijmegen Corpus of Lombard Speech. The Journal of the Acoustical Society of America, 147: 721. doi:10.1121/10.0000646.
  • Bosker, H. R., Peeters, D., & Holler, J. (2020). How visual cues to speech rate influence speech perception. Quarterly Journal of Experimental Psychology, 73(10), 1523-1536. doi:10.1177/1747021820914564.
Key grants and prizes
  • September 2022 - September 2027:
    ERC Starting grant: How Hands Help Us Hear (HearingHands, #101040276).
  • February 2020 - February 2024:
    Donders internal PhD round: Scrutinizing Speech Recognition through the Lens of Word Learning.
    Co-applicant with James McQueen; hired PhD student: Giulio Severijnen.
  • January 2020:
    Recognized as a Rising Star by the Association for Psychological Science (APS), USA.

Research Group
Speech Perception in Audiovisual Communication

Principal Investigator
Dr. Hans Rutger Bosker

Group members

Postdocs
Matteo Maran
Patrick Rohrer

PhD students
Giulio Severijnen
Ronny Bujok
Orhun Ulusahin
Chengjia (Jason) Ye

Interns
Ivy Mok
Yitian Hong