When you are trying to understand someone, it helps if you can see them: a speaker's gestures help you perceive their words and thus benefit speech comprehension. But how strong is the influence of gestures compared with other information sources in speech, such as the sound itself or a sentence's grammatical structure? And does this influence grow when other information becomes less reliable, for example because of a language learner's grammar errors? This project investigates how what you see, hear, and know contributes to speech comprehension, and how these contributions may change depending on your conversation partner.


Cue integration in multimodal L1 and L2 speech
Creating a naturalistic speech comprehension model
- Duration: October 2024 – September 2028
- Project member(s): F. Cos (Floris), Dr L.J. van Maastricht (Lieke), Dr E. Janse (Esther), Dr H.R. Bosker (Hans Rutger), Dr M. Maran (Matteo), Clara Martin (Basque Center on Cognition, Brain and Language)
- Project type: Research