How hands help us hear
1 September 2022 until 30 September 2027
Project member(s)
Dr Bosker, H.R. (Hans Rutger); Dr Rohrer, P.L. (Patrick); Maran, M. (Matteo); Bujok, R. (Ronny); Hong, Y. (Yitian); Cos, F. (Floris)
Human communication in face-to-face conversations is inherently multimodal, combining spoken language with a wealth of visual cues, including hand gestures. Although most of our understanding of human language comes from unimodal research, the multimodal literature suggests that hand gestures are produced in close synchrony with speech prosody, aligning, for instance, with stressed syllables in free-stress languages like English. Furthermore, prosody plays a vital role in spoken word recognition in many languages, influencing core cognitive processes involved in speech perception, such as lexical activation, segmentation, and recognition. Consequently, viewing gestural timing as an audiovisual prosody cue raises the possibility that the temporal alignment of hand gestures to speech directly influences what we hear (e.g., distinguishing OBject from obJECT). However, research to date has largely overlooked the functional contribution of gestural timing to low-level speech perception.

Therefore, HearingHands aims to uncover how gesture-speech coupling contributes to audiovisual communication in human interaction. Its objectives are [WP1] to chart the PREVALENCE of gesture-speech coupling as a multimodal prominence cue in production and perception across a typologically diverse set of languages; [WP2] to capture the VARIABILITY in production and perception of gesture-speech coupling in both neurotypical and atypical populations; and [WP3] to determine the CONSTRAINTS that govern gestural timing effects in more naturalistic communicative settings. These objectives will be achieved through cross-linguistic comparisons of gesture-speech production and perception using motion tracking, tests of multimodal integration in autistic and neurotypical individuals, and psychoacoustic tests of gestural timing effects employing eye tracking and virtual reality. Thus, HearingHands will contribute to our understanding of multimodal human communication, delineating how hands help us hear.

Project website for research output and demos

Contact information

Hans Rutger Bosker is the principal investigator.