
Multimodal Language and Cognition

Welcome to the web page of the Multimodal Language and Cognition Lab led by Aslı Özyürek. Our research aims to understand the complex architecture of our human language faculty and its role in communication and cognition.

To do so, we look beyond spoken or written language and investigate the ways in which language can be expressed through multiple modalities, such as the visual modality (e.g., gestures used by hearing communities and sign languages created by Deaf communities) and the auditory modality (speech), as well as the interactions between them.

Members of the research group, photo taken in 2019

Thus we use a multimodal approach as a window onto our language faculty: what it reveals about that faculty, how it is used in embodied and situated contexts, how it interacts with other domains of cognition (spatial cognition, action, memory, intention), and what its neural, cognitive, and social foundations are.

More information about our scientific mission and current specific projects can be found here.

For additional information about our group and research, see our page on the website of the Max Planck Institute for Psycholinguistics. Also check out our YouTube channel for updates about our research and activities in various signed and spoken languages.

New paper published: A Tale of Two Modalities: Sign and Speech Influence Each Other in Bimodal Bilinguals

A new paper by Francie Manhardt, Susanne Brouwer, and Aslı Özyürek has been published in Psychological Science. Curious? You can find the paper (open access) here or read the summary here.

About us - in NGT (Sign Language of the Netherlands)

This information is also available with Dutch subtitles.

More videos in NGT:

  • Sign-Gesture Interface in Second Language Learners
  • First Bimodal Bilingual School in Turkey