How we use both gestures and speech when communicating with each other

Prof. Asli Özyürek, Professor in Linguistics

A specific group of CLS researchers working on language processing focuses on the question of how people use both their body and their speech when communicating with other people. Professor Asli Özyürek, “We’re specifically interested in the hands, and in the kinds of meaning people convey with their hands while speaking to and interacting with other people. We use our hands in meaningful ways in all kinds of communicative settings, for example when explaining a recipe to a friend or when teaching a language or a maths equation to children. We aim to understand, first, how speakers produce information with both their hands and their speech and, second, how listeners understand that information.”

Noisy situations
One of the group’s recent successes involves a study on speech in noisy situations. The researchers asked people to look at and listen to someone saying “drinking” while that person simultaneously made a drinking gesture. The speech was produced in either a clear or a noisy situation. During the task, the researchers studied the participants’ brain activation. Özyürek, “We found that people used multiple areas of their brain to process the combination of speech and gesture. This is a highly relevant finding, as many of these areas have long been considered irrelevant to language comprehension. When listeners use gestures to disambiguate speech in noisy situations, even more brain areas are activated.” Most notably, the study also showed an overlap of activated brain areas when people were looking at gestures and when they were listening to speech. “This was the first study to show that gestures are not dissociated from our language system,” says Özyürek.

Language development
The researchers also work with deaf people who use only their body to produce language. Özyürek, “In sign language, there is no spoken modality. People rely on their hands, face and body to communicate.” The researchers aim to uncover the gestural origins of sign language, for example by comparing the signs of Turkish deaf children learning sign language from birth with the signs of deaf children who only start learning sign language at a later age, when they go to school. In addition, they compare the language development of children learning sign language with the language development of speaking children. “For speaking children, concepts like left and right are really hard to understand because the verbal labels are completely arbitrary,” says Özyürek. “In various sign languages, children tap their left arm to refer to the concept of left and they tap their right arm to refer to the concept of right. Our research shows that signing children learn these concepts earlier than speaking children. Apparently, the iconic signs for left and right facilitate children’s understanding of the concepts.”

Broad implications
Research conducted by this group of CLS researchers has broad implications. In the Netherlands, for example, deaf children are typically discouraged from signing; they are encouraged to attend regular schools instead and are, as a consequence, not exposed to sign language. Özyürek, “A widespread misunderstanding is that looking at signs would distract children from speech and thus hamper their development and processing of spoken language. Our research shows that the exact opposite is true: learning sign language facilitates learning spoken language, just as much as learning spoken language facilitates learning sign language. Being bilingual in a signed and a spoken language is good for a child’s development, just as being bilingual in two different spoken languages is. Deaf children therefore benefit most from being exposed to spoken language as well as to sign language.”