
Neural bases of gesture-speech integration in children


Kazuki Sekine


Child language acquisition is a multimodal phenomenon that takes place through the perception of gestures, eye gaze, posture, and speech. Hand gestures in particular are often used by caregivers to convey semantic information alongside speech. Furthermore, because gestures often carry information that the concurrent speech does not, they are an important medium through which children understand speakers' messages (McNeill, 1992). This leads us to ask how children integrate a speaker's gesture and speech.

In previous research we found that 3-year-old children can integrate gestures and words, and that from the age of six they can perform similar multimodal integration in spoken discourse. However, behavioural measures do not shed light on the neurological basis underlying gesture-speech integration.

This project therefore examines the neurocognitive processing of semantic information from gesture and speech in children and adults, using neuroimaging techniques such as EEG and fMRI. It will contribute to the field of multimodal communication development by providing neurobiological data on how gesture and speech are processed in children and adults.

More information about this project can be found on the project website.

Sofia in EEG-experiment

Roy van Veen made a video of his daughter Sofia's participation in one of the experiments. It shows the research procedure, the tasks, and how Sofia experienced them.


H2020
EU Research and Innovation programme Horizon 2020 -
Marie Skłodowska-Curie Individual Fellowship (2016 - 2018)
