Master's Programme in Cognitive Neuroscience


The effects of depression on the semantic processing of emotional language: an ERP study

John van Gortel, dr. Constance Vissers, and dr. Dorothee Chwilla

Nowadays, more and more evidence is accumulating in favor of an embodied perspective on language comprehension. Within this perspective, the brain, the body, and the environment interact to make sense of the world. Research has shown interactions between the brain and the body in various cognitive tasks, including top-down influences of perception, action, mood, and emotion on language comprehension. However, few researchers have addressed the possible influence of mood disorders on language comprehension. We studied the semantic processing of emotional language material in patients suffering from clinical depression compared with nondepressed controls. To this end, we recorded event-related potentials (ERPs) while participants attentively read scenarios containing affectively positive, negative, or neutral words embedded in nonconstraining neutral contexts, after which they were required to respond to a positively or negatively valenced statement. In contrast with other studies, the present results suggest differences in the processing of emotionally affective language between depressed and nondepressed individuals, as reflected by differences in the N400 window. For affectively neutral language, no differences in N400 were found. In addition, the preliminary ERP results suggest an increase in N400 for positive words compared to negative and neutral words in the patients. This would point to a ‘negativity bias’, an automatic preference for negative information, as has been reported before in depressed patients. Behavioral data suggest a negativity bias for nondepressed controls, but not for patients. Despite some caveats, the present results are consistent with the embodied perspective on language processing.
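For readers unfamiliar with the measure, the sketch below illustrates how an N400 effect of the kind described above is commonly quantified: the mean amplitude in a post-word time window (here an assumed 300–500 ms) over centro-parietal channels, compared between word-valence conditions. The simulated epoch array, the time window, and the channel selection are illustrative assumptions, not details of the present study.

```python
# Minimal sketch (not the authors' pipeline): quantify an N400 effect as the
# mean amplitude in an assumed 300-500 ms window at centro-parietal channels,
# then compare two word-valence conditions. All data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

sfreq = 500                                   # assumed sampling rate (Hz)
times = np.arange(0, 1.0, 1 / sfreq)          # epoch from 0 to 1000 ms
n400_mask = (times >= 0.3) & (times <= 0.5)   # assumed N400 window
centro_parietal = [10, 11, 12]                # assumed channel indices

def n400_amplitude(epochs):
    """Mean amplitude per trial in the N400 window over selected channels."""
    roi = epochs[:, centro_parietal, :][:, :, n400_mask]
    return roi.mean(axis=(1, 2))

# Simulated single-trial epochs: (n_trials, n_channels, n_samples).
positive = rng.normal(-2.0, 1.0, size=(40, 32, times.size))
neutral = rng.normal(-1.0, 1.0, size=(40, 32, times.size))

amp_pos = n400_amplitude(positive)
amp_neu = n400_amplitude(neutral)

# A more negative mean amplitude for positive words would be read as a larger N400.
t, p = stats.ttest_ind(amp_pos, amp_neu)
print(f"positive: {amp_pos.mean():.2f}, neutral: {amp_neu.mean():.2f}, "
      f"t = {t:.2f}, p = {p:.3f}")
```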

Decoding Tactile Properties of Visually Presented Objects

Claudia S. Lüttke, Marcel van Gerven, Floris P. de Lange

Multimodal perception (i.e., perceiving the world around us with multiple sensory systems simultaneously) requires the integration of information from the different senses responsible for visual, tactile (touch), or auditory perception. It is unclear whether visual perception of an object can automatically activate a representation in a different sensory modality (e.g., the tactile modality) and thereby potentially facilitate multimodal integration. In this functional magnetic resonance imaging (fMRI) study we visually presented objects that differed in the tactile property of texture (smooth vs. rough) and used multivoxel pattern analysis (MVPA) to categorize these stimuli according to their roughness. We further asked whether multimodal representations are triggered automatically or only when task-relevant. To address this issue, subjects in different blocks made judgments either about the tactile roughness of the stimuli (tactile task) or about their semantic category (semantic task). Classification performance was above chance in two regions of interest (ROIs): first, the collateral sulcus, which is sensitive to texture, and second, the lateral occipital cortex, which is involved in visual and haptic shape processing. We did not observe a modulation of classification accuracy as a function of task. Furthermore, we could not classify texture in the primary and secondary somatosensory cortices. We conclude that a visual object can activate representations of its tactile properties in higher-order areas in the occipital lobe, but not in somatosensory areas. Since visual and tactile texture perception may converge in medial occipital cortex, this region might facilitate multimodal integration of vision and touch.
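For illustration, a minimal MVPA sketch along the lines described above: a linear classifier is trained on ROI voxel patterns to separate smooth from rough trials, with leave-one-run-out cross-validation, and above-chance accuracy is taken as evidence that the ROI carries texture information. The simulated data, the run structure, and the use of scikit-learn's linear SVC are assumptions, not the study's actual pipeline.

```python
# Minimal sketch (not the authors' pipeline): decode tactile roughness
# (smooth vs. rough) from ROI voxel patterns with a linear classifier and
# leave-one-run-out cross-validation. All data are simulated.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

n_runs, n_trials_per_run, n_voxels = 8, 20, 300              # assumed design
X = rng.normal(size=(n_runs * n_trials_per_run, n_voxels))   # ROI patterns
y = np.tile([0, 1], n_runs * n_trials_per_run // 2)          # 0 = smooth, 1 = rough
runs = np.repeat(np.arange(n_runs), n_trials_per_run)        # run labels for CV

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, groups=runs, cv=LeaveOneGroupOut())

# Mean accuracy reliably above chance (0.5) would indicate that the ROI
# carries information about the tactile texture of the visually presented objects.
print(f"mean decoding accuracy: {scores.mean():.2f}")
```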

The interplay between emotion and attention on the processing of syntactic anomalies: evidence from P600

Martine W.F.T. Verhees, Dorothee J. Chwilla

Little is known about the relationship between language and emotion. Vissers and colleagues (2010) investigated the effects of mood on the processing of syntactic violations, as indexed by the P600. An interaction was observed between mood and syntactic correctness, for which three explanations were offered: one in terms of syntactic processing, one in terms of heuristic processing, and one in terms of more general factors like attention. The aim of the present article was to further determine the locus of the effects of emotional state on language comprehension by investigating whether, and if so how, more general factors like attention contribute to the mood-related modulation of the P600 effect to syntactic anomalies. To this end, we recorded ERPs and compared the P600 effect to subject-verb agreement errors relative to correct sentences; mood was manipulated by presenting happy or sad film clips. Attention was directed by a task manipulation: participants had to focus either on the syntactic features of the sentences or on their physical features. The main findings were as follows. First, we effectively induced the intended mood: after watching happy film clips participants were happier, and after watching sad film clips participants were sadder, compared to baseline. Second, for the P600, interactions between emotional state, task, and correctness were obtained, reflecting that mood and attention both modulated the P600 correctness effect. The mood manipulation led to a reduction of the P600 effect in the sad mood condition compared with the happy mood condition in the syntactic task. The task manipulation led to a reduction of the P600 effect in the physical task compared with the syntactic task in the happy mood condition. The present data show that attention can contribute to the mood-related modulation of the P600 effect to syntactic anomalies. Future studies will have to take into account that attention can play a role in the emotion-related modulation of language processing.
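For illustration, a minimal sketch of how a mood × task × correctness pattern like the one reported above could be tested: a within-subject ANOVA on mean P600 amplitudes per condition cell. The simulated data, the factor coding, and the (assumed) 500–800 ms measurement window are illustrative only and do not reflect the study's measurements.

```python
# Minimal sketch (not the authors' analysis): within-subject ANOVA on mean
# P600 amplitudes, with mood, task, and correctness as within-subject factors.
# One simulated mean amplitude per subject and condition cell.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)

rows = []
for subj in [f"s{i:02d}" for i in range(24)]:
    for mood in ("happy", "sad"):
        for task in ("syntactic", "physical"):
            for correctness in ("correct", "incorrect"):
                # Assumed mean amplitude from a 500-800 ms window (simulated).
                amp = rng.normal(2.0 if correctness == "incorrect" else 0.0, 1.0)
                rows.append(dict(subject=subj, mood=mood, task=task,
                                 correctness=correctness, p600=amp))

df = pd.DataFrame(rows)

# A mood x task x correctness interaction would mirror the reported finding
# that both mood and attention modulate the P600 correctness effect.
res = AnovaRM(df, depvar="p600", subject="subject",
              within=["mood", "task", "correctness"]).fit()
print(res.anova_table)
```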