Sonja Kotz
  • Maastricht University
    Faculty of Psychology & Neuroscience
    Dept. of Neuropsychology & Psychopharmacology
    P.O. Box 616
    6200 MD Maastricht
    The Netherlands
  • +31 43 3881653
In stress-timed languages, the alternation of stressed and unstressed syllables (or 'meter') is an important formal and temporal cue to guide speech processing. Previous electroencephalography studies have shown that metric violations result in an early negative event-related potential. It is unclear whether this 'metric' negativity is an N400 elicited by misplaced stress or whether it responds to error detection. The aim of this study was to investigate the nature of the 'metric' negativity as a function of rule-based, predictive sequencing. Our results show that the negativity occurs independently of the lexical-semantic content. We therefore suggest that the metric negativity reflects a rule-based sequencing mechanism.
Event-related potential (ERP) data in French and German have shown that metric violations (i.e. incorrectly stressed words) in a sentence elicit a P600. Furthermore, French speakers find it difficult to discriminate stimuli that vary in word stress position and have been labelled as "stress deaf." In the current study we investigated (i) whether French late learners of German can perceive deviations from a regular strong-weak stress pattern (trochee) in German sentences, and (ii) whether the same subjects differ in their electrophysiological response from German monolinguals in a non-linguistic "subjective rhythmization" paradigm. Irrespective of their native language, both groups show similar results in the latter paradigm, in which isochronous stimulus trains are subjectively converted into a binary strong-weak grouped percept (trochee). However, we report differences between native and non-native speakers of German in the sentence paradigm. In contrast to German native speakers, French late learners of German fail to show a P600 component in response to deviations from a regular trochaic stress pattern, although attention was directed to the metric pattern of the sentences. The current data suggest that French stress deafness selectively affects the perception of a strong-weak pattern in sentences, while strong-weak grouping of non-linguistic sequences is not language specific. The results imply that linguistic and non-linguistic grouping do not rely on the same neural mechanisms.
Cognitive control enables successful goal-directed behavior by resolving a conflict between opposing action tendencies, while emotional control arises as a consequence of emotional conflict processing such as in irony. While negative emotion facilitates both cognitive and emotional conflict processing, it is unclear how emotional conflict processing is affected by positive emotion (e.g., humor). In 2 EEG experiments, we investigated the role of positive audiovisual target stimuli in cognitive and emotional conflict processing. Participants categorized either spoken vowels (cognitive task) or their emotional valence (emotional task) and ignored the visual stimulus dimension. Behaviorally, a positive target showed no influence on cognitive conflict processing, but impeded emotional conflict processing. In the emotional task, response time conflict costs were higher for positive than for neutral targets. In the EEG, we observed an interaction of emotion by congruence in the P200 and N200 ERP components in emotional but not in cognitive conflict processing. In the emotional conflict task, the P200 and N200 conflict effect was larger for emotional than neutral targets. Thus, our results show that emotion affects conflict processing differently as a function of conflict type and emotional valence. This suggests that there are conflict- and valence-specific mechanisms modulating executive control.
The current efMRI experiment investigated the potential right hemisphere dominance of emotional prosodic processing under implicit task demands. Participants evaluated the relative tonal height (high, medium, low) of intelligible and unintelligible sentences spoken by a trained female speaker of German with three prosodic contours: happy, angry, and neutral. The results confirm the activation of a bilateral fronto-striato-temporal network with
Unexpectedly occurring task-irrelevant stimuli have been shown to impair performance. They capture attention away from the main task leaving fewer resources for target processing. However, the actual distraction effect depends on various variables; for example, only target-informative distractors have been shown to cause costs of attentional orienting. Furthermore, recent studies have shown that high arousing emotional distractors, as compared with low arousing neutral distractors, can improve performance by increasing alertness. We aimed to separate costs of attentional orienting and benefits of arousal by presenting negative and neutral environmental sounds (novels) as oddballs in an auditory-visual distraction paradigm. Participants categorized pictures while task-irrelevant sounds preceded visual targets in two conditions: (a) informative sounds reliably signaled onset and occurrence of visual targets, and (b) noninformative sounds occurred unrelated to visual targets. Results c...
Emotional prosody carries information about the inner state of a speaker and therefore helps us to understand how other people feel. However, emotions are also transferred verbally. In order to further substantiate the underlying mechanisms of emotional prosodic processing we investigated the interaction of both emotional prosody and emotional semantics with event-related brain potentials (ERPs) utilizing a
L2 syntactic processing has been primarily investigated in the context of syntactic anomaly detection, but only sparsely with syntactic ambiguity. In the field of event-related potentials (ERPs) syntactic anomaly detection and syntactic ambiguity resolution is linked to the P600. The current ERP experiment examined L2 syntactic processing in highly proficient L1 Spanish-L2 English readers who had acquired English informally around
This study considered a relation between rhythm perception skills and individual differences in phonological awareness and grammar abilities, which are two language skills crucial for academic achievement. Twenty-five typically developing 6-year-old children were given standardized assessments of rhythm perception, phonological awareness, morpho-syntactic competence, and non-verbal cognitive ability. Rhythm perception accounted for 48% of the variance in morpho-syntactic competence after controlling for non-verbal IQ, socioeconomic status, and prior musical activities. Children with higher phonological awareness scores were better able to discriminate complex rhythms than children with lower scores, but not after controlling for IQ. This study is the first to show a relation between rhythm perception skills and morpho-syntactic production in children with typical language development. These findings extend the literature showing substantial overlap of neurocognitive resources for processing music and language. A video abstract of this article can be viewed at: http://youtu.be/_lO692qHDNg.
In a previous cross-modal priming study [A. Schirmer, A.S. Kotz, A.D. Friederici, Sex differentiates the role of emotional prosody during word processing, Cogn. Brain Res. 14 (2002) 228-233.], we found that women integrated emotional prosody and word valence earlier than men. Both sexes showed a smaller N400 in the event-related potential to emotional words when these words were preceded by a sentence with congruous compared to incongruous emotional prosody. However, women showed this effect with a 200-ms interval between prime sentence and target word whereas men showed the effect with a 750-ms interval. The present study was designed to determine whether these sex differences prevail when attention is directed towards the emotional content of prosody and word meaning. To this end, we presented the same prime sentences and target words as in our previous study. Sentences were spoken with happy or sad prosody and followed by a congruous or incongruous emotional word or pseudoword. T...
The orbitofrontal cortex (OFC) is functionally linked to a variety of cognitive and emotional functions. In particular, lesions of the human OFC lead to large-scale changes in social and emotional behavior. For example, patients with OFC lesions are reported to suffer from deficits in affective decision-making, including impaired emotional face and voice expression recognition (e.g., Hornak et al., 1996, 2003). However, previous studies have failed to acknowledge that emotional processing is a multistage process. Thus, different stages of emotional processing (e.g., early vs. late) in the same patient group could be affected in a qualitatively different manner. The present study investigated this possibility and tested implicit emotional speech processing in an ERP experiment followed by an explicit behavioral emotional recognition task. OFC patients listened to vocal emotional expressions of anger, fear, disgust, and happiness compared to a neutral baseline spoken either with or without lexical content. In line with previous evidence (Paulmann & Kotz, 2008b), both patients and healthy controls differentiate emotional and neutral prosody within 200 ms (P200). However, the recognition of emotional vocal expressions is impaired in OFC patients as compared to healthy controls. The current data serve as first evidence that, in OFC patients, emotional prosody processing is impaired only at a late, and not at an early, processing stage.
Decoding verbal and nonverbal emotional expressions is an important part of speech communication. Although various studies have tried to specify the brain regions that underlie different emotions conveyed in speech, few studies have aimed to specify the time course of emotional speech decoding. We used event-related potentials to determine when emotional speech is first differentiated from neutral speech. Participants engaged in an implicit emotional processing task (probe verification) while listening to emotional sentences spoken by a female and a male speaker. Independent of speaker voice, emotional sentences could be differentiated from neutral sentences as early as 200 ms after sentence onset (P200), suggesting rapid emotional decoding.
The present study investigated the automaticity of morphosyntactic processes and processes of syntactic structure building using event-related brain potentials. Two experiments were conducted, which contrasted the impact of local subject-verb agreement violations (Experiment 1) and word category violations (Experiment 2) on the mismatch negativity, an early event-related brain potential component reflecting automatic auditory change detection. The two violation types were realized in two-word utterances comparable with regard to acoustic parameters and structural complexity. The grammaticality of the utterances modulated the mismatch negativity response in both experiments, suggesting that both types of syntactic violations were detected automatically within 200 msec after the violation point. However, the topographical distribution of the grammaticality effect varied as a function of violation type, which indicates that the brain mechanisms underlying the processing of subject-verb agreement and word category information may be functionally distinct even at this earliest stage of syntactic analysis. The findings are discussed against the background of studies investigating syntax processing beyond the level of two-word utterances.