  Dynamic facial expressions prime the processing of emotional prosody

Garrido-Vásquez, P., Pell, M. D., Paulmann, S., & Kotz, S. A. (2018). Dynamic facial expressions prime the processing of emotional prosody. Frontiers in Human Neuroscience, 12: 244. doi:10.3389/fnhum.2018.00244.

Basic data

Genre: Journal article

Files

Garrido-Vasquez_2018.pdf (publisher version), 2MB
Name: Garrido-Vasquez_2018.pdf
Description: -
OA status: -
Visibility: Public
MIME type / checksum: application/pdf / [MD5]
Technical metadata:
Copyright date: -
Copyright info: -
License: -

External references

Creators

Creators:
Garrido-Vásquez, Patricia 1, 2, Author
Pell, Marc D. 3, Author
Paulmann, Silke 4, Author
Kotz, Sonja A. 2, 5, Author
Affiliations:
1 Department of Experimental Psychology and Cognitive Science, Justus Liebig University Giessen, Germany, ou_persistent22
2 Department Neuropsychology, MPI for Human Cognitive and Brain Sciences, Max Planck Society, Leipzig, DE, ou_634551
3 School of Communication Sciences and Disorders, McGill University, Montréal, QC, Canada, ou_persistent22
4 Department of Psychology, University of Essex, Colchester, United Kingdom, ou_persistent22
5 Department of Neuropsychology and Psychopharmacology, University of Maastricht, the Netherlands, ou_persistent22

Content

Keywords: Audiovisual; Cross-modal prediction; Dynamic faces; Emotion; Event-related potentials; Parahippocampal gyrus; Priming; Prosody
Abstract: Evidence suggests that emotion is represented supramodally in the human brain. Emotional facial expressions, which often precede vocally expressed emotion in real life, can modulate event-related potentials (N100 and P200) during emotional prosody processing. To investigate these cross-modal emotional interactions, two lines of research have been put forward: cross-modal integration and cross-modal priming. In cross-modal integration studies, visual and auditory channels are temporally aligned, while in priming studies they are presented consecutively. Here we used cross-modal emotional priming to study the interaction of dynamic visual and auditory emotional information. Specifically, we presented dynamic facial expressions (angry, happy, neutral) as primes and emotionally-intoned pseudo-speech sentences (angry, happy) as targets. We were interested in how prime-target congruency would affect early auditory event-related potentials, i.e., N100 and P200, in order to shed more light on how dynamic facial information is used in cross-modal emotional prediction. Results showed enhanced N100 amplitudes for incongruently primed compared to congruently and neutrally primed emotional prosody, while the latter two conditions did not significantly differ. However, N100 peak latency was significantly delayed in the neutral condition compared to the other two conditions. Source reconstruction revealed that the right parahippocampal gyrus was activated in incongruent compared to congruent trials in the N100 time window. No significant ERP effects were observed in the P200 range. Our results indicate that dynamic facial expressions influence vocal emotion processing at an early point in time, and that an emotional mismatch between a facial expression and its ensuing vocal emotional signal induces additional processing costs in the brain, potentially because the cross-modal emotional prediction mechanism is violated in case of emotional prime-target incongruency.

Details

Language(s): eng - English
Date: 2018-03-07 / 2018-05-28 / 2018-06-12
Publication status: Published online
Pages: -
Place, publisher, edition: -
Table of contents: -
Review method: Peer review
Identifiers: DOI: 10.3389/fnhum.2018.00244
PMID: 29946247
PMC: PMC6007283
Other: eCollection 2018
Degree type: -

Event

Decision

Project information

Project name: -
Grant ID: MOP62867
Funding program: -
Funding organization: Canadian Institutes of Health Research (CIHR)

Source 1

Title: Frontiers in Human Neuroscience
Short title: Front Hum Neurosci
Source genre: Journal
Creators: -
Affiliations: -
Place, publisher, edition: Lausanne, Switzerland : Frontiers Research Foundation
Pages: -
Volume / issue: 12
Article number: 244
Start / end page: -
Identifier: ISSN: 1662-5161
CoNE: https://pure.mpg.de/cone/journals/resource/1662-5161