
Released

Poster

EEG brain dynamics during processing of static and dynamic facial emotional expression

MPG Authors
/persons/resource/persons84005

Müller, V., Kaulard, K.
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons84298

Wallraven,  C
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External resources
There are no external resources on record
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
There are no freely accessible full texts available in PuRe
Supplementary material (freely accessible)
There is no freely accessible supplementary material available
Citation

Perdikis, D., Müller, V., Kaulard, K., Wallraven, C., & Lindenberg, U. (2012). EEG brain dynamics during processing of static and dynamic facial emotional expression. Poster presented at 1st Conference of the European Society for Cognitive and Affective Neuroscience (ESCAN 2012), Marseille, France.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-B79E-E
Abstract
Humans recognize facial emotional expressions (FEEs) better when FEEs are presented dynamically than through static images. Wallraven et al. (2008) propose that humans are sensitive to the natural dynamics of FEEs. Moreover, PET/fMRI studies suggest that distinct brain networks process static and dynamic FEEs. In most cases, however, the dynamic FEEs have been created from static ones using linear morphing techniques. Together with the low temporal resolution of PET/fMRI, such studies therefore fail to capture how the subtle (and highly nonlinear) dynamics of FEEs modulate the activated brain networks. Our ongoing study investigates EEG responses to static and dynamic FEEs drawn from an ecologically valid database (Kaulard et al., 2008, 2009). “Happy” and “angry” FEEs performed by two male and two female actors are displayed to twenty female participants in an “oddball” experimental paradigm. Blocks of either dynamic or static stimuli that differ in their emotional content (“happy” versus “angry” and vice versa) are presented in pseudorandom order. The task is to press a keyboard button upon appearance of a deviant stimulus. Data analysis focuses on synchrony and nonlinear coupling of sensor as well as source dynamics (as a bridge to PET/fMRI studies), both in the time-frequency and in the phase-space domain, in order to identify the brain networks that emerge and evolve dynamically in each condition. Preliminary results from pilot data analysis confirm the PET/fMRI findings of enhanced and differentiated brain activation for dynamic FEEs compared to static ones.
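The synchrony analysis described in the abstract is commonly implemented with measures such as the phase-locking value (PLV) between band-passed channel pairs. The sketch below is an illustrative assumption, not the authors' actual pipeline (which is not published in this record); it computes the PLV from instantaneous phases obtained via the Hilbert transform.

```python
# Illustrative sketch only -- not the poster authors' analysis code.
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV between two band-pass-filtered signals of equal length.

    Returns a value in [0, 1]; 1 indicates a perfectly constant
    phase difference (full phase synchrony) across time.
    """
    phase_x = np.angle(hilbert(x))          # instantaneous phase of x
    phase_y = np.angle(hilbert(y))          # instantaneous phase of y
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Synthetic demo: a 10 Hz signal and a phase-shifted, noisy copy.
fs = 250                                    # hypothetical sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
a = np.sin(2 * np.pi * 10 * t)
b = np.sin(2 * np.pi * 10 * t + 0.5) + 0.1 * np.random.randn(t.size)
plv = phase_locking_value(a, b)             # high for phase-locked signals
```

In a real EEG pipeline the PLV would be computed per frequency band and per sensor (or source) pair, yielding the connectivity matrices from which condition-specific networks can be derived.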