
Released

Poster

Dynamic faces: fMRI reveals timeline specific responses to facial expression changes

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84165

Reinl,  M
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83797

Bartels,  A
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Locator
There are no locators available
Fulltext (public)
There are no public fulltexts available
Supplementary Material (public)
There is no public supplementary material available
Citation

Reinl, M., & Bartels, A. (2011). Dynamic faces: fMRI reveals timeline specific responses to facial expression changes. Poster presented at 41st Annual Meeting of the Society for Neuroscience (Neuroscience 2011), Washington, DC, USA.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-B932-A
Abstract
In everyday life we are usually exposed to dynamically changing faces rather than to their static snapshots. Despite this, the vast majority of neurophysiology and neuroimaging studies have examined responses to static pictures of faces. Therefore, little is known about the extent to which neural circuitries exist that are specialized to process dynamic aspects of faces, or whether dynamic faces are processed in the same way as static ones. One difficulty in answering this question lies in finding appropriate control stimuli matched between static and dynamic conditions, as the latter tend to elicit overall more neural activity. In the present study we circumvented this problem by testing for neural responses that are sensitive to the timeline of facial expression changes. We used a 2x2 factorial design, showing different types of video recordings of facial expressions in an event-related fMRI study. We varied the emotional content (factor "emotion") by using emotional expressions that either increased or decreased in the intensity of fear. Additionally, both types of movies were played either forward in their original timeline (real sequence) or backward (artificial sequence), defining the second factor "time". Our aim was to identify brain areas that react specifically to the presented frame sequence (time effect: real vs. artificial) or to differences in the displayed emotion direction (emotion effect: increase vs. decrease). Time-sensitive responses were found in the superior temporal sulcus (STS), in the occipital face area (OFA) and in a prefrontal set of regions, but not in the fusiform face area (FFA), with generally higher responses to real as opposed to artificial timelines. Emotion-sensitive responses were identified in the STS (with larger responses to increasing fear), as well as in a part of the right inferior frontal gyrus belonging to the action observation network (with larger responses to decreasing fear).
Thus, dynamic face stimuli elicited timeline-specific activity in particular parts of the classic face-processing network (STS and OFA, but not FFA), as well as in higher-level cognitive regions that most likely interpret the meaning of time-dependent information. Our results therefore provide evidence for mid- as well as high-level time-sensitive detectors in human face processing.