
Released

Poster

Investigating factors influencing the perception of identity from facial motion

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83890

Dobs, K
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83840

Bülthoff, I
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83871

Curio, C
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84201

Schultz, J
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Dobs, K., Bülthoff, I., Curio, C., & Schultz, J. (2012). Investigating factors influencing the perception of identity from facial motion. Poster presented at 12th Annual Meeting of the Vision Sciences Society (VSS 2012), Naples, FL, USA.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-B694-A
Abstract
Previous research has shown that facial motion can convey information about identity in addition to facial form (e.g. Hill & Johnston, 2001). The present study aims to determine whether identity judgments vary depending on the kind of facial movements and on the task performed. To this end, we used a recent facial motion capture and animation system (Curio et al., 2006). We recorded different actors performing classic emotional facial movements (e.g. happy, sad) and non-emotional facial movements that occur in social interactions (e.g. greeting, farewell). Only the non-rigid components of these facial movements were used to animate a single avatar head. In a between-subject design, four groups of participants performed identity judgments based on emotional or social facial movements in either a same-different (SD) task or a delayed matching-to-sample (XAB) task. In the SD task, participants watched two distinct facial movements (e.g. happy and sad) and judged whether the same actor or different actors had performed them. In the XAB task, participants saw one target facial movement X (e.g. happy) performed by one actor, followed by two facial movements of another kind (e.g. sad) performed by two actors; they then chose which of the latter had been performed by the same actor as X. Prior to the experiment, participants were familiarized with the actors by watching them perform facial movements that were not subsequently tested. Participants judged actor identity correctly in all conditions except the SD task with emotional stimuli. Sensitivity to identity, as measured by d-prime, was higher in the XAB task than in the SD task, and performance was higher for social than for emotional stimuli. Our findings reveal an effect of task on identity judgments based on facial motion and suggest that such judgments are easier when facial movements are less stereotypical.
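The abstract reports sensitivity as d-prime, the standard signal-detection measure. The sketch below shows the textbook computation of d' from raw response counts; it is not the authors' analysis code, and the trial counts in the example are purely illustrative. The log-linear correction (adding 0.5 to each cell) is one common convention for handling hit or false-alarm rates of exactly 0 or 1.

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate).

    Uses a log-linear correction (0.5 added to each cell) so that
    extreme rates of 0 or 1 do not yield an infinite z-transform.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical example: 40 "same-actor" and 40 "different-actor" trials
print(d_prime(32, 8, 12, 28))
```

A d' of 0 corresponds to chance-level discrimination (hit rate equal to false-alarm rate), so the reported pattern (higher d' in the XAB task than in the SD task) means participants separated same-actor from different-actor trials more reliably in the matching task.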