

Conference Paper

Probing Dynamic Human Facial Action Recognition From The Other Side Of The Mean

MPS-Authors
Curio, C
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Project group: Cognitive Engineering, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Breidt, M
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Project group: Cognitive Engineering, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Kleiner, M
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Project group: Cognitive Engineering, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Curio, C., Giese, M., Breidt, M., Kleiner, M., & Bülthoff, H. (2008). Probing Dynamic Human Facial Action Recognition From The Other Side Of The Mean. In S. Creem-Regehr, & K. Myszkowski (Eds.), APGV '08: Proceedings of the 5th symposium on Applied perception in graphics and visualization (pp. 59-66). New York, NY, USA: ACM Press.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-C7CF-E
Abstract
Insights from the human perception of moving faces can inform both technical animation systems and our understanding of the neural encoding of facial expressions in the brain. We present a psychophysical experiment that explores high-level after-effects for dynamic facial expressions. Specifically, we address to what extent such after-effects reflect adaptation in neural representations of static versus dynamic features of faces. High-level after-effects have been reported for the recognition of static faces [Webster and Maclin 1999; Leopold et al. 2001] and for the perception of point-light walkers [Jordan et al. 2006; Troje et al. 2006]. These after-effects were reflected in shifts of the category boundaries between different facial expressions and between male and female walks. We report a new after-effect in humans observing dynamic facial expressions generated by a highly controllable dynamic morphable face model. As a key element of our experiment, we created dynamic 'anti-expressions' in analogy to static 'anti-faces' [Leopold et al. 2001]. We tested the influence of dynamics and identity on expression-specific recognition performance after adaptation to anti-expressions. In addition, a quantitative analysis of the optic flow patterns corresponding to the adaptation and test expressions rules out that the observed changes reflect a simple low-level motion after-effect. Since we found no evidence for a critical role of the temporal order of the stimulus frames, we conclude that after-effects in dynamic faces might be dominated by adaptation to the form information in individual stimulus frames.
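The notion of an 'anti-expression' on "the other side of the mean" can be illustrated with a minimal sketch. Assuming a linear morphable face model in which an expression is a vector of morph weights deviating from a mean face (the paper's actual model and parameters are not specified here; the function name, weight vector, and values below are hypothetical), the anti-expression is the reflection of that deviation about the mean, in analogy to the static anti-faces of Leopold et al. [2001]:

```python
import numpy as np

def anti_expression(expression, mean_face):
    """Reflect an expression about the mean face.

    In a linear morph space, the anti-expression is
    mean - (expression - mean) = 2 * mean - expression,
    i.e. the same deviation placed on the other side of the mean.
    """
    return 2.0 * mean_face - expression

# Toy example with three morph weights (illustrative values only).
mean = np.array([0.5, 0.5, 0.5])
smile = np.array([0.9, 0.3, 0.6])       # hypothetical expression weights
anti_smile = anti_expression(smile, mean)
# The anti-expression negates each deviation from the mean:
# 2 * 0.5 - 0.9 = 0.1, 2 * 0.5 - 0.3 = 0.7, 2 * 0.5 - 0.6 = 0.4
```

Applied frame by frame to a dynamic expression, this construction yields the dynamic anti-expressions used as adaptation stimuli.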