
Record

Released

Book Chapter

Recognition of Dynamic Facial Action Probed by Visual Adaptation

MPG Authors
/persons/resource/persons83871

Curio, C
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Project group: Cognitive Engineering, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons83829

Breidt, M
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Project group: Cognitive Engineering, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons84016

Kleiner, M
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Project group: Cognitive Engineering, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons83839

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (open access)
No openly accessible full texts are available in PuRe.
Supplementary material (open access)
No openly accessible supplementary materials are available.
Citation

Curio, C., Giese, M., Breidt, M., Kleiner, M., & Bülthoff, H. (2010). Recognition of Dynamic Facial Action Probed by Visual Adaptation. In C. Curio, H. Bülthoff, & M. Giese (Eds.), Dynamic Faces: Insights from Experiments and Computation (pp. 47-65). Cambridge, MA, USA: MIT Press.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-BD42-8
Abstract
This chapter presents a psychophysical experiment in which 3D computer graphics methods were used to generate close-to-realistic facial expressions in order to examine how humans recognize dynamic facial expressions. The study shows that dynamic faces produce high-level aftereffects similar to those reported earlier for static faces. The findings indicate that these aftereffects, obtained consistently after adaptation to dynamic anti-expressions, are highly expression-specific. The chapter also highlights how computer graphics-generated expressions can be used to rule out low-level motion aftereffects. Dynamic face stimuli were created with a three-dimensional face model based on the Facial Action Coding System (FACS).
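To make the stimulus construction concrete, the sketch below illustrates one common way a FACS-style model can be realized: as a linear blendshape model whose action-unit activations are negated to obtain an anti-expression adaptor. This is a minimal illustration under that assumption only; the function names, action-unit indices, and weights are hypothetical and are not taken from the chapter.

```python
import numpy as np

def synthesize_face(neutral, au_deltas, weights):
    """Linear action-unit (FACS-style) blendshape model (illustrative).

    neutral:   (V, 3) neutral-face vertex positions
    au_deltas: (K, V, 3) per-action-unit displacement fields
    weights:   (K,) activation level of each action unit
    """
    weights = np.asarray(weights, dtype=float)
    # Sum the weighted displacement fields and add them to the neutral face.
    return neutral + np.tensordot(weights, au_deltas, axes=1)

def anti_expression_weights(weights, gain=1.0):
    """Illustrative anti-expression: reflect the action-unit activations
    about the neutral face (the origin of the blendshape weight space)."""
    return -gain * np.asarray(weights, dtype=float)

# Toy geometry as a stand-in for a scanned 3D head model.
rng = np.random.default_rng(0)
neutral = rng.normal(size=(500, 3))               # 500 vertices
au_deltas = 0.1 * rng.normal(size=(17, 500, 3))   # 17 hypothetical action units

happy = np.zeros(17)
happy[[5, 11]] = [0.8, 0.6]   # e.g. cheek raiser + lip-corner puller (illustrative)

# Adaptor: the anti-expression; test probe: a weak version of the expression.
adaptor = synthesize_face(neutral, au_deltas, anti_expression_weights(happy))
probe = synthesize_face(neutral, au_deltas, 0.3 * happy)
```

In such a setup, animating the weight vector over time yields a dynamic expression, and negating it yields the corresponding dynamic anti-expression used as an adaptor; how the actual stimuli were parameterized is described in the chapter itself.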