
Record


Released

Poster

Facial motion can determine facial identity

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84018

Knappmeyer,  B
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84258

Thornton,  IM
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff,  HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
No external resources are available
Full texts (publicly accessible)
No publicly accessible full texts are available
Supplementary material (publicly accessible)
No publicly accessible supplementary materials are available
Citation

Knappmeyer, B., Thornton, I., & Bülthoff, H. (2001). Facial motion can determine facial identity. Poster presented at First Annual Meeting of the Vision Sciences Society (VSS 2001), Sarasota, FL, USA.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-E17D-7
Abstract
When we speak, laugh or cry, our faces move in complex, non-rigid ways. Can such motion patterns influence our perception of facial identity? To explore this issue we took 3D laser-scanned heads from the MPI database and animated them using motion sequences captured from different human actors. During an incidental learning phase, observers were exposed to FACE A moving with MOTION A and FACE B moving with MOTION B. Test stimuli consisted of two sets of morphed heads (shaded, no texture) ranging in 10 steps from FACE A to FACE B. One set of morphs was animated using MOTION A, the other with MOTION B. Observers were instructed to indicate whether each test face was structurally more similar to FACE A or FACE B. Across all levels of the morph sequence, motion biased the perception of identity. This bias was particularly strong at the 50% morph level, where structural information was completely ambiguous. Here, "FACE A" responses occurred on 80% of trials in which the morph was animated with MOTION A, but on only 40% of trials in which the same morph was animated using MOTION B. We believe these results are the strongest evidence to date that facial motion can be used by observers to determine facial identity. The use of computer animation techniques in conjunction with motion capture technology appears to be a very fruitful direction for future research on dynamic aspects of face processing.
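The 10-step morph sequence described above can be understood as linear interpolation between corresponding vertex positions of the two scanned heads. The following is a minimal sketch of that idea, not the authors' actual pipeline; the vertex lists and the `morph` helper are illustrative assumptions.

```python
def morph(face_a, face_b, level):
    """Linearly interpolate corresponding vertex coordinates.

    face_a, face_b: lists of (x, y, z) vertex tuples in correspondence.
    level: 0.0 yields FACE A, 1.0 yields FACE B, 0.5 the ambiguous midpoint.
    """
    return [
        tuple((1.0 - level) * a + level * b for a, b in zip(va, vb))
        for va, vb in zip(face_a, face_b)
    ]


# Toy two-vertex "heads" standing in for the laser-scanned meshes.
face_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
face_b = [(0.0, 2.0, 0.0), (1.0, 2.0, 2.0)]

# Ten evenly spaced morph levels from FACE A to FACE B.
sequence = [morph(face_a, face_b, i / 9) for i in range(10)]

print(sequence[0])                  # identical to FACE A
print(morph(face_a, face_b, 0.5))   # the 50% (structurally ambiguous) morph
```

At the 50% level every vertex sits exactly halfway between the two source heads, which is why shape alone cannot disambiguate identity there and the applied motion pattern carries the decisive information.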