
Released

Poster

Asymmetrical face perception with in-depth rotated faces

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84180

Ruppertsberg,  AI
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84280

Vetter,  T
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff,  HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Ruppertsberg, A., Vetter, T., & Bülthoff, H. (1999). Asymmetrical face perception with in-depth rotated faces. Poster presented at 2. Tübinger Wahrnehmungskonferenz (TWK 99), Tübingen, Germany.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-E6EB-3
Abstract
Burt and Perrett (1997) showed that subjects' judgments of gender and expression were more influenced by the left than by the right side of the face (viewer's perspective). We investigated whether recognition performance differs for faces rotated to the right or to the left. In the learning stage, subjects were asked to study 10 frontal views of 3D Cyberware head scans, together with their respective names, for ten minutes. Immediately afterwards they were tested in a naming task, in which a face was shown on the computer screen and subjects had to press the corresponding name key on the keyboard. When their error rate dropped below 5 over the last 30 trials, they started the actual experiment; at that stage they had named each face at least three times. In a delayed-match-to-sample task, subjects were presented with a frontal view of a face for 100 ms, followed by a mask for 500 ms, and finally a side view (+/- 30 and 60 deg) of a face, again for 100 ms. The task was to decide whether the two views depicted the same person or not. Subjects were asked to respond as quickly as possible, and their response times and errors were recorded.
In Exp. 1 we found the expected effect of orientation, but also a significant difference between the two directions of rotation: subjects made more errors when the faces looked to the left (viewer's perspective) than when they looked to the right. This held for both familiar and unfamiliar faces. In Exp. 2 we made the heads symmetrical to exclude any effect of facial asymmetry; in the learning stage, the pictures were replaced by pictures of symmetrical faces. For familiar faces we found the same result as in Exp. 1, but for unfamiliar symmetrical heads subjects made more errors when the face was turned to the right. In Exp. 3 we examined whether this result is related to differences in hemispheric processing of faces: the side view could now appear at the fixation cross or +/- 2.6 deg to either side of it. Subjects made fewer errors when the side view was presented at fixation. We found no performance differences depending on the side of the visual field, but rather differences depending on the side of the face. When generalizing to a novel view of a face, object-relevant information seems to play a more important role than the specialized processing capabilities of the hemispheres.