
Released

Poster

What are the properties underlying similarity judgments of facial expressions?

MPS-Authors

Kaulard, K
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

de la Rosa, S
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Project group: Cognitive Engineering, Max Planck Institute for Biological Cybernetics, Max Planck Society

Schultz, J
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Fernandez Cruz, AL
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Wallraven, C
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Citation

Kaulard, K., de la Rosa, S., Schultz, J., Fernandez Cruz, A., Bülthoff, H., & Wallraven, C. (2011). What are the properties underlying similarity judgments of facial expressions? Poster presented at 34th European Conference on Visual Perception (ECVP 2011), Toulouse, France.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-BA9C-7
Abstract
Similarity ratings are used to investigate the cognitive representation of facial expressions. The perceptual and cognitive properties (e.g., physical aspects, motor expressions, action tendencies) driving similarity judgments of facial expressions are largely unknown. We examined potentially important properties with 27 questions addressing the emotional and conversational content of expressions (semantic differential). The ratings on these semantic differentials were used as predictors for facial expression similarity ratings. The semantic-differential and similarity-rating tasks were performed on the same set of facial expression videos: 6 types of emotional (e.g., happy) and 6 types of conversational (e.g., don't understand) expressions. Different sets of participants performed the two tasks. Multiple regression was used to predict the similarity data from the semantic differential questions. The best model for emotional expressions consisted of two emotional questions explaining 75% of the variation in similarity ratings. The same model explained significantly less variation for conversational expressions (38%). The best model for those expressions consisted of a single conversational question explaining 44% of the variation. This study shows which properties of facial expressions might affect their perceived similarity. Moreover, our results suggest that different perceptual and cognitive properties might underlie similarity judgments about emotional and conversational expressions.
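The analysis described above (predicting pairwise similarity ratings from semantic-differential ratings via multiple regression, and reporting the variance explained) can be sketched as follows. This is an illustrative sketch only, not the authors' code: the data, the number of expression pairs, and the regression weights below are all synthetic placeholders.

```python
import numpy as np

# Illustrative sketch of the abstract's analysis (synthetic data, not the
# authors' code): fit a multiple regression predicting pairwise similarity
# ratings from semantic-differential ratings, then report R^2.

rng = np.random.default_rng(0)

n_pairs = 66        # hypothetical: all unordered pairs of 12 expressions
n_predictors = 2    # e.g., two semantic-differential questions

# Synthetic semantic-differential predictors, one row per expression pair
X = rng.normal(size=(n_pairs, n_predictors))

# Synthetic similarity ratings generated from the predictors plus noise
true_beta = np.array([1.5, -0.8])
y = X @ true_beta + rng.normal(scale=0.5, size=n_pairs)

# Ordinary least squares with an intercept term
X_design = np.column_stack([np.ones(n_pairs), X])
beta, *_ = np.linalg.lstsq(X_design, y, rcond=None)

# Explained variance: R^2 = 1 - SS_res / SS_tot
residuals = y - X_design @ beta
r_squared = 1.0 - residuals.var() / y.var()
print(f"R^2 = {r_squared:.2f}")
```

Comparing such R^2 values across expression categories (as the abstract does for emotional versus conversational expressions) indicates how well a given set of semantic properties accounts for perceived similarity in each category.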