
Released

Journal Article

Neuroperception: Facial expressions linked to monkey calls

MPS-Authors

Ghazanfar, AA
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Logothetis, NK
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Ghazanfar, A., & Logothetis, N. (2003). Neuroperception: Facial expressions linked to monkey calls. Nature, 423(6943), 937-938. doi:10.1038/423937a.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-DC45-8
Abstract
The perception of human speech can be enhanced by a combination of auditory and visual signals [1, 2]. Animals sometimes accompany their vocalizations with distinctive body postures and facial expressions [3], although it is not known whether their interpretation of these signals is unified. Here we use a paradigm in which 'preferential looking' is monitored to show that rhesus monkeys (Macaca mulatta), a species that communicates by means of elaborate facial and vocal expression [4-7], are able to recognize the correspondence between the auditory and visual components of their calls. This crossmodal identification of vocal signals by a primate might represent an evolutionary precursor to humans' ability to match spoken words with facial articulation.