
Released

Journal Article

Neuroperception: Facial expressions linked to monkey calls

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83932

Ghazanfar, AA
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84063

Logothetis, NK
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Locator
There are no locators available
Fulltext (public)
There are no public fulltexts available
Supplementary Material (public)
There is no public supplementary material available
Citation

Ghazanfar, A., & Logothetis, N. (2003). Neuroperception: Facial expressions linked to monkey calls. Nature, 423(6943), 937-938. doi:10.1038/423937a.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-DC45-8
Abstract
The perception of human speech can be enhanced by a combination of auditory and visual signals [1, 2]. Animals sometimes accompany their vocalizations with distinctive body postures and facial expressions [3], although it is not known whether their interpretation of these signals is unified. Here we use a paradigm in which 'preferential looking' is monitored to show that rhesus monkeys (Macaca mulatta), a species that communicates by means of elaborate facial and vocal expression [4-7], are able to recognize the correspondence between the auditory and visual components of their calls. This crossmodal identification of vocal signals by a primate might represent an evolutionary precursor to humans' ability to match spoken words with facial articulation.