
Released

Journal Article

Multisensory interactions in primate auditory cortex: fMRI and electrophysiology

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84006

Kayser,  C
Research Group Physiology of Sensory Integration, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84136

Petkov,  CI
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84063

Logothetis,  NK
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Kayser, C., Petkov, C., & Logothetis, N. (2009). Multisensory interactions in primate auditory cortex: fMRI and electrophysiology. Hearing Research, 258(1-2), 80-88. doi:10.1016/j.heares.2009.02.011.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-C1C8-E
Abstract
Recent studies suggest that cross-modal integration occurs not only in higher association cortices but also at early stages of auditory processing, possibly in primary or secondary auditory cortex. Support for such early cross-modal influences comes from functional magnetic resonance imaging experiments in humans and monkeys. However, we argue that the current understanding of neurovascular coupling and of the neuronal basis of the imaging signal does not permit direct extrapolation from imaging data to the properties of neurons in the same region. While imaging can guide subsequent electrophysiological studies, only the latter can determine whether and how neurons in auditory cortices combine information from multiple modalities. Indeed, electrophysiological studies only partly confirm the imaging findings: while recordings of field potentials reveal strong influences of visual or somatosensory stimulation on synaptic activity even in primary auditory cortex, single-unit studies find that only a small minority of neurons is influenced by non-acoustic stimuli. We propose the analysis of the information-coding properties of individual neurons as one way to quantitatively determine whether the representation of our acoustic environment in (primary) auditory cortex indeed benefits from multisensory input.