
Record

Released

Poster

Cross-modal integration of sensory information in auditory cortex

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84006

Kayser, C
Research Group Physiology of Sensory Integration, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84136

Petkov, C
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83787

Augath, M
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84063

Logothetis, NK
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External resources
No external resources are available
Full texts (publicly accessible)
No publicly accessible full texts are available
Supplementary material (publicly accessible)
No publicly accessible supplementary materials are available
Citation

Kayser, C., Petkov, C., Augath, M., & Logothetis, N. (2007). Cross-modal integration of sensory information in auditory cortex. Poster presented at 31st Göttingen Neurobiology Conference, Göttingen, Germany.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-CE31-3
Abstract
Traditionally it is assumed that information from different sensory systems merges in higher association cortices. In contrast to this belief, we demonstrate cross-modal integration in primary and secondary auditory cortex. Using a combination of high-resolution functional magnetic resonance imaging (fMRI) and electrophysiological recordings in macaque monkeys, we quantify the integration of visual and tactile stimulation with auditory processing. Integration manifests as an enhancement of activity that exceeds a simple linear superposition of responses, i.e., auditory activity is enhanced by the simultaneous presentation of non-auditory stimuli. Audio-somatosensory integration is reliably found at the caudal end and along the lateral side of the secondary auditory cortex. Regions with significant integration respond to auditory stimulation, but only a few respond to somatosensory stimulation. Yet combining both stimuli significantly enhances responses. This enhancement obeys the classical rules for cross-modal integration: it occurs only for temporally coincident stimuli and follows the principle of inverse effectiveness, whereby integration is stronger for less effective stimuli. Audio-visual integration is similarly found along the caudal end of the temporal plane in secondary auditory cortex, but also extends into primary auditory fields. Complementing these results from functional imaging, enhancement of neuronal activity is found in electrophysiological recordings of single-neuron and population responses. Hence, we conclude that cross-modal integration can occur very early in the processing hierarchy, at the earliest stage of auditory processing in the cortex. Further, this multisensory integration occurs pre-attentively, as demonstrated in anaesthetized animals. Such early integration might be necessary for quick and consistent interpretation of our world and might explain multisensory illusions in which a stimulus perceived by one modality is altered by a stimulus in another modality.
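The superadditivity criterion the abstract refers to (a bimodal response "exceeds a simple linear superposition" of the unimodal responses) can be sketched as follows. This is an illustrative metric in the spirit of the multisensory-integration literature, not the authors' actual analysis code; the function names and the example response values are hypothetical.

```python
def is_superadditive(resp_auditory, resp_other, resp_bimodal):
    """True if the combined response exceeds the linear sum of
    the two unimodal responses (the integration criterion the
    abstract describes)."""
    return resp_bimodal > resp_auditory + resp_other

def enhancement_index(resp_auditory, resp_other, resp_bimodal):
    """Percent enhancement of the bimodal response relative to the
    linear sum of unimodal responses (illustrative metric)."""
    linear_sum = resp_auditory + resp_other
    return 100.0 * (resp_bimodal - linear_sum) / linear_sum

# Hypothetical values: a moderate auditory response, a weak tactile
# response, and a combined response exceeding their sum.
print(is_superadditive(1.0, 0.2, 1.5))   # True
print(enhancement_index(1.0, 0.2, 1.5))  # 25.0
```

Under this criterion, the principle of inverse effectiveness predicts that the enhancement index grows as the unimodal responses become weaker.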