
Record


Released

Poster

Multisensory interactions in auditory cortex

MPG Authors

Kayser, C
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Petkov, CI
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Augath, M
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Logothetis, NK
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
There are no freely accessible full texts available in PuRe.
Supplementary material (freely accessible)
No freely accessible supplementary materials are available.
Citation

Kayser, C., Petkov, C., Augath, M., & Logothetis, N. (2007). Multisensory interactions in auditory cortex. Poster presented at 37th Annual Meeting of the Society for Neuroscience (Neuroscience 2007), San Diego, CA, USA.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-CB3B-9
Abstract
An increasing body of literature provides compelling evidence that sensory convergence occurs not only in higher association areas, but also in lower sensory regions and even in primary sensory cortices. To scrutinize these early cross-modal interactions, we use the macaque auditory cortex as a model and employ combinations of high-resolution functional imaging (fMRI) and electrophysiological recordings.
Using functional imaging in alert and anaesthetized animals, we reported that only caudal auditory fields are susceptible to cross-modal modulation: the fMRI-BOLD response in these regions was enhanced when auditory stimuli were complemented by simultaneous visual or tactile stimulation [see Kayser et al., Neuron 48, 2005, and J. Neurosci. 27(8), 2007]. To investigate the neuronal basis of this cross-modal enhancement, we recorded local field potentials and single-unit activity in alert animals watching complex audio-visual scenes.
Our results show the following: visual stimuli by themselves, on average, do not drive auditory neurons, but they do evoke responses in low-frequency LFPs. Combining visual and auditory stimuli leads to enhanced responses in the low-frequency LFP, but to a reduction of firing rates. This audio-visual interaction was significant at the population level, and for about 10 of the neurons when tested individually. The interaction occurs only for well-timed visual stimuli, is strongest when the visual stimulus leads the auditory stimulus by 20-80 ms, and is independent of the image structure of the visual stimulus. Similar visual modulation was found in both the auditory core and belt.
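
To make the population-level comparison described above concrete, here is a minimal sketch of an additive-model interaction analysis on simulated data. This is purely illustrative: the arrays, numbers, and the paired t-test are assumptions made for the sketch, not the analysis actually used in the study.

    import numpy as np
    from scipy import stats

    # Hypothetical per-site responses (e.g., trial-averaged low-frequency
    # LFP power) under three conditions: auditory alone (A), visual alone
    # (V), and combined audio-visual (AV). All values are simulated.
    rng = np.random.default_rng(0)
    n_sites = 60
    resp_a = rng.normal(1.0, 0.2, n_sites)    # auditory-alone response
    resp_v = rng.normal(0.1, 0.05, n_sites)   # weak visual-alone response
    resp_av = resp_a + rng.normal(0.15, 0.1, n_sites)  # combined response

    # Additive-model interaction: AV versus the sum of the unisensory
    # responses. Positive values indicate cross-modal enhancement (as for
    # the low-frequency LFP here); negative values would correspond to
    # the firing-rate suppression described in the text.
    interaction = resp_av - (resp_a + resp_v)

    # Population-level test: paired comparison of AV against A + V.
    t_stat, p_val = stats.ttest_rel(resp_av, resp_a + resp_v)
    print(f"mean interaction = {interaction.mean():.3f}, p = {p_val:.3g}")

    # Sign of the effect per site under this toy model (in practice each
    # site would be tested individually across trials).
    enhanced = np.sum(interaction > 0)
    print(f"{enhanced}/{n_sites} sites show enhancement")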
Our findings point to a very basic, stimulus-unspecific visual input to auditory cortex and clearly support the notion that early sensory cortices are susceptible to cross-modal interactions. In particular, the finding that visual stimuli modulate the firing rates of individual neurons in auditory cortex suggests that the messages transmitted from these regions to higher processing stages reflect not only the acoustic stimuli but also depend on their visual context.