Record


Released

Journal Article

Early visual and auditory processing rely on modality-specific attentional resources

MPG Authors

Maess, Burkhard
Methods and Development Unit MEG and EEG: Signal Analysis and Modelling, MPI for Human Cognitive and Brain Sciences, Max Planck Society;

External Resources
No external resources are available for this record
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (publicly accessible)
No publicly accessible full texts are available in PuRe
Supplementary Material (publicly accessible)
No publicly accessible supplementary materials are available
Citation

Keitel, C., Maess, B., Schröger, E., & Müller, M. M. (2013). Early visual and auditory processing rely on modality-specific attentional resources. NeuroImage, 70, 240-249. doi:10.1016/j.neuroimage.2012.12.046.


Citation link: https://hdl.handle.net/11858/00-001M-0000-000E-F40F-F
Abstract
Many everyday situations require focusing on visual or auditory information while ignoring the other modality. Previous findings suggest an attentional mechanism that operates between sensory modalities and governs such states. To date, evidence is equivocal as to whether this ‘intermodal’ attention draws on a common pool of resources or on resources specific to each sensory modality. We provide new insights by investigating the consequences of a shift from simultaneous (‘bimodal’) attention to vision and audition to unimodal selective attention. Concurrently presented visual and auditory stimulus streams were frequency-tagged to elicit steady-state responses (SSRs) recorded simultaneously in electro- and magnetoencephalograms (EEG/MEG). After the shift, decreased amplitudes of the SSR corresponding to the now-unattended sensory stream indicated reduced processing, whereas we observed no amplitude increase of the SSR corresponding to the attended sensory stream. These findings are incompatible with a common-resources account: a redistribution of attentional resources between vision and audition would have produced a simultaneous processing gain in the attended and a reduction in the unattended sensory modality. Our results instead favor a modality-specific-resources account, which allows early cortical processing in each sensory modality to be modulated independently.
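
To make the frequency-tagging logic concrete, here is a minimal Python sketch (not from the paper): the SSR amplitude of each stimulus stream is read off the amplitude spectrum of an epoch at that stream's tagging frequency. The sampling rate, epoch length, and the two tagging frequencies below are illustrative assumptions, and the signal is synthetic rather than recorded EEG/MEG data.

import numpy as np

# Minimal sketch of frequency-tagged SSR amplitude estimation.
# All parameter values are illustrative assumptions, not the values
# used by Keitel et al. (2013).
fs = 500.0          # sampling rate in Hz (assumed)
epoch_len = 2.0     # epoch length in seconds (assumed)
f_visual = 10.0     # hypothetical visual tagging frequency (Hz)
f_auditory = 40.0   # hypothetical auditory tagging frequency (Hz)

t = np.arange(0.0, epoch_len, 1.0 / fs)

# Synthetic single-channel epoch: two tagged oscillations plus noise,
# standing in for a recorded EEG/MEG trace.
rng = np.random.default_rng(0)
signal = (1.0 * np.sin(2 * np.pi * f_visual * t)
          + 0.5 * np.sin(2 * np.pi * f_auditory * t)
          + rng.normal(scale=1.0, size=t.size))

# Amplitude spectrum via FFT; the SSR amplitude of each stream is the
# spectral amplitude at its tagging frequency (resolution = 1 / epoch_len).
spectrum = 2.0 * np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

for label, f in (("visual", f_visual), ("auditory", f_auditory)):
    idx = int(np.argmin(np.abs(freqs - f)))
    print(f"SSR amplitude at {f:.1f} Hz ({label} tag): {spectrum[idx]:.3f}")

Comparing such per-modality amplitude estimates before and after an attention shift is the kind of contrast the abstract describes.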