
Released

Poster

Natural movie stimuli allow mapping of retinotopy and tonotopy in anesthetized monkey cortex

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83797

Bartels, A.
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84091

Maugath, Moutoussis, K.
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84063

Logothetis, N.
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
There are no external resources available
Full texts (publicly accessible)
There are no publicly accessible full texts available
Supplementary material (publicly accessible)
There are no publicly accessible supplementary materials available
Citation

Bartels, A., Maugath, Moutoussis, K., Zeki, S., & Logothetis, N. (2006). Natural movie stimuli allow mapping of retinotopy and tonotopy in anesthetized monkey cortex. Poster presented at AREADNE 2006: Research in Encoding and Decoding of Neural Ensembles, Santorini, Greece.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-D179-1
Abstract
In traditional functional magnetic resonance imaging (fMRI), carefully controlled stimuli are used to reveal cortical regions that are differentially responsive to distinct stimuli. In human fMRI studies we have shown that the varying intensity of features seen in a movie, such as faces or color, can be used to map feature-selective regions, such as the human V4 complex for color, or superior temporal regions (STS) and lateral fusiform cortex (FFA) for faces (Bartels & Zeki, 2004). Here we applied the same paradigm in the anesthetized monkey to identify regions involved in processing various low- and high-level features. The advantage of this approach is that effects of attention or eye movements can be excluded. In early visual cortex (V1-V3) we found that the BOLD signal was predicted both by changes in frame-by-frame pixel intensities (luminance changes) and by image contrast. These two measures were not correlated with each other in our movie stimulus; early visual cortex thus seems to code for two independent stimulus dimensions. Responses to each were so specific that we were able to obtain retinotopic maps by correlating voxel time series with the time series of either stimulus dimension as a function of its spatial location in the movie display. In contrast, color and face variations correlated most with BOLD signal changes in V4 and in the STS. In auditory cortex, we were able to obtain tonotopic maps from the movie soundtrack, by correlating sound intensities at different frequencies with the BOLD signal of every voxel. Our results illustrate that, in monkey as in man, movies, even though uncontrolled, allow surprisingly specific mapping of high- as well as low-level features, down to retinotopy and tonotopy.
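The mapping approach described above rests on a simple operation: correlating each voxel's BOLD time series with the time series of a stimulus feature (e.g. local luminance change at one position in the movie). As a minimal illustrative sketch only (not the authors' actual analysis pipeline; the function name, array shapes, and toy data are assumptions for illustration), a vectorized Pearson correlation over all voxels might look like this:

```python
import numpy as np

def correlation_map(voxel_ts, feature_ts):
    """Pearson correlation of each voxel's time series with one
    stimulus-feature time series (hypothetical helper).

    voxel_ts   : array of shape (n_voxels, n_timepoints)
    feature_ts : array of shape (n_timepoints,)
    Returns an array of shape (n_voxels,) with one r value per voxel.
    """
    # Mean-center both signals, then compute r = cov / (sd_v * sd_f)
    v = voxel_ts - voxel_ts.mean(axis=1, keepdims=True)
    f = feature_ts - feature_ts.mean()
    num = v @ f
    den = np.sqrt((v ** 2).sum(axis=1) * (f ** 2).sum())
    return num / den

# Toy data: voxel 0 tracks the feature closely, voxel 1 is unrelated noise.
rng = np.random.default_rng(0)
feature = rng.standard_normal(200)
voxels = np.vstack([
    feature + 0.1 * rng.standard_normal(200),  # feature-driven voxel
    rng.standard_normal(200),                  # unrelated voxel
])
r = correlation_map(voxels, feature)
```

Repeating this for feature time series taken from different spatial positions in the movie display (or different frequency bands of the soundtrack) and assigning each voxel the position of its best-correlating feature yields a retinotopic (or tonotopic) map in the spirit of the analysis the abstract describes.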