
Released

Poster

Neural correlates of visual self-motion cues and visual pursuit investigated using fMRI

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83912

Fischer,  E
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff,  HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84063

Logothetis,  NK
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83797

Bartels,  A
Department Physiology of Cognitive Processes, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
There are no external resources available
Full texts (freely accessible)
There are no freely accessible full texts available
Supplementary Material (freely accessible)
There is no freely accessible supplementary material available
Citation

Fischer, E., Bülthoff, H., Logothetis, N., & Bartels, A. (2008). Neural correlates of visual self-motion cues and visual pursuit investigated using fMRI. Poster presented at 38th Annual Meeting of the Society for Neuroscience (Neuroscience 2008), Washington, DC, USA.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-C679-7
Abstract
Successful estimation of self-motion from visual cues requires taking self-induced motion signals into account, such as those produced by eye movements. In this fMRI study we used stimulus conditions that allowed us to differentiate neural responses to (a) retinal motion, (b) eye movements (visual pursuit), and (c) objective motion. Responses to these three motion cues were measured in the context of two types of visual stimuli, namely moving 2D dot sheets and 3D expanding flow fields. An additional localizer experiment segregated responses to contra- and ipsilateral stimulation, as well as to full-field coherent expansion as opposed to trajectory-matched scrambled random motion. We found that MT/V5 and MST responded primarily to retinal motion and to eye movements. More parietal regions such as V7 and the IPS (intraparietal sulcus), along with a region recently implicated in self-motion processing, the cingulate sulcus visual area (CSv), appear to be driven by all three motion cues. The localizer experiment revealed that all of these regions responded almost exclusively to coherent motion types, while MT+/V5+ also responded, though less strongly, to the matched random motion display. CSv differed from all other regions in that it favored 2D translational coherent motion over 3D expanding flow fields, and in that its responses to ipsi- and contralateral flow were indistinguishable. It thus appears to be a strong candidate for integrating translational motion signals of retinal and non-retinal origin. Area V3A/B differed from most other motion-processing regions in that it was primarily affected by objective motion and, to a lesser extent, by visual pursuit. Furthermore, in the localizer it responded equally to coherent 3D flow and to the random motion stimulus. This suggests that V3A/B processes differential rather than coherent or self-induced motion.
Our results lead us to suggest that there is a clear functional segregation among higher-level motion-processing regions in the context of self-motion cues. It remains to be resolved to what extent these distinct regions interoperate in a hierarchical or rather a parallel fashion.