Record


Released

Poster

Multisensory integration in self-motion

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84378

Campos, J.
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
There are no external resources available
Full texts (publicly accessible)
There are no publicly accessible full texts available
Supplementary material (publicly accessible)
There is no publicly accessible supplementary material available
Citation

Sun, H., Campos, J., Chan, G. S. W., Zhang, D., & Lee, A. (2003). Multisensory integration in self-motion. Poster presented at the Third Annual Meeting of the Vision Sciences Society (VSS 2003), Sarasota, FL, USA.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-DB75-4
Abstract
We assessed the relative contributions of visual and proprioceptive/motor information during self-motion using a distance discrimination task in virtual reality. Subjects (Ss) wore a head-mounted display and rode a stationary bicycle along a straight path in an empty, seemingly infinite hallway with random surface texture. During each trial, Ss traversed two distances, a standard and a comparison, and then reported whether the second distance was longer than the first. The standard distance remained fixed while the comparison distance was varied according to the method of constant stimuli. Visual-proprioceptive incongruency was created in software by varying the optic flow gain (OFG) between the two distances within a trial. If Ss relied exclusively on vision or exclusively on proprioception, OFG variations would lead to different estimates. When OFG was varied between three different magnitudes, three separate psychometric functions were observed, indicating that Ss used a weighted average of visual and proprioceptive cues. The magnitude of the separation between the three psychometric functions depended on the size of the perceptual conflict. Distance discrimination was also affected by whether OFG was varied during the comparison and/or the standard distance. When OFG was varied only in the comparison distance, responses seemed to indicate that visual and proprioceptive cues contributed about equally to the final estimate. However, when OFG was varied in both the standard and comparison distances, Ss appeared to rely predominantly on vision. These results are reminiscent of the concepts underlying the statistical optimization model, which predicts that sensory information from multiple sources is weighted according to the estimated reliability of each cue. Our results suggest that, across trials, the stability or variability of a particular cue contributes to how it is weighted during sensory integration.
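
The weighting scheme the abstract refers to can be made concrete with a small model. The following is a minimal sketch, not taken from the poster: it assumes a reliability-weighted (inverse-variance) combination of visual and proprioceptive distance estimates and a cumulative-Gaussian psychometric function; the function names, gains, noise levels, and distances are hypothetical values chosen purely for illustration.

# Illustrative sketch (assumed, not reported in the poster) of a
# reliability-weighted cue-combination model of the kind the
# abstract invokes. All parameter values are hypothetical.

from math import erf, sqrt

def combined_estimate(d_visual, d_proprio, sigma_v, sigma_p):
    """Weighted average of two cues; weights are inverse variances,
    as in the statistical optimization (MLE) model."""
    w_v = (1.0 / sigma_v**2) / (1.0 / sigma_v**2 + 1.0 / sigma_p**2)
    return w_v * d_visual + (1.0 - w_v) * d_proprio

def p_second_longer(standard, comparison, gain, sigma_v=1.0, sigma_p=1.0):
    """Probability of judging the comparison longer than the standard,
    modelled as a cumulative Gaussian over the difference between the
    two combined estimates (a simple psychometric function). OFG is
    varied only in the comparison interval here."""
    est_std = combined_estimate(standard, standard, sigma_v, sigma_p)
    est_cmp = combined_estimate(gain * comparison, comparison, sigma_v, sigma_p)
    sigma_diff = sqrt(2.0)  # assumed decision noise
    z = (est_cmp - est_std) / sigma_diff
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Varying optic flow gain shifts the perceived comparison distance and
# therefore shifts the psychometric function: three gains, three curves.
for gain in (0.5, 1.0, 2.0):
    p = p_second_longer(standard=10.0, comparison=10.0, gain=gain)
    print(f"gain={gain}: P('second longer') at equal distances = {p:.2f}")

Under these assumptions, with equal cue reliabilities the perceived comparison distance shifts by half of the visual discrepancy, which is one way to read the finding that the two cues contributed about equally when OFG was varied only in the comparison distance; lowering the assumed proprioceptive reliability shifts the weight toward vision, as in the condition where OFG was varied in both intervals.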