
Released

Poster

Multisensory integration in self-motion

Citation

Sun, H., Campos, J., Chan, G., Zhang, D., & Lee, A. (2003). Multisensory integration in self-motion. Poster presented at Third Annual Meeting of the Vision Sciences Society (VSS 2003), Sarasota, FL, USA.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-DB75-4
Abstract
We assessed the relative contributions of visual and proprioceptive/motor information during self-motion using a distance discrimination task in virtual reality. Subjects (Ss) wore a head-mounted display and rode a stationary bicycle along a straight path in an empty, seemingly infinite hallway with random surface texture. During each trial, Ss traversed two distances: a standard distance and a comparison distance, and subsequently reported whether the second distance was longer than the first distance. The standard distance remained fixed while the comparison distance was varied according to the method of constant stimuli. Visual and proprioceptive incongruency was created through software by varying the optic flow gain (OFG) between the two distances within a trial. If Ss relied exclusively on vision or exclusively on proprioception, OFG variations would lead to different estimates. When OFG was varied between three different magnitudes, three separate psychometric functions were observed, indicating that Ss used the weighted average of visual and proprioceptive cues. The magnitude of the separation between the three psychometric functions depended upon the size of the perceptual conflict. Distance discriminations were also affected by whether OFG was varied during the comparison and/or standard distance. When OFG was only varied in the comparison distance, responses seemed to indicate that visual and proprioceptive cues contributed about equally to the final estimate. However, when OFG was varied in both the standard and comparison distances, Ss appeared to predominantly use vision. These results are reminiscent of the concepts underlying the statistical optimization model, which predicts that sensory information from multiple sources is weighted according to the estimated reliability of each cue. Our results suggest that across trials, the stability or variability of a particular cue contributes to how it is weighted during sensory integration.
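The statistical optimization model mentioned above predicts that cues are combined as a weighted average, with each cue weighted by its estimated reliability (inverse variance). A minimal sketch of that rule, with illustrative placeholder values rather than data from the study:

```python
# Sketch of reliability-weighted (inverse-variance) cue combination, the
# rule underlying the statistical optimization model described in the
# abstract. All numbers below are illustrative placeholders, not results
# from the experiment.

def combine_cues(estimates, variances):
    """Combine cue estimates, weighting each by its inverse variance.

    Returns the combined estimate, the normalized weights, and the
    variance of the combined estimate.
    """
    inv_vars = [1.0 / v for v in variances]
    total = sum(inv_vars)
    weights = [iv / total for iv in inv_vars]
    combined = sum(w * e for w, e in zip(weights, estimates))
    # The combined estimate is more reliable (lower variance) than
    # either cue alone.
    combined_variance = 1.0 / total
    return combined, weights, combined_variance

# Hypothetical visual and proprioceptive distance estimates (arbitrary
# units). With equal variances the cues contribute equally; inflating the
# proprioceptive variance shifts the weight toward vision, mirroring the
# vision-dominated responses when optic flow gain varied in both intervals.
equal = combine_cues([10.0, 12.0], [1.0, 1.0])
vision_weighted = combine_cues([10.0, 12.0], [1.0, 3.0])
```

With equal variances the combined estimate is the simple mean (11.0, weights 0.5/0.5); tripling the second cue's variance pulls the estimate toward the first cue (10.5, weights 0.75/0.25).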