

Meeting Abstract

The importance of body-based cues for travelled distance perception

MPS-Authors

Campos, J
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Butler, J
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Citation

Campos, J., Butler, J., & Bülthoff, H. (2009). The importance of body-based cues for travelled distance perception. Journal of Vision, 9(8), 1144.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-C3D1-8
Abstract
When moving through space, both dynamic visual information (i.e. optic flow) and body-based cues (i.e. proprioceptive and vestibular) jointly specify the extent of a travelled distance. Little is currently known about the relative contributions of each of these cues when several are simultaneously available. In this series of experiments, participants travelled a predefined distance and subsequently reproduced it by adjusting a visual target until the self-to-target distance matched the distance they had moved. Visual information was presented through a head-mounted display and consisted of a long, richly textured, virtual hallway. Body-based cues were provided either by (A) natural walking in a fully tracked free-walking space (proprioception and vestibular), (B) being passively moved by a robotic wheelchair (vestibular), or (C) walking in place on a treadmill (proprioception). Distances were presented through vision alone, body-based cues alone, or both visual and body-based cues combined. In the combined condition, the visually specified distances were either congruent (1.0x) or incongruent (0.7x/1.4x) with the distances specified by body-based cues. Incongruencies were created by changing either the visual gain or the proprioceptive gain (during treadmill walking). Further, to obtain a measure of “perceptual congruency” between visual and body-based cues, participants adjusted the rate of optic flow during walking until it matched the proprioceptive information; this value was then used as the basis for later congruent-cue trials. Overall, the results demonstrate a higher weighting of body-based cues during natural walking, a higher weighting of proprioceptive information during treadmill walking, and an equal weighting of visual and vestibular cues during passive movement. These results were not affected by whether the visual or the proprioceptive gain was manipulated. Adopting the obtained measure of perceptual congruency for each participant also did not change the conclusions: proprioceptive cues continued to be weighted more heavily.
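
As a rough illustration of what “weighting” means here, the sketch below assumes a simple linear cue-combination model in which the reproduced distance is a weighted average of the body-specified and visually specified distances. The function name, the weight value, and the gain comparison are illustrative assumptions for exposition, not details reported in the abstract.

# Minimal sketch, assuming (not taken from the abstract) a linear
# cue-combination model: the reproduced distance is a weighted average
# of the body-specified and visually specified distances.

def reproduced_distance(d_body, visual_gain, w_body):
    # Vision shows visual_gain * d_body; body cues signal d_body itself.
    d_visual = visual_gain * d_body
    w_visual = 1.0 - w_body
    return w_body * d_body + w_visual * d_visual

# Incongruent trials (visual gain 0.7x or 1.4x) reveal the weighting:
# if body cues dominate (w_body near 1), reproductions barely shift with gain;
# if vision dominates (w_body near 0), reproductions follow the visual gain.
for gain in (0.7, 1.0, 1.4):
    print(gain, reproduced_distance(d_body=10.0, visual_gain=gain, w_body=0.8))

Under this hypothetical model, comparing reproductions across the 0.7x, 1.0x, and 1.4x trials is what would allow the relative cue weights to be estimated for each movement condition.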