
Released

Journal Article

Contributions of visual and proprioceptive information to travelled distance estimation during changing sensory congruencies

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83842

Butler, JS
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Locator
There are no locators available
Fulltext (public)
There are no public fulltexts available
Supplementary Material (public)
There is no public supplementary material available
Citation

Campos, J., Butler, J., & Bülthoff, H. (2014). Contributions of visual and proprioceptive information to travelled distance estimation during changing sensory congruencies. Experimental Brain Research, 232(10), 3277-3289. doi:10.1007/s00221-014-4011-0.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0027-7FB1-5
Abstract
Recent research has provided evidence that visual and body-based cues (vestibular, proprioceptive and efference copy) are integrated using a weighted linear sum during walking and passive transport. However, little is known about the specific weighting of visual information when combined with proprioceptive inputs alone, in the absence of vestibular information about forward self-motion. Therefore, in this study, participants walked in place on a stationary treadmill while dynamic visual information was updated in real time via a head-mounted display. The task required participants to travel a predefined distance and subsequently match this distance by adjusting an egocentric, in-depth target using a game controller. Travelled distance information was provided either through visual cues alone, proprioceptive cues alone or both cues combined. In the combined cue condition, the relationship between the two cues was manipulated by either changing the visual gain across trials (0.7×, 1.0×, 1.4×; Exp. 1) or the proprioceptive gain across trials (0.7×, 1.0×, 1.4×; Exp. 2). Results demonstrated an overall higher weighting of proprioception over vision. These weights were scaled, however, as a function of which sensory input provided more stable information across trials. Specifically, when visual gain was constantly manipulated, proprioceptive weights were higher than when proprioceptive gain was constantly manipulated. These results therefore reveal interesting characteristics of cue-weighting within the context of unfolding spatio-temporal cue dynamics.
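The weighted linear sum mentioned above is conventionally instantiated as reliability-based (inverse-variance) weighting of the single-cue estimates. The sketch below illustrates that standard model only; the variances and distances are invented for illustration and are not values estimated in this study.

```python
# Minimal sketch of reliability-weighted linear cue combination
# (the standard model; not the specific weights fitted in this paper).

def combine_estimates(d_vis, var_vis, d_prop, var_prop):
    """Combine visual and proprioceptive travelled-distance estimates.

    Each cue's weight is proportional to its reliability (inverse
    variance), so the more stable cue dominates the combined estimate.
    """
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_prop)
    w_prop = 1.0 - w_vis
    d_combined = w_vis * d_vis + w_prop * d_prop
    # The combined estimate has lower variance than either cue alone.
    var_combined = 1.0 / (1.0 / var_vis + 1.0 / var_prop)
    return d_combined, var_combined, w_vis, w_prop

# Illustrative case: proprioception has half the variance of vision,
# so it receives twice the weight (2/3 vs. 1/3).
d, v, w_vis, w_prop = combine_estimates(10.0, 2.0, 12.0, 1.0)
```

Under this scheme, making one cue less stable across trials (as the gain manipulations here do) inflates its effective variance and shifts weight toward the other cue, consistent with the pattern of results reported above.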