


Released

Poster

High-precision capture of perceived velocity during passive translations

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84853

Siegle, JH
Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84378

Campos, JL
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84088

Mohler, BJ
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84874

Loomis, JM
Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Siegle, J., Campos, J., Mohler, B., Loomis, J., & Bülthoff, H. (2008). High-precision capture of perceived velocity during passive translations. Poster presented at 8th Annual Meeting of the Vision Sciences Society (VSS 2008), Naples, FL, USA.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-C925-7
Abstract
Although self-motion perception is believed to rely heavily on visual cues, the inertial system also provides valuable information about movement through space. How the brain integrates inertial signals to update position can be better understood through a detailed characterization of self-motion perception during passive transport. In this study, we employed an intuitive method for measuring the perception of self-motion in real-world coordinates. Participants were passively translated by a robotic wheelchair in the absence of visual and auditory cues. The traveled trajectories consisted of twelve straight paths, five to six meters in length, each with a unique velocity profile. As participants moved, they pointed continuously toward a stationary target viewed at the beginning of each trial. By using an optical tracking system to measure the position of a hand-held pointing device, we were able to calculate participants' perceived locations with a high degree of spatial and temporal precision. Differentiating perceived location yielded absolute instantaneous perceived velocity (in units of meters per second), a variable that, to the best of our knowledge, has not previously been measured. Results indicate that pointing behavior is updated as a function of changes in wheelchair velocity, and that this behavior reflects differences in starting position relative to the target. During periods of constant, nonzero velocity, the perceived velocity of all participants decreases systematically over the course of the trajectory. This suggests that the inertial signal is integrated in a leaky fashion, even during the relatively short paths used in this experiment. This methodology allows us to characterize such nonveridical aspects of self-motion perception with more precision than has been achieved in the past. The continuous-pointing paradigm used here can also be effectively adapted for use in other research domains, including spatial updating, vection, and visual-vestibular integration.
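The differentiation step described in the abstract — converting a time series of perceived locations into instantaneous perceived velocity in m/s — can be sketched numerically. The following is a minimal illustration, not the authors' actual analysis pipeline; the function name, sampling rate, and example trajectory are assumptions for demonstration only.

```python
import numpy as np

def perceived_velocity(t, perceived_position):
    """Estimate instantaneous perceived velocity (m/s) by numerically
    differentiating perceived position (m) with respect to time (s).

    Uses central differences in the interior and one-sided differences
    at the endpoints, as provided by np.gradient.
    """
    t = np.asarray(t, dtype=float)
    x = np.asarray(perceived_position, dtype=float)
    return np.gradient(x, t)

# Hypothetical example: position samples at 10 Hz for a participant who
# perceives uniform motion at 0.5 m/s along a 6 m straight path.
t = np.arange(0.0, 12.0, 0.1)
x = 0.5 * t
v = perceived_velocity(t, x)
```

For real tracking data, the pointing-derived position samples would be noisy, so in practice one would low-pass filter or smooth before differentiating; the sketch above assumes clean samples.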