
Record


Released

Poster

Visual and proprioceptive interactions in the reproduction of distance traveled

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84378

Campos,  J
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
No external resources are available
Full texts (publicly accessible)
No publicly accessible full texts are available
Supplementary material (publicly accessible)
No publicly accessible supplementary materials are available
Citation

Sun, H.-J., Campos, J., Ellenor, J., & Chan, G. (2004). Visual and proprioceptive interactions in the reproduction of distance traveled. Poster presented at Fourth Annual Meeting of the Vision Sciences Society (VSS 2004), Sarasota, FL, USA.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-D871-6
Abstract
To effectively process spatial information in a natural environment, humans typically use a combination of visual, proprioceptive, vestibular, and temporal cues. We assessed the relative contributions of visual and proprioceptive/efferent information to distance reproduction using virtual reality. Subjects (Ss) were instructed to move forward down a straight, virtual hallway by pedaling a stationary bike at a constant speed (with minimal vestibular input), while simultaneously receiving optic flow information through a head-mounted display. The virtual environment consisted of an empty, seemingly infinite hallway mapped with random surface texture. Each trial consisted of two distances: a stimulus distance, which varied in length from trial to trial, and a response distance. Ss were required to respond by reproducing the magnitude of the stimulus distance. In one condition, the relation between visual and non-visual information remained congruent. The results showed that Ss could reproduce distance with reasonable accuracy. In a second condition, an incongruence between visual and proprioceptive information was created in software by varying the optic flow gain (OFG) between the two distances within a trial. While the OFG of one of the distances (either stimulus or response) was held constant, the OFG of the other distance was varied among three values. Because of this OFG variation, exclusive reliance on vision and exclusive reliance on proprioception would lead to different responses. The results showed that when OFG was varied among three different magnitudes, three separate response functions were observed. The magnitude of separation between these response functions suggests that visual and proprioceptive cues contributed about equally to the final estimate. These results for distance reproduction are comparable to our psychophysical results for distance ratio estimation and distance discrimination reported elsewhere (VSS 2003; EBR, 2004).
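The logic of the OFG manipulation can be made concrete with a small numerical sketch. This is not the authors' analysis code; the function name, the specific gain values, and the simple weighted-average combination rule are illustrative assumptions. A purely visual strategy matches the *seen* distances (so the pedaled response scales with the gain ratio), a purely proprioceptive strategy matches the *pedaled* distances (so the gain is irrelevant), and an equal-weight combination falls midway between the two predictions:

```python
# Hypothetical sketch of the cue-conflict predictions (assumed names and
# combination rule, not the authors' method).

def predicted_response(d_stim, ofg_stim, ofg_resp, w_visual=0.5):
    """Predict the pedaled response distance for a given stimulus distance.

    Visual matching equates visual distances:
        ofg_resp * d_resp = ofg_stim * d_stim
        => d_resp = d_stim * ofg_stim / ofg_resp
    Proprioceptive matching equates pedaled distances:
        d_resp = d_stim
    w_visual linearly weights the visual prediction against the
    proprioceptive one (an assumed linear cue-combination rule).
    """
    visual_estimate = d_stim * ofg_stim / ofg_resp
    proprioceptive_estimate = d_stim
    return w_visual * visual_estimate + (1 - w_visual) * proprioceptive_estimate

# Example: stimulus phase at OFG 1.0, response phase at OFG 0.5.
print(predicted_response(10.0, 1.0, 0.5, w_visual=1.0))  # vision only -> 20.0
print(predicted_response(10.0, 1.0, 0.5, w_visual=0.0))  # proprioception only -> 10.0
print(predicted_response(10.0, 1.0, 0.5, w_visual=0.5))  # equal weighting -> 15.0
```

Under this reading, three OFG values yield three distinct predicted response functions, and the spacing between the observed functions (midway between the pure-vision and pure-proprioception predictions) is what indicates roughly equal cue weighting.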