
Record


Released

Conference Paper

Visual Vestibular Interactions for Self Motion Estimation

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83842

Butler,  JS
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84978

Smith,  ST
Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83808

Beykirch,  K
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff,  HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
There are no external resources available
Full texts (publicly accessible)
There are no publicly accessible full texts available
Supplementary Material (publicly accessible)
There is no publicly accessible supplementary material available
Citation

Butler, J., Smith, S., Beykirch, K., & Bülthoff, H. (2006). Visual Vestibular Interactions for Self Motion Estimation. In Driving Simulation Conference Europe (DSC Europe 2006) (pp. 1-10). Arcueil, France: Institut National de Recherche sur les Transports et Leur Sécurité.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-CFD7-D
Abstract
Accurate perception of self-motion through cluttered environments involves a coordinated set of sensorimotor processes that encode and compare information from visual, vestibular, proprioceptive, motor-corollary, and cognitive inputs. Our goal was to investigate the visual and vestibular cues to the direction of linear self-motion (heading direction). In the vestibular experiment, blindfolded participants were given two distinct forward linear translations, using a Stewart Platform, with identical acceleration profiles. One motion was a standard heading direction, while the test heading was randomly varied using the method of constant stimuli. The participants judged in which interval they moved further towards the right. In the visual-alone condition, participants were presented with two intervals of radial optic flow stimuli and judged which of the two intervals represented a pattern of optic flow consistent with more rightward self-motion. From participants' responses, we computed psychometric functions for both experiments, from which we calculated each participant's uncertainty in heading-direction estimates.
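The analysis described in the abstract — estimating heading uncertainty from two-interval forced-choice responses — is commonly done by fitting a cumulative-Gaussian psychometric function to the proportion of "more rightward" judgments as a function of the test heading offset. The sketch below illustrates this standard approach; the data values, parameter names, and use of SciPy's `curve_fit` are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical data: test heading offsets (deg) relative to the standard
# heading, and the proportion of trials judged "more rightward".
offsets = np.array([-8.0, -4.0, -2.0, 0.0, 2.0, 4.0, 8.0])
p_right = np.array([0.05, 0.15, 0.35, 0.50, 0.70, 0.90, 0.97])

def psychometric(x, mu, sigma):
    """Cumulative Gaussian: mu = point of subjective equality (bias),
    sigma = slope parameter, read as heading-direction uncertainty."""
    return norm.cdf(x, loc=mu, scale=sigma)

# Fit the two parameters to the observed choice proportions.
(mu, sigma), _ = curve_fit(psychometric, offsets, p_right, p0=[0.0, 3.0])
print(f"bias (PSE): {mu:.2f} deg, uncertainty (sigma): {sigma:.2f} deg")
```

A smaller fitted sigma means a steeper psychometric function and hence more precise heading estimates, which is the quantity the study compares across the vestibular-only and visual-only conditions.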