
Record


Released

Poster

How to simulate realistic forward accelerations on a 6dof motion platform

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83802

Berger, DR
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84199

Schulte-Pelkum, J
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External resources
There are no external resources available
Full texts (freely accessible)
There are no freely accessible full texts available
Supplementary material (freely accessible)
There is no freely accessible supplementary material available
Citation

Berger, D., Schulte-Pelkum, J., & Bülthoff, H. (2004). How to simulate realistic forward accelerations on a 6dof motion platform. Poster presented at Fourth Annual Meeting of the Vision Sciences Society (VSS 2004), Sarasota, FL, USA.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-D849-2
Abstract
How are visual and physical motion cues integrated in self-motion perception? We performed a psychophysical study to test how forward accelerations can be realistically simulated. Participants were seated on a 6 dof Stewart motion platform and viewed a computer-generated visual scene on a projection screen (54° × 40.5°). The visual scene consisted of a randomly structured ground plane and sky. Observers were told that eye height was always 2 m above ground. In each of the 180 trials, participants experienced a brief simulated forward acceleration (4 s ramp, followed by 2 s of constant acceleration) that was presented both as platform motion and as motion within the visual scene. After the acceleration, the screen went dark and the platform returned to zero in 6 s. Participants used a joystick to indicate the realism of the forward acceleration they experienced. They were explicitly told to give high ratings in trials in which they convincingly felt that they were moving forward in accordance with the visual stimulus, and low ratings in trials in which they noticed conflicts. In each trial, stimuli were chosen randomly within fixed intervals for all six varied parameters: visual forward acceleration (0–1.5 m/s^2), platform backwards pitch (0–15°), brief forward translations of the platform (0–0.5 m in 4 s), ratio of acceleration/deceleration durations for the translations (0.11–1.5), and up/down noise simulating ground roughness (0–7 cm) using low-pass filtered noise (cosine window, 0.3–1 s). Multiple hierarchical regression analyses performed for each subject revealed that only two parameters had a clear influence on the ratings: higher platform pitches and higher visual accelerations induced a better impression of being accelerated forward. Brief forward translations of the platform and bumps increased believability for some observers.
Interestingly, during sensory conflict between canal and visual/otolith cues, the latter dominated, as predicted by a Bayesian model that treats vision as the more reliable cue.
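The stimulus described in the abstract (a 4 s linear ramp up to a constant acceleration held for 2 s, plus vertical "ground roughness" noise made by low-pass filtering with a cosine window) can be sketched as below. This is a minimal illustration, not the authors' code: the function names, the Hann window as the "cosine window", the sampling rate, and the peak-amplitude scaling are all assumptions.

```python
import numpy as np

def acceleration_profile(a_max, ramp_s=4.0, hold_s=2.0, dt=0.01):
    """Forward-acceleration profile: linear ramp to a_max over ramp_s,
    then constant acceleration for hold_s (the 4 s + 2 s trial structure)."""
    t_ramp = np.arange(0.0, ramp_s, dt)
    ramp = a_max * t_ramp / ramp_s                 # 0 -> a_max
    hold = np.full(int(hold_s / dt), a_max)        # constant plateau
    return np.concatenate([ramp, hold])

def ground_roughness(amplitude_m, window_s, dur_s=6.0, dt=0.01, rng=None):
    """Up/down platform noise simulating ground roughness: white noise
    smoothed by convolution with a raised-cosine (Hann) window of length
    window_s (0.3-1 s in the study). The exact filter the authors used is
    not specified; this is an illustrative guess."""
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.standard_normal(int(dur_s / dt))
    win = np.hanning(max(int(window_s / dt), 1))
    win /= win.sum()                               # unit-gain smoothing kernel
    smoothed = np.convolve(noise, win, mode="same")
    # Scale so the peak excursion equals the requested amplitude (0-7 cm).
    return amplitude_m * smoothed / np.abs(smoothed).max()

# Example draw at the upper end of the parameter ranges:
profile = acceleration_profile(a_max=1.5)              # 1.5 m/s^2
bumps = ground_roughness(amplitude_m=0.07, window_s=0.5)
```

On each trial, parameter values would be drawn uniformly from the stated intervals; the per-subject analysis then regresses the realism ratings on these six predictors.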