
Released

Conference Paper

Towards Lean and Elegant Self-Motion Simulation in Virtual Reality

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84170

Riecke, BE
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84199

Schulte-Pelkum, J
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83846

Caniard, F
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Riecke, B., Schulte-Pelkum, J., Caniard, F., & Bülthoff, H. (2005). Towards Lean and Elegant Self-Motion Simulation in Virtual Reality. In IEEE Conference on Virtual Reality (VR '05) (pp. 131-138). Piscataway, NJ, USA: IEEE Computer Society.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-D5FB-8
Abstract
Despite recent technological advances, convincing self-motion simulation in virtual reality (VR) is difficult to achieve, and users often suffer from motion sickness and/or disorientation in the simulated world. Instead of trying to simulate self-motion with physical realism (as is often done in, e.g., driving or flight simulators), we propose in this paper a perceptually oriented approach to self-motion simulation. Following this paradigm, we performed a series of psychophysical experiments to determine essential visual, auditory, and vestibular/tactile parameters for an effective and perceptually convincing self-motion simulation. These studies are a first step towards our overall goal of achieving lean and elegant self-motion simulation in VR without physically moving the observer. In a series of psychophysical experiments on the self-motion illusion (circular vection), we found that (i) vection as well as presence in the simulated environment is increased by a consistent, naturalistic visual scene when compared to a sliced, inconsistent version of the identical scene, (ii) barely noticeable marks on the projection screen can increase vection as well as presence in an unobtrusive manner, (iii) physical vibrations of the observer's seat can enhance the vection illusion, and (iv) spatialized 3D audio cues embedded in the simulated environment increase the sensation of self-motion and presence. We conclude that providing consistent cues about self-motion to multiple sensory modalities can enhance vection, even if physical motion cues are absent. These results yield important implications for the design of lean and elegant self-motion simulators.