  Top-Down and Multi-Modal Influences on Self-Motion Perception in Virtual Reality

Riecke, B., Västfjäll, L., & Schulte-Pelkum, J. (2005). Top-Down and Multi-Modal Influences on Self-Motion Perception in Virtual Reality. In 11th International Conference on Human-Computer Interaction (HCI International 2005) (pp. 1-10). Mahwah, NJ, USA: Erlbaum.

 Creators:
Riecke, B. E.¹, Author
Västfjäll, L., Author
Schulte-Pelkum, J.¹, Author
Salvendy, G., Editor
Affiliations:
¹ Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society

Content

Free keywords: -
 Abstract: INTRODUCTION: Much of the work on self-motion perception and simulation has investigated the contribution of physical stimulus properties (so-called “bottom-up” factors). This paper provides an overview of recent experiments demonstrating that illusory self-motion perception can also benefit from “top-down” mechanisms, e.g., expectations, the interpretation and meaning associated with the stimulus, and the resulting spatial presence in the simulated environment. METHODS: Several VR setups were used to independently control different sensory modalities, thus allowing for well-controlled and reproducible psychophysical experiments. Illusory self-motion perception (vection) was induced using rotating visual or binaural auditory stimuli, presented via a curved projection screen (FOV: 54° × 40.5°) or headphones, respectively. Additional vibrations, subsonic sound, or cognitive frameworks were applied in some trials. Vection was quantified in terms of onset time, intensity, and convincingness ratings. RESULTS & DISCUSSION: Auditory vection studies showed that sound sources participants associated with stationary “acoustic landmarks” (e.g., a fountain) can significantly increase the effectiveness of the self-motion illusion, compared to sound sources that are typically associated with moving objects (such as the sound of footsteps). A similar top-down effect was observed in a visual vection experiment: showing a rotating naturalistic scene in VR improved vection considerably compared to scrambled versions of the same scene. Hence, the possibility of interpreting the stimulus as a stationary reference frame seems to enhance self-motion perception, which challenges the prevailing opinion that self-motion perception is primarily bottom-up driven. Even the mere knowledge that one might potentially be moved physically significantly increased the convincingness of the self-motion illusion, especially when additional vibrations supported the interpretation that one was really moving. CONCLUSIONS: Various top-down mechanisms were shown to increase the effectiveness of self-motion simulations in VR, even though they have received little attention in the literature up to now. Thus, we posit that a perceptually oriented approach that combines both bottom-up and top-down factors will ultimately enable us to optimize self-motion simulations in terms of both effectiveness and costs.

Details

Language(s): English
 Dates: 2005-07
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: ISBN: 0-8058-5807-5
URI: http://www.hci-international.org/
BibTex Citekey: 2765
 Degree: -

Event

Title: 11th International Conference on Human-Computer Interaction (HCI International 2005)
Place of Event: Las Vegas, NV, USA
Start-/End Date: -

Source 1

Title: 11th International Conference on Human-Computer Interaction (HCI International 2005)
Source Genre: Proceedings
 Creator(s):
Affiliations:
Publ. Info: Mahwah, NJ, USA: Erlbaum
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 1 - 10
Identifier: -