
Released

Journal Article

Moving sounds enhance the visually-induced self-motion illusion (circular vection) in virtual reality

MPS-Authors

Riecke, BE
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Schulte-Pelkum, J
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Riecke, B., Väljamäe, A., & Schulte-Pelkum, J. (2009). Moving sounds enhance the visually-induced self-motion illusion (circular vection) in virtual reality. ACM Transactions on Applied Perception, 6(2): 7, pp. 1-27. doi:10.1145/1498700.1498701.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-C5CB-A
Abstract
While rotating visual and auditory stimuli have long been known to elicit self-motion illusions ("circular vection"), audiovisual interactions have hardly been investigated. Here, two experiments investigated whether visually induced circular vection can be enhanced by concurrently rotating auditory cues that match visual landmarks (e.g., a fountain sound). Participants sat behind a curved projection screen displaying rotating panoramic renderings of a market place. Apart from a no-sound condition, headphone-based auditory stimuli consisted of mono sound, ambient sound, or low-/high-spatial-resolution auralizations using generic head-related transfer functions (HRTFs). While merely adding nonrotating (mono or ambient) sound showed no effects, moving sound stimuli facilitated both vection and presence in the virtual environment. This spatialization benefit was maximal for a medium (20° × 15°) field of view (FOV), reduced for a larger (54° × 45°) FOV, and unexpectedly absent for the smallest (10° × 7.5°) FOV. Increasing auralization spatial fidelity (from low, comparable to five-channel home theatre systems, to high, 5° resolution) provided no further benefit, suggesting a ceiling effect. In conclusion, both self-motion perception and presence can benefit from adding moving auditory stimuli. This has important implications both for multimodal cue integration theories and for the applied challenge of building affordable yet effective motion simulators.
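
For readers unfamiliar with auralization, the following minimal Python/NumPy sketch (not from the paper) illustrates the basic idea of rotating a sound source around a listener over headphones. It approximates spatial hearing with simple interaural time and level differences rather than the generic HRTFs used in the study, and every parameter value (sample rate, rotation speed, head radius, ILD depth) as well as the function name is an illustrative assumption.

import numpy as np

SR = 44100  # sample rate in Hz (assumed; not specified in the abstract)

def rotate_source(mono, sr=SR, rps=0.1, head_radius=0.0875, c=343.0):
    # Pan a mono signal around the listener using simplified interaural
    # time/level differences (a crude stand-in for HRTF auralization).
    n = len(mono)
    t = np.arange(n) / sr
    az = 2.0 * np.pi * rps * t              # source azimuth over time (radians)
    itd = (head_radius / c) * np.sin(az)    # interaural time difference (seconds)
    idx = np.arange(n, dtype=float)
    # Fractional delays via linear interpolation: the ear nearer the
    # source receives the signal slightly earlier than the farther ear.
    left = np.interp(idx - 0.5 * itd * sr, idx, mono)
    right = np.interp(idx + 0.5 * itd * sr, idx, mono)
    # Crude interaural level difference: attenuate the far ear by up to
    # ~3 dB (illustrative value, not taken from the paper).
    far_gain = 10.0 ** (-3.0 * np.abs(np.sin(az)) / 20.0)
    left *= np.where(np.sin(az) > 0, far_gain, 1.0)    # source on the right
    right *= np.where(np.sin(az) < 0, far_gain, 1.0)   # source on the left
    return np.stack([left, right], axis=1)             # (n, 2) stereo array

# Usage: rotate a 5 s noise burst once every 10 s around the listener.
stereo = rotate_source(np.random.randn(5 * SR), rps=0.1)

In the experiments themselves, spatialization used generic HRTFs at two levels of spatial fidelity; the ceiling effect reported in the abstract suggests that even relatively coarse spatial rendering, comparable to the low-resolution condition, suffices to enhance vection and presence.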