  Spatial updating in real and virtual environments: contribution and interaction of visual and vestibular cues

Riecke, B., & Bülthoff, H. (2004). Spatial updating in real and virtual environments: contribution and interaction of visual and vestibular cues. In 1st Symposium on Applied Perception in Graphics and Visualization (APGV 2004) (pp. 9-17). New York, NY, USA: ACM Press.

Creators

show
hide
 Creators:
Riecke, BE1, Author           
Bülthoff, HH1, Author           
Interrante, V., Editor
McNamara, A., Editor
Bülthoff, H.H., Editor
Rushmeier, H.E., Editor
Affiliations:
1Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797              

Content

Free keywords: -
 Abstract: INTRODUCTION: When we move through the environment, the self-to-surround relations constantly change. Nevertheless, we perceive the world as stable. A process that is critical to this perceived stability is "spatial updating", which automatically updates our egocentric mental spatial representation of the surround according to our current self-motion. According to the prevailing opinion, vestibular and proprioceptive cues are absolutely required for spatial updating. Here, we challenge this notion by varying visual and vestibular contributions independently in a high-fidelity VR setup.
METHODS: In a learning phase, participants learned the positions of twelve targets attached to the walls of a 5x5 m room. In the testing phase, participants saw either the real room or a photo-realistic copy presented via a head-mounted display (HMD). Vestibular cues were applied using a motion platform. Participants' task was to point "as accurately and quickly as possible" to four targets announced consecutively via headphones after rotations around the vertical axis into different orientations.
RESULTS: Automatic spatial updating was observed whenever useful visual information was available: participants had no problem mentally updating their orientation in space, irrespective of turning angle. Performance, quantified as response time, configuration error, and pointing error, was best in the real-world condition. However, when the field of view was limited by cardboard blinders to match that of the HMD (40° × 30°), performance decreased and was comparable to the HMD condition. Presenting turning information only visually (through the HMD) hardly altered these results. In both the real-world and HMD conditions, spatial updating was obligatory in the sense that it was significantly more difficult to ignore ego-turns (i.e., to "point as if not having turned") than to update them as usual.
CONCLUSION: The rapid pointing paradigm proved to be a useful tool for quantifying spatial updating. We conclude that, at least for the limited turning angles used (<60°), the Virtual Reality simulation of ego-rotation was as effective and convincing (i.e., as hard to ignore) as its real-world counterpart, even when only visual information was presented. This has relevant implications for the design of motion simulators, e.g., for architecture walkthroughs.
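
The abstract quantifies pointing performance as response time, configuration error, and pointing error. As an illustrative aid only, the following Python sketch shows one common way such pointing measures can be computed from recorded pointing directions; the function names and the exact definitions (mean absolute error for pointing error, within-trial variability of signed errors for configuration error) are assumptions for illustration, not taken from the paper.

import numpy as np

def signed_angle_error(pointed_deg, correct_deg):
    # Signed angular error in degrees, wrapped to [-180, 180).
    return (np.asarray(pointed_deg, dtype=float)
            - np.asarray(correct_deg, dtype=float) + 180.0) % 360.0 - 180.0

def pointing_measures(pointed_deg, correct_deg, response_times_s):
    # Summarize one trial of rapid pointing to several announced targets.
    # pointed_deg / correct_deg: indicated and true target azimuths (degrees).
    # response_times_s: latency from target announcement to pointing response (s).
    errors = signed_angle_error(pointed_deg, correct_deg)
    return {
        "response_time": float(np.mean(response_times_s)),     # mean latency
        "pointing_error": float(np.mean(np.abs(errors))),      # mean absolute angular error
        "configuration_error": float(np.std(errors, ddof=1)),  # spread of signed errors across targets
    }

# Hypothetical usage: four targets pointed at after a turn.
# pointing_measures([85, 172, 268, 8], [90, 180, 270, 0], [1.2, 1.1, 1.4, 1.0])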

Details

Language(s):
 Dates: 2004-08
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: -
 Identifiers: ISBN: 1-58113-914-4
URI: http://portal.acm.org/citation.cfm?id=1012553
DOI: 10.1145/1012551.1012553
BibTex Citekey: 2764
 Degree: -

Event

Title: 1st Symposium on Applied Perception in Graphics and Visualization (APGV 2004)
Place of Event: Los Angeles, California, USA
Start-/End Date: -

Source 1

Title: 1st Symposium on Applied Perception in Graphics and Visualization (APGV 2004)
Source Genre: Proceedings
 Creator(s):
Affiliations:
Publ. Info: New York, NY, USA : ACM Press
Pages: -
Volume / Issue: -
Sequence Number: -
Start / End Page: 9 - 17
Identifier: -