
Item details

  Spatial updating in real and virtual environments: contribution and interaction of visual and vestibular cues

Riecke, B., & Bülthoff, H. (2004). Spatial updating in real and virtual environments: contribution and interaction of visual and vestibular cues. In V. Interrante, A. McNamara, H. Bülthoff, & H. Rushmeier (Eds.), APGV '04: 1st Symposium on Applied Perception in Graphics and Visualization (pp. 9-17). New York, NY, USA: ACM Press.


Basic information

Genre: Conference paper


Creators

Creators:
Riecke, BE1, 2, Author
Bülthoff, HH1, 2, Author
Affiliations:
1Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497797
2Max Planck Institute for Biological Cybernetics, Max Planck Society, Spemannstrasse 38, 72076 Tübingen, DE, ou_1497794

Content

Keywords: -
Abstract: INTRODUCTION: When we move through the environment, the self-to-surround relations constantly change. Nevertheless, we perceive the world as stable. A process that is critical to this perceived stability is "spatial updating", which automatically updates our egocentric mental spatial representation of the surround according to our current self-motion. According to the prevailing opinion, vestibular and proprioceptive cues are absolutely required for spatial updating. Here, we challenge this notion by varying visual and vestibular contributions independently in a high-fidelity VR setup. METHODS: In a learning phase, participants learned the positions of twelve targets attached to the walls of a 5x5m room. In the testing phase, participants saw either the real room or a photo-realistic copy presented via a head-mounted display (HMD). Vestibular cues were applied using a motion platform. Participants' task was to point "as accurately and quickly as possible" to four targets announced consecutively via headphones after rotations around the vertical axis into different positions. RESULTS: Automatic spatial updating was observed whenever useful visual information was available: Participants had no problem mentally updating their orientation in space, irrespective of turning angle. Performance, quantified as response time, configuration error, and pointing error, was best in the real world condition. However, when the field of view was limited via cardboard blinders to match that of the HMD (40 × 30°), performance decreased and was comparable to the HMD condition. Presenting turning information only visually (through the HMD) hardly altered those results. In both the real world and HMD conditions, spatial updating was obligatory in the sense that it was significantly more difficult to ignore ego-turns (i.e., "point as if not having turned") than to update them as usual.
CONCLUSION: The rapid pointing paradigm proved to be a useful tool for quantifying spatial updating. We conclude that, at least for the limited turning angles used (<60°), the Virtual Reality simulation of ego-rotation was as effective and convincing (i.e., hard to ignore) as its real world counterpart, even when only visual information was presented. This has relevant implications for the design of motion simulators for, e.g., architecture walkthroughs.

Details

Language:
Date: 2004-08
Publication status: Published
Pages: -
Publishing info: -
Table of contents: -
Review method: -
Identifiers (DOI, ISBN, etc.): DOI: 10.1145/1012551.1012553
BibTex Citekey: 2764
Degree: -

Event

Title: 1st Symposium on Applied Perception in Graphics and Visualization (APGV 2004)
Place of event: Los Angeles, California, USA
Start-/End date: 2004-08-07 - 2004-08-08


Source 1

Title: APGV '04: 1st Symposium on Applied Perception in Graphics and Visualization
Source genre: Proceedings
Creators/Editors:
Interrante, V, Editor
McNamara, A, Editor
Bülthoff, HH1, Editor
Rushmeier, H, Editor
Affiliations:
1 Max Planck Institute for Biological Cybernetics, Max Planck Society, ou_1497794
Publisher, place: New York, NY, USA : ACM Press
Pages: -
Volume / Issue: -
Sequence number: -
Start / End page: 9 - 17
Identifier (ISBN, ISSN, DOI, etc.): ISBN: 1-58113-914-4