
Item Details


Public

Conference Paper

Development of a virtual laboratory for the study of complex human behavior

MPS-Authors

von der Heyde, M.
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (restricted access)
There are currently no full texts shared for your IP range.
Fulltext (public)
There is no public fulltext available
Supplementary Material (public)
There is no public supplementary material available
Citation

Pelz, J., Hayhoe, M., Ballard, D., Shrivastava, A., Bayliss, J., & von der Heyde, M. (1999). Development of a virtual laboratory for the study of complex human behavior. In J. Merritt, M. Bolas, & S. Fisher (Eds.), Stereoscopic Displays and Virtual Reality Systems VI (pp. 416-426). Bellingham, WA, USA: SPIE.


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-E757-9
Abstract
The study of human perception has evolved from examining simple tasks executed in reduced laboratory conditions to the examination of complex, real-world behaviors. Virtual environments represent the next evolutionary step by allowing full stimulus control and repeatability for human subjects, and by providing a testbed for evaluating models of human behavior. Visual resolution varies dramatically across the visual field, dropping orders of magnitude from central to peripheral vision. Humans move their gaze about a scene several times every second, projecting task-critical areas of the scene onto the central retina. These eye movements are made even when the immediate task does not require high spatial resolution. Such "attentionally-driven" eye movements are important because they provide an externally observable marker of the way subjects deploy their attention while performing complex, real-world tasks. Tracking subjects' eye movements while they perform complex tasks in virtual environments thus provides a window into perception. In addition to the ability to track subjects' eyes in virtual environments, concurrent EEG recording provides a further indicator of cognitive state. We have developed a virtual reality laboratory in which head-mounted displays (HMDs) are instrumented with infrared video-based eye trackers to monitor subjects' eye movements while they perform a range of complex tasks, such as driving and manual tasks requiring careful eye-hand coordination. A go-kart mounted on a 6-DOF motion platform provides kinesthetic feedback to subjects as they drive through a virtual town; a dual-haptic interface consisting of two SensAble Phantom extended-range devices allows free motion and realistic force feedback within a 1 m³ volume.