Hidden:
Keywords:
-
Abstract:
When we move through the environment, we perceive our position and motion in space with several senses, among them vision and the body senses of self-motion (vestibular and somatosensory). The human brain combines information from these senses to generate a unified and robust percept of self-motion.
We investigated this integration process in human observers using psychophysical methods. Experiments were performed on a hexapod motion platform with a projection screen, which allows realistic movements to be presented while visual cues and body cues to self-motion are manipulated independently.
I will present a series of experiments in which we studied the multimodal perception of whole-body rotations around an earth-vertical axis (yaw rotations). In particular, we tested whether the integration of visual and body cues to self-motion follows the statistically optimal maximum-likelihood integration principle. We also investigated how the influence of visual and body cues on the perception of yaw rotations depends on focusing attention on either cue, and on becoming aware of conflicts between the two modalities.
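The maximum-likelihood integration principle mentioned above predicts that the combined percept is an inverse-variance-weighted average of the single-cue estimates, with a variance lower than that of either cue alone. A minimal sketch of this prediction follows; all cue values and noise levels are hypothetical illustration numbers, not data from the experiments described here.

```python
def ml_integrate(est_visual, sigma_visual, est_body, sigma_body):
    """Combine two noisy cue estimates by inverse-variance (reliability) weighting."""
    # Weight of each cue is proportional to its reliability (inverse variance).
    w_visual = (1 / sigma_visual**2) / (1 / sigma_visual**2 + 1 / sigma_body**2)
    w_body = 1 - w_visual
    combined = w_visual * est_visual + w_body * est_body
    # Predicted variance of the combined estimate; always below either single-cue variance.
    var_combined = (sigma_visual**2 * sigma_body**2) / (sigma_visual**2 + sigma_body**2)
    return combined, var_combined

# Example: a yaw rotation signaled as 32 deg visually (sigma = 2 deg)
# and 28 deg by the body senses (sigma = 4 deg).
est, var = ml_integrate(est_visual=32.0, sigma_visual=2.0, est_body=28.0, sigma_body=4.0)
print(est, var)  # the combined estimate lies closer to the more reliable (visual) cue
```

Testing whether observers match this benchmark typically means comparing their measured bimodal discrimination thresholds against the variance predicted from the unimodal thresholds.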