
Item


Released

Talk

Multisensory integration for the perception of self-motion

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83802

Berger,  DR
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

No locators, public fulltexts, or supplementary material are available for this item.
Citation

Berger, D. (2005). Multisensory integration for the perception of self-motion. Talk presented at Institutskolloquium, Max-Planck-Institut für medizinische Forschung. Heidelberg, Germany.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-D501-7
Abstract
When we move in the environment, we perceive our position and motion in space with several senses. Among these are the visual sense and the body senses of self-motion (vestibular and somatosensory). The human brain combines information from the different senses to generate a unified and robust percept of self-motion. We investigated this integration process in human observers using psychophysical methods. Experiments were performed on a hexapod platform with a projection screen, which allows the presentation of realistic movements during which visual cues and body cues for self-motion can be manipulated independently. I will present a series of experiments in which we studied the multimodal perception of whole-body rotations around an earth-vertical axis (yaw rotations). In particular, we tested whether the integration of visual and body cues of self-motion follows the mathematically optimal maximum-likelihood integration principle. We also investigated how the influence of visual and body cues on the perception of yaw rotations depends on focusing attention on either cue, and on becoming aware of conflicts between the two modalities.
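The maximum-likelihood integration principle referred to in the abstract can be sketched as follows: for two independent Gaussian cues, the optimal combined estimate weights each cue by its reliability (inverse variance), and the combined variance is smaller than either single-cue variance. The function name and the numeric values below are illustrative, not taken from the talk:

```python
def ml_integrate(mu_visual, var_visual, mu_body, var_body):
    """Optimally combine two independent Gaussian cue estimates.

    Each cue is weighted by its inverse variance (its reliability);
    the combined variance is the harmonic combination of the two,
    so it is always smaller than either single-cue variance.
    """
    w_visual = (1.0 / var_visual) / (1.0 / var_visual + 1.0 / var_body)
    w_body = 1.0 - w_visual
    mu_combined = w_visual * mu_visual + w_body * mu_body
    var_combined = (var_visual * var_body) / (var_visual + var_body)
    return mu_combined, var_combined

# Hypothetical example: the visual cue signals a 30-degree yaw rotation
# (variance 4) and the body cue signals 34 degrees (variance 12); the
# more reliable visual cue dominates the combined percept.
mu, var = ml_integrate(30.0, 4.0, 34.0, 12.0)  # mu = 31.0, var = 3.0
```

A violation of these predicted weights or of the predicted variance reduction is the kind of deviation from optimality that the psychophysical experiments described above are designed to detect.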