

Released

Book Chapter

Multimodal Integration during Self-Motion in Virtual Reality

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84378

Campos, JL
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Fulltext (public)
There are no public fulltexts available
Supplementary Material (public)
There is no public supplementary material available
Citation

Campos, J., & Bülthoff, H. (2012). Multimodal Integration during Self-Motion in Virtual Reality. In M. Murray, & M. Wallace (Eds.), The neural bases of multisensory processes (pp. 603-628). Boca Raton, FL, USA: CRC Press.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-B892-D
Abstract
This chapter begins with a brief description of some of the different types of simulation tools and techniques that are used to study self-motion perception, along with some of the advantages and disadvantages of the different interfaces. Subsequently, current empirical work investigating multisensory self-motion perception using these technologies is summarized, focusing mainly on visual, proprioceptive, and vestibular influences during full-body self-motion through space. Finally, the implications of this research for several applied areas are briefly described.