
Record


Released

Book Chapter

Multimodal Integration during Self-Motion in Virtual Reality

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84378

Campos, JL
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
Full texts (freely accessible)
No freely accessible full texts are available
Supplementary material (freely accessible)
No freely accessible supplementary material is available
Citation

Campos, J., & Bülthoff, H. (2012). Multimodal Integration during Self-Motion in Virtual Reality. In M. Murray, & M. Wallace (Eds.), The neural bases of multisensory processes (pp. 603-628). Boca Raton, FL, USA: CRC Press.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-B892-D
Abstract
This chapter begins with a brief description of some of the simulation tools and techniques used to study self-motion perception, along with the advantages and disadvantages of the different interfaces. It then summarizes current empirical work investigating multisensory self-motion perception with these technologies, focusing mainly on visual, proprioceptive, and vestibular influences during full-body self-motion through space. Finally, it briefly describes the implications of this research for several applied areas.