
Released

Book Chapter

Multimodal Integration during Self-Motion in Virtual Reality

MPG Authors

Campos, JL
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Campos, J., & Bülthoff, H. (2012). Multimodal Integration during Self-Motion in Virtual Reality. In M. Murray, & M. Wallace (Eds.), The neural bases of multisensory processes (pp. 603-628). Boca Raton, FL, USA: CRC Press.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-B892-D
Abstract
This chapter begins with a brief description of some of the different types of simulation tools and techniques being used to study self-motion perception, along with some of the advantages and disadvantages of the different interfaces. Subsequently, some of the current empirical work investigating multisensory self-motion perception using these technologies is summarized, focusing mainly on visual, proprioceptive, and vestibular influences during full-body self-motion through space. Finally, the implications of this research for several applied areas are briefly described.