
Record


Released

Conference Paper

Effect of the size of the field of view on the perceived amplitude of rotations of the visual scene

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84116

Ogier,  M
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84088

Mohler,  BJ
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff,  HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83831

Bresciani,  J-P
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
There are no external resources available
Full texts (freely accessible)
There are no freely accessible full texts available
Supplementary Material (freely accessible)
There is no freely accessible supplementary material available
Citation

Ogier, M., Mohler, B., Bülthoff, H., & Bresciani, J.-P. (2008). Effect of the size of the field of view on the perceived amplitude of rotations of the visual scene. In 14th Eurographics Symposium on Virtual Environments (EGVE 2008) (pp. 97-102). Aire-la-Ville, Switzerland: Eurographics Association.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-C97B-9
Abstract
Efficient navigation requires a good representation of body position/orientation in the environment and an accurate updating of this representation when the body-environment relationship changes. We tested here whether the visual flow alone - i.e., no landmark - can be used to update this representation when the visual scene is rotated, and whether having a limited horizontal field of view (30 or 60 degrees), as is the case in most virtual reality applications, degrades performance compared to a full field of view. Our results show that (i) the visual flow alone does not allow for accurately estimating the amplitude of rotations of the visual scene, notably giving rise to a systematic underestimation of rotations larger than 30 degrees, and (ii) having more than 30 degrees of horizontal field of view does not substantially improve performance. Taken together, these results suggest that a 30 degree field of view is enough to (under)estimate the amplitude of visual rotations when only visual flow information is available, and that landmarks should probably be provided if the amplitude of the rotations has to be accurately perceived.