
Record

Released

Journal Article

Egocentric distance perception in large screen immersive displays

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83780

Piryankova, IV
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83877

de la Rosa, S
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Kloos, U; Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84088

Mohler, BJ
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
There are no external resources available
Full texts (publicly accessible)
There are no publicly accessible full texts available
Supplementary material (publicly accessible)
There is no publicly accessible supplementary material available
Citation

Piryankova, I., de la Rosa, S., Kloos, U., Bülthoff, H., & Mohler, B. (2013). Egocentric distance perception in large screen immersive displays. Displays, Epub ahead. doi:10.1016/j.displa.2013.01.001.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-B4F8-9
Abstract
Many scientists have demonstrated that, compared to the real world, egocentric distances in head-mounted display virtual environments are underestimated. However, distance perception in large screen immersive displays has received less attention. We investigate egocentric distance perception in a virtual office room projected in three large screen immersive displays: a semi-spherical display, the cabin of the Max Planck Institute CyberMotion Simulator, and a flat display. The goal of our research is to systematically investigate distance perception in large screen immersive displays with commonly used technical specifications. We specifically investigate the role of distance to the target, stereoscopic projection, and motion parallax on distance perception. We use verbal reports and blind walking as response measures for the real world experiment. Due to the limited space in the three large screen immersive displays, we use only verbal reports as the response measure for the experiments in the virtual environment. Our results show an overall underestimation of distance in the large screen immersive displays, while verbal estimates of distance are nearly veridical in the real world. We find that even when providing motion parallax and stereoscopic depth cues to the observer in the flat large screen immersive display, participants estimate the distances to be smaller than intended. Although stereo cues in the flat large screen immersive display do increase distance estimates for the nearest distance, the impact of the stereoscopic depth cues is not enough to result in veridical distance perception. Further, we demonstrate that the distance to the target significantly influences the percent error of verbal estimates in both the real and virtual world. The impact of the distance to the target on the distance judgments is the same in the real world and in two of the large screen displays used, namely the MPI CyberMotion Simulator cabin and the flat display. However, in the semi-spherical display we observe a significantly different influence of distance to the target on verbal estimates of egocentric distance. Finally, we discuss potential reasons for our results. Based on these findings, we give general suggestions that could serve as methods for improving large screen immersive displays in terms of the accuracy of depth perception, and suggest methods to compensate for the underestimation of verbal distance estimates in such displays.