
Record


Released

Poster

Integrated real-time eye, head, and body tracking in front of a wall-sized display

MPG Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83847

Canto-Pereira,  LH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84248

Tanner,  TG
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83966

Herholz,  S
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83913

Fleming,  RW
Research Group Computational Vision and Neuroscience, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff,  HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
No external resources are available
Full texts (publicly accessible)
No publicly accessible full texts are available
Supplementary Material (publicly accessible)
No publicly accessible supplementary materials are available
Citation

Canto-Pereira, L., Tanner, T., Herholz, S., Fleming, R., & Bülthoff, H. (2007). Integrated real-time eye, head, and body tracking in front of a wall-sized display. Poster presented at 30th European Conference on Visual Perception, Arezzo, Italy.


Citation link: http://hdl.handle.net/11858/00-001M-0000-0013-CC61-7
Abstract
Most devices for eye- or gaze-tracking require constrained head and body movements to achieve high temporal and spatial accuracy, or allow only a limited field of view or range of observer positions (e.g. to keep the eyes visible for video tracking). This may lead to unnatural viewing conditions, possibly systematically altering gaze patterns in experiments. Furthermore, head and eye movements often cannot be analyzed independently. We present a novel system integrating high-speed eye- and head-tracking, thus enabling observers to move freely in front of large (wall-sized) displays. The system is modular, making it easy to track additional markers for body parts or pointing devices, if desired. Tracking is performed by an Eyelink II (500 Hz) and three Vicon MX motion-capture cameras (180 Hz, error < 1 mm), respectively. Gaze direction (based on independent eye and head direction) is calculated in real time (error < 0.8°, latency < 6 ms), thus allowing gaze-contingent displays. We present possible applications of the system in psychophysics and data visualization.
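The core computation the abstract describes, combining head pose from motion capture with eye-in-head direction from the eye tracker to obtain a gaze point on the wall, amounts to a ray-plane intersection. The sketch below illustrates this geometry only; the function name, coordinate conventions, and wall placement are illustrative assumptions, not details from the poster:

```python
import numpy as np

def gaze_on_wall(head_pos, head_rot, eye_dir_head, wall_z=0.0):
    """Intersect the world-space gaze ray with a wall plane at z = wall_z.

    head_pos     : (3,) head position in world coordinates (from motion capture)
    head_rot     : (3, 3) rotation matrix mapping head frame -> world frame
    eye_dir_head : (3,) gaze direction in head coordinates (from the eye tracker)
    Returns the (3,) gaze point on the wall, or None if the ray misses it.
    """
    d = head_rot @ eye_dir_head        # gaze direction in world coordinates
    if abs(d[2]) < 1e-9:               # ray parallel to the wall plane
        return None
    t = (wall_z - head_pos[2]) / d[2]  # ray parameter at the wall plane
    if t <= 0:                         # wall is behind the observer
        return None
    return head_pos + t * d            # gaze point in world coordinates

# Observer standing 2 m in front of the wall, eyes 1.7 m high,
# head upright and looking straight at the wall (toward -z).
p = gaze_on_wall(np.array([0.0, 1.7, 2.0]),
                 np.eye(3),
                 np.array([0.0, 0.0, -1.0]))
# p is the point (0, 1.7, 0) on the wall, directly ahead of the observer
```

In a real-time loop, the same function would be called on every synchronized sample pair from the two tracking streams to drive a gaze-contingent display.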