
Released

Poster

Real-time gaze-tracking for freely-moving observers

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83966

Herholz,  S
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84248

Tanner,  TG
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83847

Canto-Pereira,  LH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83913

Fleming,  RW
Research Group Computational Vision and Neuroscience, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff,  HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Herholz, S., Tanner, T., Canto-Pereira, L., Fleming, R., & Bülthoff, H. (2007). Real-time gaze-tracking for freely-moving observers. Poster presented at 14th European Conference on Eye Movements (ECEM 2007), Potsdam, Germany.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-CC6D-F
Abstract
We have developed a real-time mobile gaze-tracker by combining a high-speed eye tracker (Eyelink II, 500 Hz) with head and body tracking (VICON, 200 Hz). The position of the observer's gaze on the screen can be measured continuously, with an accuracy of <1.0 deg, as they walk around and make head movements in a natural way. The system is modular: individual components can easily be replaced (e.g., different eye- and head-tracking systems can be used). It was developed primarily for interaction in front of wall-sized displays. For validation, the system has been tested with displays of different sizes (from 2.2 × 1.8 m to 5.2 × 2.5 m) and in several applications, including psychophysical experiments and a multiresolution gaze-contingent display.
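The core computation such a system must perform is to express the eye tracker's gaze direction in room coordinates using the tracked head pose, then intersect that gaze ray with the display plane. The abstract does not describe the authors' actual implementation, so the following is only a minimal geometric sketch under assumed conventions (rotation-matrix head pose, planar screen given by a point and unit normal); all names and parameters are hypothetical.

```python
import numpy as np

def gaze_on_screen(head_pos, head_rot, gaze_dir_head,
                   screen_origin, screen_normal):
    """Hypothetical sketch: intersect the observer's gaze ray with a planar screen.

    head_pos      : 3-vector, head position in room coordinates (body tracker)
    head_rot      : 3x3 rotation matrix, head orientation in room coordinates
    gaze_dir_head : 3-vector, gaze direction in head coordinates (eye tracker)
    screen_origin : any point on the screen plane
    screen_normal : unit normal of the screen plane
    Returns the 3D gaze point on the screen, or None if the gaze ray is
    parallel to the screen or the screen lies behind the observer.
    """
    # Rotate the eye-in-head gaze direction into room coordinates.
    d = head_rot @ np.asarray(gaze_dir_head, dtype=float)
    denom = d @ screen_normal
    if abs(denom) < 1e-9:
        return None  # gaze ray parallel to the screen plane
    # Ray-plane intersection: head_pos + t * d lies on the plane.
    t = ((screen_origin - head_pos) @ screen_normal) / denom
    if t < 0:
        return None  # intersection behind the observer
    return head_pos + t * d
```

For example, an observer standing 2 m in front of a screen in the z = 0 plane and looking straight ahead would produce a gaze point directly opposite the head position. A real system would additionally need per-session calibration and conversion of the 3D point into screen pixel coordinates.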