

Released

Poster

Eye-movement planning during flight maneuvers

MPS-Authors

Chuang, L
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83861

Nieuwenhuizen, F
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society
http://pubman.mpdl.mpg.de/cone/persons/resource/persons84111

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Citation

Chuang, L., Nieuwenhuizen, F., & Bülthoff, H. (2012). Eye-movement planning during flight maneuvers. Poster presented at 35th European Conference on Visual Perception, Alghero, Italy.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-B644-F
Abstract
How are eye-movements planned to access relevant visual information during flight control? From the cockpit perspective, there are two classes of visual information that are relevant for flight control. First, the changing visuals of the external world provide direct perceptual feedback on how the pilot's command of the control stick is affecting the aircraft's current position, orientation and velocity. Second, flight instruments provide abstracted and specific values (on factors such as the aircraft's compass bearing and vertical speed) that have to be continuously monitored in order for the global objective of certain maneuvers (e.g., turns) to be achieved. Trained pilots have to coordinate their eye-movements across this structured visual workspace (i.e., outside view and instruments) to access timely and task-relevant information. The current work focuses on providing descriptions of these planned eye-movements. Pilots' eye-movements were recorded in a high-fidelity flight simulator (100° field-of-view) whilst they performed specific flight maneuvers. Fixation durations and transitions between the individual instruments and aspects of the external environment are represented as network graphs. This allowed us to formally describe the sources of information that were relied on across the different tasks and to compare actual performance to expert predictions.
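The abstract's representation of fixations and transitions as a network graph can be illustrated with a minimal sketch: nodes are areas of interest (AOIs, e.g. outside view and individual instruments) weighted by total dwell time, and directed edges count transitions between consecutive fixations. The AOI names, durations, and helper function below are illustrative assumptions, not the authors' data or analysis code.

```python
from collections import Counter

# Hypothetical fixation sequence: (AOI name, fixation duration in ms).
# These values are invented for illustration only.
fixations = [
    ("outside_view", 420),
    ("compass", 180),
    ("outside_view", 650),
    ("vertical_speed", 210),
    ("compass", 150),
    ("outside_view", 500),
]

def transition_graph(fixations):
    """Return (node weights, edge weights) for a fixation-transition network.

    Node weight = total dwell time per AOI; edge weight = number of
    transitions from one AOI to the next across consecutive fixations.
    """
    dwell = Counter()
    edges = Counter()
    for aoi, dur in fixations:
        dwell[aoi] += dur
    for (a, _), (b, _) in zip(fixations, fixations[1:]):
        if a != b:  # ignore refixations within the same AOI
            edges[(a, b)] += 1
    return dwell, edges

dwell, edges = transition_graph(fixations)
print(dwell)   # total dwell time per AOI
print(edges)   # weighted directed transitions between AOIs
```

Such node and edge weights can then be compared across maneuvers (or against expert predictions) to characterize which information sources a pilot relies on for a given task.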