
Record


Released

Poster

Eye-movement planning during flight maneuvers

MPG Authors
/persons/resource/persons83861

Chuang, L.
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons84111

Nieuwenhuizen, F.
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons83839

Bülthoff, H. H.
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
There are no freely accessible full texts available in PuRe
Supplementary material (freely accessible)
There are no freely accessible supplementary materials available
Citation

Chuang, L., Nieuwenhuizen, F., & Bülthoff, H. (2012). Eye-movement planning during flight maneuvers. Poster presented at 35th European Conference on Visual Perception, Alghero, Italy.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-B644-F
Abstract
How are eye-movements planned to access relevant visual information during flight control? From the cockpit perspective, two classes of visual information are relevant for flight control. First, the changing visuals of the external world provide direct perceptual feedback on how the pilot's command of the control stick affects the aircraft's current position, orientation, and velocity. Second, flight instruments provide abstracted and specific values, on factors such as the aircraft's compass bearing and vertical speed, that have to be continuously monitored in order for the global objective of certain maneuvers (e.g., turns) to be achieved. Trained pilots have to coordinate their eye-movements across this structured visual workspace (i.e., outside view and instruments) to access timely and task-relevant information. The current work focuses on providing descriptions of these planned eye-movements. Eye-movements of pilots were recorded in a high-fidelity flight simulator (100° field-of-view) whilst they performed specific flight maneuvers. Fixation durations and transitions between the individual instruments and aspects of the external environment are represented as network graphs. This allowed us to formally describe the sources of information that were relied on across the different tasks and to compare actual performance to expert predictions.
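To illustrate the kind of analysis the abstract describes, the sketch below shows one plausible way to turn a recorded scan path into a network-graph representation: nodes are areas of interest (outside view and individual instruments), node weights are summed fixation durations, and edge weights count direct gaze transitions. The input format, AOI names, and durations are hypothetical assumptions for illustration, not the authors' actual pipeline or data.

```python
from collections import Counter

def transition_graph(fixations):
    """Build a weighted fixation-transition graph from a scan path.

    `fixations` is a hypothetical input format: a temporally ordered
    list of (aoi, duration_ms) tuples, where `aoi` names an area of
    interest such as an instrument or the outside view.

    Returns (durations, edges):
      durations -- Counter mapping each AOI to its total fixation time
      edges     -- Counter mapping (from_aoi, to_aoi) to the number of
                   direct gaze transitions between the two AOIs
    """
    durations = Counter()
    edges = Counter()
    prev = None
    for aoi, dur in fixations:
        durations[aoi] += dur
        if prev is not None and prev != aoi:
            edges[(prev, aoi)] += 1
        prev = aoi
    return durations, edges

# Made-up scan path during a simulated turn (illustrative only):
scan = [("outside_view", 800), ("compass", 300), ("outside_view", 600),
        ("vertical_speed", 250), ("compass", 200), ("outside_view", 900)]
durations, edges = transition_graph(scan)
```

Such Counter-based node and edge weights could then be handed to a graph library for visualization or compared across maneuvers to describe which information sources pilots relied on.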