

Released

Conference Paper

The role of visual cues and whole-body rotations in helicopter hovering control

MPS-Authors
http://pubman.mpdl.mpg.de/cone/persons/resource/persons83802

Berger, DR
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons84253

Terzibas, C
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83808

Beykirch, K
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

http://pubman.mpdl.mpg.de/cone/persons/resource/persons83839

Bülthoff, HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Berger, D., Terzibas, C., Beykirch, K., & Bülthoff, H. (2007). The role of visual cues and whole-body rotations in helicopter hovering control. In AIAA Modeling and Simulation Technologies Conference and Exhibit 2007 (pp. 1-13). Reston, VA, USA: American Institute of Aeronautics and Astronautics.


Cite as: http://hdl.handle.net/11858/00-001M-0000-0013-CC49-1
Abstract
Helicopters in flight are unstable, much like an inverted pendulum, and hovering at one spot requires a considerable amount of active control by the pilot. To date, it is still under discussion which sensory cues helicopter pilots use for this stabilization task and how these cues are combined. There are several sensory cues a pilot might use to stabilize a helicopter hovering at a target spot. The horizon provides visual information about the orientation of the helicopter in pitch and roll. Optic flow, the movement of visual features across the observer's view during self-motion, informs the observer about translations and rotations. Apart from vision, pilots can also use force cues of self-motion: rotations and accelerations of the head are detected by the vestibular system in the inner ear, and body accelerations are measured by pressure sensors in the skin and by proprioceptive sensors. Here we investigated how cues from different sensory modalities (visual cues and body cues) are used when humans stabilize a simulated helicopter at a target location in a closed perception-action loop.
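
To illustrate the closed perception-action loop described in the abstract, the following minimal Python sketch (not the authors' simulation) stabilizes a one-dimensional plant whose displacement grows exponentially when left uncontrolled, much like an inverted pendulum. A proportional-derivative controller stands in for the pilot and only sees a noisy estimate of position; the instability rate, feedback gains, and noise level are illustrative assumptions.

import numpy as np

dt = 0.01            # simulation time step (s)
instability = 1.5    # growth rate of the unstable mode (1/s), assumed
kp, kd = 4.0, 2.0    # assumed "pilot" feedback gains
sensor_noise = 0.01  # assumed noise on the perceived position (m)

rng = np.random.default_rng(0)
pos, vel = 0.5, 0.0  # start displaced 0.5 m from the target spot

for step in range(int(10.0 / dt)):        # simulate 10 seconds
    # Perception: the controller sees only a noisy estimate of position.
    perceived_pos = pos + rng.normal(0.0, sensor_noise)

    # Action: PD feedback tries to drive the perceived error to zero.
    control = -kp * perceived_pos - kd * vel

    # Unstable plant: without control, the displacement grows exponentially.
    acc = instability * pos + control
    vel += acc * dt
    pos += vel * dt

print(f"final displacement after 10 s: {pos:.3f} m")

With these assumed gains the displacement decays toward the target spot; removing the control term lets the exponential instability take over, which is the situation the pilot must continuously counteract.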