
Journal Article (Released)

Virtual-reality techniques resolve the visual cues used by fruit flies to evaluate object distances.

MPS-Authors
/persons/resource/persons274203

Schuster, S
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Former Department Neurophysiology of Insect Behavior, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons84975

Strauss, R
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Former Department Neurophysiology of Insect Behavior, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons84662

Götz, KG
Max Planck Institute for Biological Cybernetics, Max Planck Society;
Former Department Neurophysiology of Insect Behavior, Max Planck Institute for Biological Cybernetics, Max Planck Society;

Citation

Schuster, S., Strauss, R., & Götz, K. G. (2002). Virtual-reality techniques resolve the visual cues used by fruit flies to evaluate object distances. Current Biology, 12, 1591–1594. doi:10.1016/S0960-9822(02)01141-7


Cite as: https://hdl.handle.net/11858/00-001M-0000-0013-E0DB-C
Abstract
Insects can estimate distance or time-to-contact of surrounding objects from locomotion-induced changes in their retinal position and/or size. Freely walking fruit flies (Drosophila melanogaster) use the received mixture of different distance cues to select the nearest objects for subsequent visits. Conventional methods of behavioral analysis fail to elucidate the underlying data extraction. Here we demonstrate a first comprehensive solution to this problem by substituting virtual for real objects; a tracker-controlled 360° panorama converts a fruit fly's changing coordinates into object illusions that require the perception of specific cues in order to appear at preselected distances up to infinity. An application reveals the following: (1) En-route sampling of retinal-image changes accounts for distance discrimination within a surprising range of at least 8–80 body lengths (20–200 mm); stereopsis and peering are not involved. (2) Distance estimated from image translation in the expected direction (motion parallax) outweighs distance estimated from image expansion, which accounts for impact-avoiding flight reactions to looming objects. (3) The ability to discriminate distances is robust to artificially delayed updating of image translation; fruit flies appear to interrelate self-motion and its visual feedback within a surprisingly long time window of about 2 s. The comparative distance inspection practiced by the small fruit fly deserves utilization in self-moving robots.
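
For readers interested in the closed-loop principle described above, the core computation is small: from the fly's tracked position and heading, the panorama must redraw each virtual object at the bearing it would occupy if it stood at a preselected distance, so that near illusions show strong motion parallax while an illusion placed at infinity keeps a fixed bearing during translation. The following Python sketch is a minimal illustration under assumed conventions (arena centre as origin, angles in degrees, millimetre coordinates; function and parameter names are invented here); it is not the published tracker or rendering software.

import math

def virtual_object_azimuth(fly_x, fly_y, fly_heading_deg,
                           obj_bearing_deg, obj_distance_mm):
    """Bearing (degrees, relative to the fly's heading) at which a virtual
    object must be drawn so that it appears to stand at obj_distance_mm
    in direction obj_bearing_deg from the arena centre.

    obj_distance_mm = float('inf') yields a parallax-free object: its
    retinal position is unaffected by the fly's translation, and only the
    fly's own rotation shifts it.
    """
    if math.isinf(obj_distance_mm):
        return (obj_bearing_deg - fly_heading_deg) % 360.0
    # World coordinates of the simulated object (arena centre = origin).
    obj_x = obj_distance_mm * math.cos(math.radians(obj_bearing_deg))
    obj_y = obj_distance_mm * math.sin(math.radians(obj_bearing_deg))
    # Direction of that point as seen from the fly's current position,
    # expressed in the fly's own frame of reference.
    world_deg = math.degrees(math.atan2(obj_y - fly_y, obj_x - fly_x))
    return (world_deg - fly_heading_deg) % 360.0

# Example: a sideways step of 5 mm shifts the bearing of a 20 mm object far
# more than that of a 200 mm object, and an object at infinity not at all.
# This is the translational cue (motion parallax) that the study isolates.
for d in (20.0, 200.0, float('inf')):
    before = virtual_object_azimuth(0.0, 0.0, 0.0, 0.0, d)
    after = virtual_object_azimuth(0.0, 5.0, 0.0, 0.0, d)
    shift = ((after - before + 180.0) % 360.0) - 180.0  # signed bearing change
    print(f"{d} mm: bearing shift {shift:.1f} deg")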