Record


Released

Book Chapter

A Robot System for Biomimetic Navigation: From Snapshots to Metric Embeddings of View Graphs

MPG Authors

Franz, MO
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

Full texts (freely accessible)

pdf2216.pdf
(any full text), 466 KB

Supplementary material (freely accessible)
No freely accessible supplementary materials are available
Citation

Franz, M., Stürzl, W., Hübner, W., & Mallot, H. (2008). A Robot System for Biomimetic Navigation: From Snapshots to Metric Embeddings of View Graphs. In M. Jefferies, & W.-K. Yeap (Eds.), Robotics and Cognitive Approaches to Spatial Mapping (pp. 297-314). Berlin, Germany: Springer.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-CAC5-8
Abstract
Complex navigation behaviour (way-finding) involves recognizing several places and encoding a spatial relationship between them. Way-finding skills can be classified into a hierarchy according to the complexity of the tasks that can be performed [8]. The most basic form of way-finding is route navigation, followed by topological navigation where several routes are integrated into a graph-like representation. The highest level, survey navigation, is reached when this graph can be embedded into a common reference frame.
In this chapter, we present the building blocks for a biomimetic robot navigation system that encompasses all levels of this hierarchy. As a local navigation method, we use scene-based homing. In this scheme, a goal location is characterized either by a panoramic snapshot of the light intensities as seen from the place, or by a record of the distances to the surrounding objects. The goal is found by moving in the direction that minimizes the discrepancy between the recorded intensities or distances and the current sensory input. For learning routes, the robot selects distinct views during exploration that are close enough to be reached by snapshot-based homing. When it encounters already visited places during route learning, it connects the routes and thus forms a topological representation of its environment termed a view graph. The final stage, survey navigation, is achieved by a graph embedding procedure which complements the topological information of the view graph with odometric position estimates. Calculation of the graph embedding is done with a modified multidimensional scaling algorithm which makes use of distances and angles between nodes.
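
The scene-based homing scheme summarized above can be illustrated with a short Python sketch. This is not the authors' implementation; it only mimics the idea of characterizing a goal by a panoramic record of distances to surrounding objects and then moving in whichever direction reduces the mismatch between that stored record and the current view. The simulated range sensor, the landmark layout, and all parameters below are assumptions chosen purely for illustration.

"""Minimal sketch of scene-based homing by discrepancy minimization (illustrative only)."""
import numpy as np

# Hypothetical landmark positions forming a toy environment (assumption).
LANDMARKS = np.array([[4.0, 1.0], [-3.0, 2.5], [0.5, -4.0], [2.0, 3.5]])
BEARINGS = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)


def range_snapshot(position):
    """Toy panoramic 'distance snapshot': for each bearing, the projected
    distance along that ray to the nearest landmark lying ahead."""
    snapshot = np.empty_like(BEARINGS)
    for i, theta in enumerate(BEARINGS):
        ray = np.array([np.cos(theta), np.sin(theta)])
        rel = LANDMARKS - position
        proj = rel @ ray            # distance along the ray to each landmark
        ahead = proj > 0
        snapshot[i] = proj[ahead].min() if ahead.any() else 10.0
    return snapshot


def discrepancy(view_a, view_b):
    """Sum-of-squares mismatch between two panoramic records."""
    return float(np.sum((view_a - view_b) ** 2))


def home(start, goal_snapshot, step=0.1, n_steps=200, tol=1e-3):
    """Greedy homing: evaluate candidate directions, take the best one."""
    position = np.array(start, dtype=float)
    for _ in range(n_steps):
        if discrepancy(range_snapshot(position), goal_snapshot) < tol:
            break
        best_dir, best_err = None, np.inf
        for theta in BEARINGS:
            candidate = position + step * np.array([np.cos(theta), np.sin(theta)])
            err = discrepancy(range_snapshot(candidate), goal_snapshot)
            if err < best_err:
                best_dir, best_err = theta, err
        position += step * np.array([np.cos(best_dir), np.sin(best_dir)])
    return position


if __name__ == "__main__":
    goal = np.array([0.0, 0.0])
    stored = range_snapshot(goal)          # snapshot recorded at the goal
    final = home(start=[1.5, -1.0], goal_snapshot=stored)
    print("final position:", np.round(final, 3))

On a real robot the candidate views cannot be sampled without moving, so the chapter's homing methods estimate the home direction from the stored and current views directly; the greedy search above only stands in for that step.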
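
The final step, embedding the view graph into a common metric frame, is described in the chapter as a modified multidimensional scaling algorithm that exploits both odometric distance and angle estimates between nodes. The sketch below is much simpler: it only minimizes a classical distance stress over the graph edges by gradient descent, to illustrate how topological links plus odometric length estimates can yield 2-D position estimates. The toy graph and its edge lengths are invented for illustration.

"""Minimal sketch of a distance-only metric embedding of a view graph (illustrative only)."""
import numpy as np

# Toy view graph: edges (i, j, odometric distance estimate) -- assumption.
EDGES = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (3, 0, 1.0), (0, 2, 1.4)]
N_NODES = 4


def embed(edges, n_nodes, dim=2, lr=0.05, n_iter=2000, seed=0):
    """Gradient descent on the stress  sum_(i,j) (||x_i - x_j|| - d_ij)^2."""
    rng = np.random.default_rng(seed)
    x = rng.normal(scale=0.5, size=(n_nodes, dim))   # random initial layout
    for _ in range(n_iter):
        grad = np.zeros_like(x)
        for i, j, d_ij in edges:
            diff = x[i] - x[j]
            dist = np.linalg.norm(diff) + 1e-9
            g = 2.0 * (dist - d_ij) * diff / dist    # d(stress)/d(x_i)
            grad[i] += g
            grad[j] -= g
        x -= lr * grad
    return x - x.mean(axis=0)                        # center the embedding


if __name__ == "__main__":
    positions = embed(EDGES, N_NODES)
    for i, j, d_ij in EDGES:
        est = np.linalg.norm(positions[i] - positions[j])
        print(f"edge ({i},{j}): target {d_ij:.2f}, embedded {est:.2f}")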