Record

Released

Conference Paper

On the Representation, Learning and Transfer of Spatio-Temporal Movement Characteristics

MPG Authors
/persons/resource/persons85005

Ilg,  W
Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons83791

Bakir,  GH
Department Empirical Inference, Max Planck Institute for Biological Cybernetics, Max Planck Society;

/persons/resource/persons84787

Giese,  MA
Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
There are no external resources on file
Full texts (restricted access)
There are currently no full texts released for your IP range.
Full texts (publicly accessible)
There are no publicly accessible full texts available in PuRe
Supplementary Material (publicly accessible)
There are no publicly accessible supplementary materials available
Citation

Ilg, W., Bakir, G., Mezger, J., & Giese, M. (2003). On the Representation, Learning and Transfer of Spatio-Temporal Movement Characteristics. In Humanoids Proceedings (pp. 0-0).


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-DC27-C
Abstract
In this paper we present a learning-based approach for the modelling of complex movement sequences. Based on the method of Spatio-Temporal Morphable Models (STMMs), we derive a hierarchical algorithm that, in a first step, automatically identifies movement elements in movement sequences based on a coarse spatio-temporal description, and in a second step models these movement primitives by approximating them through linear combinations of learned example movement trajectories. We describe the different steps of the algorithm and show how it can be applied for the modelling and synthesis of complex sequences of human movements that contain movement elements with variable style. The proposed method is demonstrated on different applications of movement representation relevant for imitation learning of movement styles in humanoid robotics.
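
The sketch below illustrates only the core idea named in the abstract, approximating a movement by a linear combination of learned example trajectories. It is not the authors' implementation: all function and variable names are illustrative assumptions, the examples are assumed to be already time-aligned, and the temporal morphing (time warping) that is central to full STMMs is omitted.

import numpy as np

def fit_morph_weights(examples, target):
    # Estimate linear-combination weights so that the weighted sum of
    # example trajectories approximates the target trajectory (least squares).
    # examples: (n_examples, n_frames, n_dims), assumed time-aligned
    # target:   (n_frames, n_dims)
    n_examples = examples.shape[0]
    A = examples.reshape(n_examples, -1).T      # one flattened example per column
    b = target.reshape(-1)
    weights, *_ = np.linalg.lstsq(A, b, rcond=None)
    return weights

def synthesize(examples, weights):
    # Generate a new trajectory as a linear combination of the examples.
    return np.tensordot(weights, examples, axes=1)   # (n_frames, n_dims)

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 100)
    # Two toy 1-D example movements differing in "style" (amplitude).
    examples = np.stack([
        np.sin(2 * np.pi * t)[:, None],
        1.5 * np.sin(2 * np.pi * t)[:, None],
    ])
    target = 1.2 * np.sin(2 * np.pi * t)[:, None]
    w = fit_morph_weights(examples, target)
    approx = synthesize(examples, w)
    print("weights:", np.round(w, 3))
    print("max reconstruction error:", np.abs(approx - target).max())

In this toy setting the estimated weights interpolate between the two style examples; in the paper's setting the same kind of weights characterize the style of an identified movement primitive and can be reused to synthesize new movements.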