Preprint Article, Version 1. Preserved in Portico. This version is not peer-reviewed.

A Motion Capture and Imitation Learning-based Approach to Robot Control

Version 1: Received: 29 June 2022 / Approved: 30 June 2022 / Online: 30 June 2022 (11:15:14 CEST)

A peer-reviewed article of this Preprint also exists.

Racinskis, P.; Arents, J.; Greitans, M. A Motion Capture and Imitation Learning Based Approach to Robot Control. Applied Sciences, 2022, 12, 7186. https://doi.org/10.3390/app12147186.

Abstract

Imitation Learning is a discipline of Machine Learning primarily concerned with replicating the observed behavior of agents known to perform well on a given task, as recorded in demonstration data sets. In this paper, we introduce a pipeline for collecting demonstrations and training models that produce motion plans for industrial robots. Object throwing is defined as the motivating use case. Multiple input data modalities are surveyed, and motion capture is selected as the most practicable. Two model architectures operating autoregressively are examined: feedforward and recurrent neural networks. Trained models successfully execute throws on a real robot, and a battery of quantitative evaluation metrics is proposed, including extrapolated throw accuracy estimates. Recurrent neural networks outperform feedforward ones in most respects, with the best models achieving an assessed mean throw error on the order of 0.1–0.2 m at distances of 1.5–2.0 m. The data collection, pre-processing, and model training aspects of the proposed approach show promise, but further work on Cartesian motion planning tools is required before it is suitable for application in production.
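
To illustrate what "operating autoregressively" means in the abstract, the sketch below unrolls a policy network into a trajectory of waypoints by feeding each predicted robot state back in as the next input. The state and target dimensions, network sizes, and the conditioning on a Cartesian target are illustrative assumptions, not the architecture reported in the paper; a recurrent variant would additionally carry a hidden state between steps.

# Minimal sketch of an autoregressive rollout loop. All dimensions and the
# conditioning scheme are assumptions for illustration, not the paper's
# exact model.
import torch
import torch.nn as nn

STATE_DIM = 8   # assumed robot/gripper state size, purely illustrative
TARGET_DIM = 3  # assumed Cartesian target (e.g. throw landing point)

class FeedforwardPolicy(nn.Module):
    def __init__(self, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM + TARGET_DIM, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, STATE_DIM),
        )

    def forward(self, state, target):
        # Predict the next state from the current state and the fixed target.
        return self.net(torch.cat([state, target], dim=-1))

def rollout(model, start_state, target, steps=200):
    """Autoregressively unroll a trajectory: each predicted state becomes
    the next input, yielding a sequence of waypoints for a motion planner."""
    states = [start_state]
    with torch.no_grad():
        for _ in range(steps):
            states.append(model(states[-1], target))
    return torch.stack(states)

# Usage with placeholder inputs (shapes only; real inputs would come from
# the motion-capture demonstrations described in the paper).
policy = FeedforwardPolicy()
trajectory = rollout(policy, torch.zeros(STATE_DIM), torch.tensor([1.8, 0.0, 0.0]))
print(trajectory.shape)  # (steps + 1, STATE_DIM)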

Supplementary and Associated Material

https://github.com/peteris-racinskis/data_pub_motion_capture_imitiation_learning: Demonstration data sets; trajectories generated for evaluation; tables of computed metrics. All files are in CSV format. Additional data generated while experimenting with model architectures not covered in the paper is also included for future use. A rough loading example is sketched below.
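
As a rough illustration of consuming the published CSV files, the snippet below loads one trajectory with pandas. The file name and column layout are assumptions for illustration only; consult the repository's own files for the actual schema.

# Hedged sketch: load one trajectory .csv and split it into timestamps and
# recorded state columns. Path and column interpretation are assumed.
import pandas as pd

df = pd.read_csv("demo_trajectory.csv")   # hypothetical file name
print(df.columns.tolist())                # inspect the real columns first
t = df.iloc[:, 0].to_numpy()              # assumed: first column is time
states = df.iloc[:, 1:].to_numpy()        # assumed: remaining columns are state
print(t.shape, states.shape)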

Keywords

imitation learning; motion capture; robotics; artificial neural networks; RNN

Subject

Computer Science and Mathematics, Robotics
