Open access

Engaging with Robotic Swarms: Commands from Expressive Motion

Published: 08 June 2019

Abstract

In recent years, researchers have explored human body posture and motion as more natural ways to control robots. Such interfaces require tracking the user's body movements in three dimensions. Motion capture systems can provide this tracking, but they tend to be costly and intrusive and require a clear line of sight, making them ill suited for applications that demand fast deployment. In this article, we instead use consumer-grade armbands that capture orientation and muscle activity to interact with a robotic system through a state machine driven by a body-motion classifier. To compensate for the low quality of these sensors' signals, and to allow a wider range of dynamic control, our approach relies on machine learning. We train the classifier directly on each user, within minutes, to recognize which physiological state his or her body motion expresses. We show that, in addition to enabling faster field deployment, our algorithm outperforms comparable algorithms, and we detail its configuration and the most significant extracted features. As large groups of robots come into wider use, we postulate that our approach can ease their interaction with humans. Using our system with 27 participants, each creating his or her own set of expressive motions to control a swarm of desk robots, we identified the key factors that stimulate engagement. The resulting unique dataset is available online, together with the classifier and the robot control scripts.
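
To make the pipeline concrete, below is a minimal sketch in Python with scikit-learn of the idea the abstract describes: hand-crafted features computed over windows of armband EMG and orientation data, a classifier trained per user on a few labeled examples, and a state machine that maps predicted motion states to swarm commands. The feature set, class names, and command mapping are illustrative assumptions, not the paper's exact configuration.

    # A minimal sketch (assumptions, not the authors' exact pipeline):
    # per-user training of a body-motion classifier on armband features,
    # whose predictions drive a state machine issuing swarm commands.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def window_features(emg, imu):
        """Features over one time window.

        emg: (samples, 8) surface-EMG channels; imu: (samples, 4) orientation
        quaternion. Mean absolute value and waveform length are classic EMG
        features; the paper's full feature set is richer.
        """
        mav = np.mean(np.abs(emg), axis=0)                 # mean absolute value
        wl = np.sum(np.abs(np.diff(emg, axis=0)), axis=0)  # waveform length
        return np.concatenate([mav, wl, imu.mean(axis=0), imu.std(axis=0)])

    # Per-user calibration: a few labeled windows per expressive-motion class
    # (synthetic data here; a real session records the user's own motions).
    rng = np.random.default_rng(0)
    classes = ["calm", "agitated", "expansive"]            # illustrative states
    X = np.stack([window_features(rng.normal(size=(200, 8)),
                                  rng.normal(size=(200, 4)))
                  for _ in range(90)])
    y = np.repeat(np.arange(len(classes)), 30)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

    # State machine: the classifier's output selects the swarm's behavior.
    SWARM_BEHAVIOR = {"calm": "gather", "agitated": "disperse",
                      "expansive": "circle"}               # hypothetical commands

    def on_new_window(emg, imu, state):
        label = classes[clf.predict([window_features(emg, imu)])[0]]
        if SWARM_BEHAVIOR[label] != state:
            print(f"motion '{label}' -> swarm command '{SWARM_BEHAVIOR[label]}'")
        return SWARM_BEHAVIOR[label]

A random forest stands in for the classifier here; any model that maps a feature window to a motion state would slot in behind the same state machine.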

Supplementary Material

st-onge (st-onge.zip)
Supplemental movie, appendix, image, and software files for Engaging with Robotic Swarms: Commands from Expressive Motion


Published In

ACM Transactions on Human-Robot Interaction, Volume 8, Issue 2
June 2019, 136 pages
EISSN: 2573-9522
DOI: 10.1145/3339062
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 08 June 2019
Accepted: 01 March 2019
Revised: 01 January 2019
Received: 01 April 2018
Published in THRI Volume 8, Issue 2

Author Tags

  1. Machine learning
  2. body motion
  3. gesture recognition
  4. human-swarm interaction
  5. natural user interface

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • NSERC
  • Research Council of Norway


