DOI: 10.1145/3125739.3132590

Impact of Spontaneous Human Inputs during Gesture based Interaction on a Real-World Manufacturing Scenario

Published: 27 October 2017

Abstract

Seamless human-robot collaboration depends on high non-verbal behaviour recognition rates. Realizing this in real-world manufacturing scenarios with an ecologically valid setup requires considerable effort. In this paper, we evaluate the impact of spontaneous inputs on the robustness of human-robot collaboration during gesture-based interaction. A high share of these spontaneous inputs leads to a reduced capability to predict behaviour and, subsequently, to a loss of robustness. We observe body and hand behaviour during the interactive manufacturing of a collaborative task in two experiments. First, we analyse the frequency, reason and manner of human inputs in specific situations during a human-human experiment. We show the high impact of spontaneous inputs, especially in situations that deviate from the typical working procedure. Second, we concentrate on implicit inputs during a real-world Wizard of Oz experiment using our human-robot working cell. We show that hand positions can be used to anticipate user needs in a semi-structured environment by applying knowledge about the semi-structured human behaviour, which is distributed over working space and time in a typical manner.
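The abstract's core idea, anticipating user needs from hand positions in a semi-structured workspace, can be illustrated with a minimal, entirely hypothetical sketch. The zone layout, the dwell-time heuristic, and the `Zone` and `IntentEstimator` names below are assumptions for illustration only, not the authors' method:

```python
# Hypothetical sketch (not from the paper): a workbench divided into named
# zones; a hand that dwells in a zone for several frames is taken to signal
# intent to use the part stored there.
from dataclasses import dataclass


@dataclass
class Zone:
    name: str
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


class IntentEstimator:
    """Predicts which zone a worker is reaching for from tracked hand positions."""

    def __init__(self, zones, dwell_frames=5):
        self.zones = zones
        self.dwell_frames = dwell_frames  # frames a hand must stay in a zone
        self._current = None
        self._count = 0

    def update(self, x: float, y: float):
        """Feed one tracked hand position; returns a zone name once intent is clear."""
        zone = next((z.name for z in self.zones if z.contains(x, y)), None)
        if zone == self._current:
            self._count += 1
        else:
            # Hand moved to a different zone (or off the bench): restart dwell count.
            self._current, self._count = zone, 1
        if zone is not None and self._count >= self.dwell_frames:
            return zone
        return None
```

Feeding per-frame hand coordinates, the estimator stays silent until the hand has dwelt long enough in one zone, a crude stand-in for the spatio-temporal regularities the paper exploits.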


Cited By

  • (2023) A Communicative Perspective on Human–Robot Collaboration in Industry: Mapping Communicative Modes on Collaborative Scenarios. International Journal of Social Robotics 16, 6, 1315–1332. DOI: 10.1007/s12369-023-00991-5. Online publication date: 30-Mar-2023.
  • (2020) Weakly-Supervised Learning for Multimodal Human Activity Recognition in Human-Robot Collaboration Scenarios. 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 8381–8386. DOI: 10.1109/IROS45743.2020.9340788. Online publication date: 24-Oct-2020.
  • (2019) Human Work Activity Recognition for Working Cells in Industrial Production Contexts. 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), 4225–4230. DOI: 10.1109/SMC.2019.8913873. Online publication date: Oct-2019.
  • (2018) Effects on User Experience During Human-Robot Collaboration in Industrial Scenarios. 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 837–842. DOI: 10.1109/SMC.2018.00150. Online publication date: Oct-2018.

Published In

HAI '17: Proceedings of the 5th International Conference on Human Agent Interaction
October 2017
550 pages
ISBN: 9781450351133
DOI: 10.1145/3125739
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. gesture recognition
  2. human-robot collaboration
  3. hybrid manufacturing
  4. non-verbal interaction

Qualifiers

  • Research-article

Conference

HAI '17

Acceptance Rates

Overall Acceptance Rate 121 of 404 submissions, 30%
