
Hand and Arm Gesture-based Human-Robot Interaction: A Review

Published: 30 January 2023

Abstract

The study of Human-Robot Interaction (HRI) aims to create close and friendly communication between humans and robots. In human-centered HRI, an essential aspect of implementing successful and effective HRI is building natural and intuitive interaction, both verbal and nonverbal. As a prevalent nonverbal communication approach, hand and arm gestures occur ubiquitously in our daily lives. A considerable amount of work on gesture-based HRI is scattered across various research domains, yet a systematic understanding of this work is still lacking. This paper provides a comprehensive review of gesture-based HRI, focusing on advanced findings in the area. Following the stimulus-organism-response framework, the review covers: (i) generation of human gestures (stimulus); (ii) robot recognition of human gestures (organism); (iii) robot reaction to human gestures (response). The review summarizes the research status of each element in the framework and analyzes the advantages and disadvantages of related works. Finally, the paper discusses current research challenges in gesture-based HRI and suggests possible future directions.
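The stimulus-organism-response structure described above can be read as a three-stage pipeline: a human gesture is produced (stimulus), the robot recognizes it (organism), and the robot selects a reaction (response). A minimal sketch of that loop, under the assumption of a pre-labeled sensor frame and a simple gesture-to-action policy, might look as follows; all class and function names (`Gesture`, `recognize`, `react`) are illustrative and not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Gesture:
    name: str            # e.g. "wave", "point"
    confidence: float    # recognizer's confidence in [0, 1]

def recognize(sensor_frame: dict) -> Gesture:
    # Stand-in for a real recognizer (vision-, IMU-, or sEMG-based);
    # here we simply read a pre-labeled frame.
    return Gesture(sensor_frame["label"], sensor_frame["score"])

def react(gesture: Gesture, policy: dict) -> str:
    # Map a recognized gesture to a robot action; fall back to a
    # clarification request when confidence is low or the gesture
    # is outside the policy's vocabulary.
    if gesture.confidence < 0.5:
        return "request_clarification"
    return policy.get(gesture.name, "request_clarification")

# One pass through the S-O-R loop on a synthetic frame.
frame = {"label": "wave", "score": 0.9}
action = react(recognize(frame), {"wave": "greet", "point": "fetch_object"})
print(action)  # -> greet
```

The low-confidence fallback reflects a recurring design concern in the surveyed literature: a robot that acts on an uncertain gesture interpretation is worse than one that asks for clarification.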


Cited By

  • (2024) Socially adaptive cognitive architecture for human-robot collaboration in industrial settings. Frontiers in Robotics and AI 11. DOI: 10.3389/frobt.2024.1248646. Online publication date: 10-Jun-2024.
  • (2024) LaserDex: Improvising Spatial Tasks Using Deictic Gestures and Laser Pointing for Human–Robot Collaboration in Construction. Journal of Computing in Civil Engineering 38, 3. DOI: 10.1061/JCCEE5.CPENG-5715. Online publication date: May-2024.
  • (2024) Augmented Reality Interface for UR5e Robot that Transfers Parts to a Human in Collaborative Assembly. Interactive Collaborative Robotics, 1–14. DOI: 10.1007/978-3-031-71360-6_1. Online publication date: 4-Sep-2024.
  • (2024) Intuitive Multi-modal Human-Robot Interaction via Posture and Voice. Robotics, Computer Vision and Intelligent Systems, 441–456. DOI: 10.1007/978-3-031-59057-3_28. Online publication date: 8-May-2024.

    Published In

    ICACS '22: Proceedings of the 6th International Conference on Algorithms, Computing and Systems
    September 2022
    132 pages
    ISBN:9781450397407
    DOI:10.1145/3564982
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Gesture-based Communication
    2. Human-robot Interaction

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • Fujian Science & Technology Innovation Laboratory for Optoelectronic Information of China, Fuzhou, Fujian 350108, P. R. China

    Conference

    ICACS 2022

