Designing Social Cues for Collaborative Robots: The Role of Gaze and Breathing in Human-Robot Collaboration

Published: 09 March 2020

Abstract

In this paper, we investigate how collaborative robots, or cobots, typically composed of a robotic arm and a gripper carrying out manipulation tasks alongside human coworkers, can be enhanced with HRI capabilities by applying ideas and principles from character animation. To this end, we modified the appearance and behaviors of a cobot, with minimal impact on its functionality and performance, and studied the extent to which these modifications improved its communication with and perception by human collaborators. Specifically, we aimed to improve the Appeal of the robot by manipulating its physical appearance, posture, and gaze, creating an animal-like character with a head-on-neck morphology; to utilize Arcs by generating smooth trajectories for the robot arm; and to increase the lifelikeness of the robot through Secondary Action by adding breathing motions to the robot. In two user studies, we investigated the effects of these cues on collaborator perceptions of the robot. Findings from our first study showed breathing to have a positive effect on most measures of robot perception and revealed nuanced interactions among the other factors. Data from our second study showed that, using gaze cues alone, a robot arm can improve metrics such as likeability and perceived sociability.

Supplementary Material

MP4 File (p343-terzioglu.mp4)

Published In

HRI '20: Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction
March 2020, 690 pages
ISBN: 9781450367462
DOI: 10.1145/3319502

Publication rights licensed to ACM. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

1. animation principles
2. character design
3. collaborative robots
4. human-robot collaboration
5. robot motion
6. social cues

Qualifiers

• Research-article

Conference

HRI '20

Acceptance Rates

Overall Acceptance Rate: 268 of 1,124 submissions, 24%

Article Metrics

• Downloads (last 12 months): 272
• Downloads (last 6 weeks): 27

Reflects downloads up to 15 Oct 2024

Cited By

• (2024) A Generative Model to Embed Human Expressivity into Robot Motions. Sensors 24(2), 569. DOI: 10.3390/s24020569. Online publication date: 16-Jan-2024.
• (2024) The effect of conversation on altruism: A comparative study with different media and generations. PLOS ONE 19(6), e0301769. DOI: 10.1371/journal.pone.0301769. Online publication date: 14-Jun-2024.
• (2024) The IDEA of Us: An Identity-Aware Architecture for Autonomous Systems. ACM Transactions on Software Engineering and Methodology 33(6), 1-38. DOI: 10.1145/3654439. Online publication date: 28-Jun-2024.
• (2024) Respiration-enhanced Human-Robot Communication. Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 813-816. DOI: 10.1145/3610978.3640707. Online publication date: 11-Mar-2024.
• (2024) PnP-GA+: Plug-and-Play Domain Adaptation for Gaze Estimation using Model Variants. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1-14. DOI: 10.1109/TPAMI.2023.3348528. Online publication date: 2024.
• (2024) Trust dynamics in human interaction with an industrial robot. Behaviour & Information Technology, 1-23. DOI: 10.1080/0144929X.2024.2316284. Online publication date: 16-Feb-2024.
• (2024) Attention Sharing Handling Through Projection Capability Within Human–Robot Collaboration. International Journal of Social Robotics. DOI: 10.1007/s12369-024-01101-9. Online publication date: 16-Feb-2024.
• (2024) From Gaze Jitter to Domain Adaptation: Generalizing Gaze Estimation by Manipulating High-Frequency Components. International Journal of Computer Vision. DOI: 10.1007/s11263-024-02233-1. Online publication date: 30-Sep-2024.
• (2024) Residual feature learning with hierarchical calibration for gaze estimation. Machine Vision and Applications 35(4). DOI: 10.1007/s00138-024-01545-z. Online publication date: 5-May-2024.
• (2023) Prosocial behavior among human workers in robot-augmented production teams: A field-in-the-lab experiment. Frontiers in Behavioral Economics 2, 1220563. DOI: 10.3389/frbhe.2023.1220563. Online publication date: 6-Nov-2023.
