DOI: 10.1145/3284432.3284458
Research article, open access

Designing Expressive Lights and In-Situ Motions for Robots to Express Emotions

Published: 04 December 2018

Abstract

In this paper, we explore how a utility robot might express emotions via expressive lights and in-situ motions. Most previous work investigated each modality in isolation, leaving considerable room to improve emotion expression by combining the two. We present a series of three studies: the first investigates how well people recognize emotions from expressive light cues alone; the second explores the affect people perceive in a robot's in-situ motion characteristics; the third combines the two modalities and tests whether multi-modal expressions are recognized more reliably. Results from the first study show that participants could not recognize the target emotions with high accuracy. Results from the second suggest a relationship between a robot's in-situ motion characteristics and perceived affect. Results from the third suggest that expressions combining in-situ motions with expressive lights conveyed many, though not all, emotions more effectively. We conclude that adding in-situ motions to affective expressive lights helps convey emotions. These findings are important for designing affective behaviors for future utility robots that must possess certain social abilities.
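The multi-modal pairing the abstract describes can be illustrated with a small, purely hypothetical sketch (not the authors' implementation): it maps an emotion's position in Russell's circumplex model of affect (valence and arousal, each in [-1, 1]) to light cues (hue, blink rate) and in-situ motion parameters (speed, smoothness). The function name and every numeric mapping below are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: derive combined light and in-situ motion parameters
# from an emotion's valence/arousal coordinates in Russell's circumplex.
# All constants and ranges are illustrative assumptions.

def emotion_to_cues(valence: float, arousal: float) -> dict:
    """Map valence/arousal (each in [-1, 1]) to light and motion parameters."""
    # Light: warm hue for positive valence, cool hue for negative;
    # blink frequency scales with arousal.
    hue_deg = 60 + 180 * (1 - valence) / 2        # 60 (yellow) .. 240 (blue)
    blink_hz = 0.5 + 2.5 * (arousal + 1) / 2      # 0.5 .. 3.0 Hz
    # In-situ motion: faster movement for high arousal;
    # smooth vs. jerky acceleration follows valence.
    speed = 0.2 + 0.8 * (arousal + 1) / 2         # normalized 0.2 .. 1.0
    smoothness = (valence + 1) / 2                # 0 (jerky) .. 1 (smooth)
    return {"hue_deg": round(hue_deg, 1),
            "blink_hz": round(blink_hz, 2),
            "speed": round(speed, 2),
            "smoothness": round(smoothness, 2)}

# Example: a high-valence/high-arousal emotion (e.g., "excited")
# versus a low-valence/low-arousal one (e.g., "sad").
print(emotion_to_cues(0.8, 0.9))
print(emotion_to_cues(-0.7, -0.6))
```

The point of such a shared valence/arousal parameterization is that both modalities are driven from the same underlying affect coordinates, which is one plausible way to keep light and motion cues mutually consistent.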



    Published In

    HAI '18: Proceedings of the 6th International Conference on Human-Agent Interaction
    December 2018, 402 pages
    ISBN: 9781450359535
    DOI: 10.1145/3284432

    Publisher

    Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. affective computing
    2. appearance-constrained robot
    3. emotion
    4. expressive lights
    5. human-robot interaction (hri)
    6. in-situ motion

    Conference

    HAI '18: 6th International Conference on Human-Agent Interaction
    December 15-18, 2018
    Southampton, United Kingdom

    Acceptance Rates

    HAI '18 Paper Acceptance Rate: 40 of 92 submissions (43%)
    Overall Acceptance Rate: 121 of 404 submissions (30%)


    Article Metrics

    • Downloads (last 12 months): 137
    • Downloads (last 6 weeks): 36
    Reflects downloads up to 08 Feb 2025

    Cited By

    • (2024) What Kinds of Facial Self-Touches Strengthen Expressed Emotions? 2024 33rd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), 446-452. DOI: 10.1109/RO-MAN60168.2024.10731424. Online: 26-Aug-2024.
    • (2023) Designing Emotional Expressions of Autonomous Vehicles for Communication with Pedestrians in Urban Shared Spaces: Use Cases, Modalities, and Considerations. Proceedings of the 35th Australian Computer-Human Interaction Conference, 454-461. DOI: 10.1145/3638380.3638408. Online: 2-Dec-2023.
    • (2023) My Eyes Speak: Improving Perceived Sociability of Autonomous Vehicles in Shared Spaces Through Emotional Robotic Eyes. Proceedings of the ACM on Human-Computer Interaction 7(MHCI), 1-30. DOI: 10.1145/3604261. Online: 13-Sep-2023.
    • (2023) Modeling Adaptive Expression of Robot Learning Engagement and Exploring Its Effects on Human Teachers. ACM Transactions on Computer-Human Interaction 30(5), 1-48. DOI: 10.1145/3571813. Online: 23-Sep-2023.
    • (2023) Investigating the Role of Multi-modal Social Cues in Human-Robot Collaboration in Industrial Settings. International Journal of Social Robotics 15(7), 1169-1179. DOI: 10.1007/s12369-023-01018-9. Online: 11-Jun-2023.
    • (2023) How Should Your Assistive Robot Look Like? A Scoping Review on Embodiment for Assistive Robots. Journal of Intelligent & Robotic Systems 107(1). DOI: 10.1007/s10846-022-01781-3. Online: 16-Jan-2023.
    • (2022) Alexa Feels Blue And so Do I? Conversational Agents Displaying Emotions via Light Modalities. 2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), 415-420. DOI: 10.1109/RO-MAN53752.2022.9900838. Online: 29-Aug-2022.
    • (2022) Analysis of impressions of robot by changing its motion and trajectory parameters for designing parameterized behaviors of home-service robots. Intelligent Service Robotics. DOI: 10.1007/s11370-022-00447-1. Online: 8-Nov-2022.
    • (2021) How Can Autonomous Vehicles Convey Emotions to Pedestrians? A Review of Emotionally Expressive Non-Humanoid Robots. Multimodal Technologies and Interaction 5(12), 84. DOI: 10.3390/mti5120084. Online: 20-Dec-2021.
    • (2021) Impression Evaluation of Interaction Among Robots based on Collision Avoidance Behaviors. Proceedings of the 9th International Conference on Human-Agent Interaction, 272-276. DOI: 10.1145/3472307.3484659. Online: 9-Nov-2021.
