DOI: 10.1007/978-3-031-24667-8_23

When to Help? A Multimodal Architecture for Recognizing When a User Needs Help from a Social Robot

Published: 01 February 2023

Abstract

Socially assistive robots must be able to recognize when a user needs and wants help, and to do so in real time so that assistance is timely. We propose an architecture that uses social cues to determine when a robot should provide assistance. Based on a multimodal fusion of eye-gaze and language modalities, the architecture is trained and evaluated on data collected in a robot-assisted Lego building task. Because it relies on social cues rather than task-specific signals, the architecture has minimal dependencies on the specifics of a given task and can be applied in many different contexts. Enabling a social robot to recognize a user’s needs through social cues helps it adapt to user behaviors and preferences, which in turn leads to improved user experiences.
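The chapter's implementation is not reproduced on this page. Purely as an illustrative sketch of the kind of gaze-and-language fusion the abstract describes, the Python snippet below wires up a minimal late-fusion help/no-help classifier in PyTorch. The module names, feature dimensions, and concatenation-based fusion are assumptions for illustration, not the authors' published architecture.

# Minimal sketch only: all names, dimensions, and the fusion strategy are
# assumptions; the paper does not publish this code.
import torch
import torch.nn as nn

class HelpDetector(nn.Module):
    """Late-fusion binary classifier over gaze and language features.

    Assumed inputs per time window:
      gaze_feats: eye-gaze statistics (e.g., fixation counts, proportion of
                  time looking at the robot vs. the task), shape (B, 8)
      lang_feats: an utterance embedding from any sentence encoder, shape (B, 128)
    """
    def __init__(self, gaze_dim: int = 8, lang_dim: int = 128, hidden: int = 64):
        super().__init__()
        self.gaze_net = nn.Sequential(nn.Linear(gaze_dim, hidden), nn.ReLU())
        self.lang_net = nn.Sequential(nn.Linear(lang_dim, hidden), nn.ReLU())
        # Fuse by concatenating the two unimodal encodings, then classify.
        self.head = nn.Linear(2 * hidden, 2)  # logits: [no-help, help]

    def forward(self, gaze_feats: torch.Tensor, lang_feats: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.gaze_net(gaze_feats), self.lang_net(lang_feats)], dim=-1)
        return self.head(fused)

# Toy usage: one 3-sample batch of random features.
model = HelpDetector()
logits = model(torch.randn(3, 8), torch.randn(3, 128))
print(logits.softmax(dim=-1))  # per-window P(no-help) / P(help)

In a deployed system the gaze features would presumably be computed over a sliding window from a face/gaze tracker and the help probability thresholded before triggering an assistive behavior; those operational details are likewise assumptions here.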

Cited By

  • (2024) A Survey of Multimodal Perception Methods for Human–Robot Interaction in Social Environments. ACM Transactions on Human-Robot Interaction 13(4), 1–50. DOI: 10.1145/3657030. Online publication date: 29-Apr-2024.
  • (2024) Software Architecture to Generate Assistive Behaviors for Social Robots. Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 1119–1123. DOI: 10.1145/3610978.3640715. Online publication date: 11-Mar-2024.

Published In

Social Robotics: 14th International Conference, ICSR 2022, Florence, Italy, December 13–16, 2022, Proceedings, Part I
Dec 2022, 606 pages
ISBN: 978-3-031-24666-1
DOI: 10.1007/978-3-031-24667-8

Publisher

Springer-Verlag, Berlin, Heidelberg
