DOI: 10.1145/2909824.3020216
Research Article | Public Access
Using Facially Expressive Robots to Calibrate Clinical Pain Perception

Published: 06 March 2017

Abstract

In this paper, we introduce a novel application of social robotics in healthcare: high-fidelity, facially expressive robotic patient simulators (RPSs), and explore their use within a clinical experimental context. Current commercially available RPSs, the most commonly used humanoid robots worldwide, are substantially limited in their usability and fidelity because they lack one of the most important clinical interaction and diagnostic tools: an expressive face. Using autonomous facial synthesis techniques, we synthesized pain on both a humanoid robot and a comparable virtual avatar. We conducted an experiment with 51 clinicians and 51 laypersons (n = 102) to explore differences in pain perception across the two groups, and to explore the effects of embodiment (robot or avatar) on pain perception. Our results suggest that clinicians are less accurate overall at detecting synthesized pain than lay participants. We also found that all participants are less accurate at detecting pain from a humanoid robot than from a comparable virtual avatar, lending support to other recent findings in the HRI community. This research ultimately reveals new insights into the use of RPSs as a training tool for calibrating clinicians' pain detection skills.
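The study's central comparison is a 2x2 design: participant group (clinician vs. layperson) crossed with embodiment (robot vs. avatar), with pain-detection accuracy as the outcome. The sketch below illustrates how per-cell accuracy could be tabulated from trial records; the data values and function name are illustrative placeholders, not the paper's actual data or analysis code.

```python
# Hypothetical sketch of the 2x2 accuracy comparison described in the
# abstract (clinician vs. layperson, robot vs. avatar). The trial data
# below are illustrative placeholders, not the study's results.
from collections import defaultdict

# Each trial record: (group, embodiment, detected_pain_correctly)
trials = [
    ("clinician", "robot", True),
    ("clinician", "avatar", True),
    ("layperson", "robot", False),
    ("layperson", "avatar", True),
    ("clinician", "robot", False),
    ("layperson", "robot", True),
]

def accuracy_by_cell(trials):
    """Return mean detection accuracy for each (group, embodiment) cell."""
    counts = defaultdict(lambda: [0, 0])  # cell -> [correct, total]
    for group, embodiment, correct in trials:
        cell = (group, embodiment)
        counts[cell][0] += int(correct)
        counts[cell][1] += 1
    return {cell: c / t for cell, (c, t) in counts.items()}

print(accuracy_by_cell(trials))
```

Comparing the marginal means across rows (groups) and columns (embodiments) of such a table corresponds to the two effects the paper reports: lower clinician accuracy overall, and lower accuracy for the robot embodiment.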



Published In

HRI '17: Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction
March 2017, 510 pages
ISBN: 9781450343367
DOI: 10.1145/2909824

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. affective communication
  2. emotion
  3. facial expressions
  4. health informatics
  5. human robot interaction
  6. humanoid robots
  7. medical technologies
  8. non-verbal communication
  9. patient simulation
  10. social robots

Qualifiers

  • Research-article

Acceptance Rates

HRI '17 Paper Acceptance Rate: 51 of 211 submissions, 24%
Overall Acceptance Rate: 268 of 1,124 submissions, 24%



Cited By

  • (2024) The Evolution From Standardized to Virtual Patients in Medical Education. Cureus. DOI: 10.7759/cureus.71224. Online publication date: 10 Oct 2024.
  • (2023) Artificial intelligence technologies and compassion in healthcare: A systematic scoping review. Frontiers in Psychology, 13. DOI: 10.3389/fpsyg.2022.971044. Online publication date: 17 Jan 2023.
  • (2023) Not All Robots are Evaluated Equally: The Impact of Morphological Features on Robots' Assessment through Capability Attributions. ACM Transactions on Human-Robot Interaction, 12(1), 1-31. DOI: 10.1145/3549532. Online publication date: 15 Feb 2023.
  • (2023) Conception of a Humanoid-Robot-Patient in Education to Train and Practice. 2023 IEEE 2nd German Education Conference (GECon), 1-5. DOI: 10.1109/GECon58119.2023.10295118. Online publication date: 2 Aug 2023.
  • (2022) Non-Dyadic Interaction: A Literature Review of 15 Years of Human-Robot Interaction Conference Publications. ACM Transactions on Human-Robot Interaction, 11(2), 1-32. DOI: 10.1145/3488242. Online publication date: 8 Feb 2022.
  • (2022) Facial Expression Modeling and Synthesis for Patient Simulator Systems: Past, Present, and Future. ACM Transactions on Computing for Healthcare, 3(2), 1-32. DOI: 10.1145/3483598. Online publication date: 3 Mar 2022.
  • (2022) The biopsychosociotechnical model: a systems-based framework for human-centered health improvement. Health Systems, 12(4), 387-407. DOI: 10.1080/20476965.2022.2029584. Online publication date: 30 Jan 2022.
  • (2022) More than surgical tools: a systematic review of robots as didactic tools for the education of professionals in health sciences. Advances in Health Sciences Education, 27(4), 1139-1176. DOI: 10.1007/s10459-022-10118-6. Online publication date: 30 Jun 2022.
  • (2021) Survey of Emotions in Human-Robot Interactions: Perspectives from Robotic Psychology on 20 Years of Research. International Journal of Social Robotics, 14(2), 389-411. DOI: 10.1007/s12369-021-00778-6. Online publication date: 4 Jun 2021.
  • (2020) Home-Based Cognitively Assistive Robots: Maximizing Cognitive Functioning and Maintaining Independence in Older Adults Without Dementia. Clinical Interventions in Aging, 15, 1129-1139. DOI: 10.2147/CIA.S253236. Online publication date: Jul 2020.
