DOI: 10.1145/3025453.3025774

Sketching CuddleBits: Coupled Prototyping of Body and Behaviour for an Affective Robot Pet

Published: 02 May 2017

Abstract

Social robots that physically display emotion invite natural communication with their human interlocutors, enabling applications like robot-assisted therapy where a complex robot's breathing influences human emotional and physiological state. Using DIY fabrication and assembly, we explore how simple 1-DOF robots can express affect with economy and user customizability, leveraging open-source designs.
We developed low-cost techniques for coupled iteration of a simple robot's body and behaviour, and evaluated its potential to display emotion. Through two user studies, we (1) validated these CuddleBits' ability to express emotions (N=20); (2) sourced a corpus of 72 robot emotion behaviours from participants (N=10); and (3) analyzed it to link underlying parameters to emotional perception (N=14).
We found that CuddleBits can express arousal (activation), and to a lesser degree valence (pleasantness). We also show how a sketch-refine paradigm combined with DIY fabrication and novel input methods enable parametric design of physical emotion display, and discuss how mastering this parsimonious case can give insight into layering simple behaviours in more complex robots.
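To give a concrete sense of what "parametric design" of a 1-DOF emotion behaviour can mean, the sketch below is a purely illustrative assumption, not the paper's implementation: it generates a single actuator waveform from two hypothetical high-level parameters, with frequency and amplitude loosely standing in for arousal and waveform smoothness for valence. The function name, parameter ranges, and mappings are all invented for illustration.

```python
import numpy as np

def behaviour_waveform(arousal, valence, duration_s=5.0, rate_hz=50):
    """Hypothetical 1-DOF behaviour generator (illustrative only).

    arousal in [0, 1]: mapped to oscillation frequency and stroke depth.
    valence in [0, 1]: mapped to waveform smoothness (smooth sine vs. sharper pulses).
    Returns an array of normalized actuator commands in [0, 1], sampled at rate_hz.
    """
    t = np.linspace(0.0, duration_s, int(duration_s * rate_hz), endpoint=False)
    freq = 0.2 + 2.0 * arousal                    # cycles per second (assumed range)
    amp = 0.3 + 0.7 * arousal                     # stroke depth (assumed range)
    base = np.sin(2 * np.pi * freq * t)           # smooth, "relaxed" oscillation
    sharp = np.sign(base) * np.abs(base) ** 0.3   # spikier, "tense" variant
    wave = valence * base + (1.0 - valence) * sharp
    return 0.5 + 0.5 * amp * wave                 # normalize to a [0, 1] motor command

# Example: a calm, pleasant "breathing" pattern vs. an agitated one
calm = behaviour_waveform(arousal=0.2, valence=0.9)
agitated = behaviour_waveform(arousal=0.9, valence=0.2)
```

In practice such a waveform would drive the robot's single motor; the specific mappings here are placeholders chosen only to illustrate how a small set of behaviour parameters could be tied to perceived arousal and valence.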

Supplementary Material

• ZIP File (pn2576-file4.zip)
• Supplemental video (pn2576-file3.mp4)
• Supplemental video (pn2576p.mp4)
• MP4 File (p3681-bucci.mp4)


    Published In

    CHI '17: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems
    May 2017
    7138 pages
    ISBN: 9781450346559
    DOI: 10.1145/3025453

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. affective computing
    2. do-it-yourself (DIY)
    3. haptics
    4. human-robot interaction (HRI)
    5. physical prototyping

    Qualifiers

    • Research-article

    Funding Sources

    • NSERC

    Conference

    CHI '17

    Acceptance Rates

    CHI '17 Paper Acceptance Rate: 600 of 2,400 submissions, 25%
    Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%


