DOI: 10.1145/3546155.3546665
NordiCHI Conference Proceedings · Research article

Effects of Humanlikeness and Conversational Breakdown on Trust in Chatbots for Customer Service

Published: 08 October 2022

Abstract

Trust in chatbots can be shaped by various factors, such as humanlikeness in terms of visual appearance and conversational content, and conversational performance in terms of the chatbot’s ability to avoid conversational breakdown. The literature is inconclusive about the effects of humanlikeness and conversational performance on trust, and especially about their interaction. To examine the relations among these variables, we conducted a 2×3 (humanlikeness × conversational performance) factorial experiment with 251 participants, who performed three tasks with a chatbot for an online bank under one of the six conditions. Participants completed a questionnaire measuring trust and commented on factors affecting their trust. Between-group analysis showed that, for the task with seeded breakdowns, trust differed significantly across the six groups, with the lowest ratings for the two groups experiencing breakdowns without repair; humanlikeness did not affect the extent to which the trust level changed. Within-group analysis showed significant differences in trust across the tasks but non-significant inter-task correlations on trust for these two groups. These observations challenge the assumed effect of humanlikeness on trust while supporting the notion of trust resilience, as participants did not carry the impaired trust over to the subsequent task. Thematic analysis showed inter-group contrasts for the themes ‘underlying functionality’ and ‘affective responses.’ Implications for research, practice and future work are drawn.
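The between-group comparison described in the abstract can be illustrated with a minimal sketch: a one-way ANOVA on trust ratings across the six conditions of the 2×3 design. Everything below is an assumption for illustration only — the cell means, the 7-point trust scale, the standard deviation, and the per-cell sample size are not taken from the study.

```python
# Illustrative sketch (NOT study data): one-way ANOVA across the six
# experimental conditions of a 2x3 (humanlikeness x conversational
# performance) between-group design. All numbers are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Six conditions: 2 humanlikeness levels x 3 performance levels
# (no breakdown, breakdown with repair, breakdown without repair).
cell_means = {
    ("humanlike", "no_breakdown"): 5.5,
    ("humanlike", "repair"): 5.2,
    ("humanlike", "no_repair"): 4.0,    # assumed lowest-trust cells:
    ("machinelike", "no_breakdown"): 5.4,
    ("machinelike", "repair"): 5.1,
    ("machinelike", "no_repair"): 4.1,  # breakdowns without repair
}

# ~42 participants per cell (the reported study had 251 in total);
# ratings clipped to an assumed 1-7 trust scale.
samples = {
    cond: np.clip(rng.normal(mean, 0.8, size=42), 1, 7)
    for cond, mean in cell_means.items()
}

# One-way ANOVA on trust ratings, computed directly from sums of squares.
groups = list(samples.values())
all_obs = np.concatenate(groups)
grand_mean = all_obs.mean()
k, n = len(groups), all_obs.size
ss_between = sum(g.size * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
print(f"F({k - 1}, {n - k}) = {f_stat:.2f}")
```

With the assumed means, the two no-repair cells pull the F statistic well above conventional significance thresholds, mirroring the pattern the abstract reports.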


Cited By

  • (2024) Artificial intelligence features and expectation confirmation theory in digital banking apps: Gen Y and Z perspective. Management Decision. https://doi.org/10.1108/MD-07-2023-1145 (online: 5 July 2024)
  • (2023) A review: transformation of customer service with customer relationship management (CRM) based chatbots. 2023 International Workshop on Artificial Intelligence and Image Processing (IWAIIP), 225–230. https://doi.org/10.1109/IWAIIP58158.2023.10462818 (online: 1 December 2023)


    Published In

    NordiCHI '22: Nordic Human-Computer Interaction Conference
    October 2022
    1091 pages
    ISBN:9781450396998
    DOI:10.1145/3546155

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. breakdown
    2. chatbot
    3. conversational performance
    4. human-likeness
    5. repair
    6. trust

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    NordiCHI '22

    Acceptance Rates

    Overall Acceptance Rate 379 of 1,572 submissions, 24%

