DOI: 10.1145/3411764.3445133
Research Article

Designing a Conversational Agent for Sexual Assault Survivors: Defining Burden of Self-Disclosure and Envisioning Survivor-Centered Solutions

Published: 07 May 2021

Abstract

Sexual assault survivors hesitate to disclose their stories to others and even avoid reporting their cases for psychological, social, and cultural reasons. Conversational agents (CAs) have therefore gained much attention as potential counselors, because CAs’ characteristics (e.g., anonymity) could mitigate various difficulties of human-human interaction (HHI). Despite this potential, designing a CA for survivors is difficult because many aspects must be considered. In particular, with traditional HCI approaches alone (e.g., need-finding and usability tests), designers can easily miss the psychological and subjective burdens that survivors feel toward a new system. Hence, while envisioning a burden-free CA for survivors, we took an agile approach to designing and implementing an initial prototype CA (NamuBot) together with professionals (police officers and counselors). We then conducted a qualitative user study to identify the burdens imposed by the CA and compare them with those imposed by humans. Lastly, with 36 participants (19 survivors and 17 professionals), we co-designed features that could reduce the CA-specific burdens. Notably, our findings showed that 17 of the survivors preferred reporting their cases to NamuBot over reporting to humans, expressing far less burden. Although CAs can also place burdens on survivors, these burdens could be alleviated by the features that the survivors and professionals designed. Finally, we present design implications and strategies for developing burden-mitigating CAs for survivors.

      Published In

      CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems
      May 2021
      10,862 pages
      ISBN: 9781450380966
      DOI: 10.1145/3411764
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 07 May 2021

      Author Tags

      1. Chatbot
      2. Conversational AI
      3. Conversational Agent (CA)
      4. Self-Disclosure Burden
      5. Sexual Assault Survivors

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Conference

      CHI '21

      Acceptance Rates

      Overall Acceptance Rate 6,199 of 26,314 submissions, 24%

      Cited By

      • (2024) Integrating AI in Psychotherapy: An Investigation of Trust in Voicebot Therapists. Proceedings of the 13th Nordic Conference on Human-Computer Interaction, 1-9. https://doi.org/10.1145/3679318.3685353. Online publication date: 13-Oct-2024.
      • (2024) “I Don't Want to be Pitied by a Bot”: Understand How to Design Chatbots to Support People Being Ghosted on Dating Applications. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1-8. https://doi.org/10.1145/3613905.3650957. Online publication date: 11-May-2024.
      • (2024) Re-examining User Burden in Human-AI Interaction: Focusing on a Domain-Specific Approach. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1-4. https://doi.org/10.1145/3613905.3638186. Online publication date: 11-May-2024.
      • (2024) The Promise and Peril of ChatGPT in Higher Education: Opportunities, Challenges, and Design Implications. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-21. https://doi.org/10.1145/3613904.3642785. Online publication date: 11-May-2024.
      • (2024) Understanding the Impact of Long-Term Memory on Self-Disclosure with Large Language Model-Driven Chatbots for Public Health Intervention. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-21. https://doi.org/10.1145/3613904.3642420. Online publication date: 11-May-2024.
      • (2024) “It’s Not What We Were Trying to Get At, but I Think Maybe It Should Be”: Learning How to Do Trauma-Informed Design with a Data Donation Platform for Online Dating Sexual Violence. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-15. https://doi.org/10.1145/3613904.3642045. Online publication date: 11-May-2024.
      • (2024) A Bi-Lingual Counselling Chatbot Application for Support of Gender Based Violence Victims in Kenya. 2024 5th International Conference on Smart Sensors and Application (ICSSA), 1-6. https://doi.org/10.1109/ICSSA62312.2024.10788649. Online publication date: 10-Sep-2024.
      • (2024) AyudaMujer: A Mobile Application for the Treatment of Violence Against Women in Peru. Journal of Technology in Human Services 42(2), 85-103. https://doi.org/10.1080/15228835.2024.2343694. Online publication date: 23-Apr-2024.
      • (2024) Informing the design of question-asking conversational agents for reflection. Personal and Ubiquitous Computing 28(6), 1001-1019. https://doi.org/10.1007/s00779-024-01831-7. Online publication date: 1-Dec-2024.
      • (2023) On the Potential of Mediation Chatbots for Mitigating Multiparty Privacy Conflicts - A Wizard-of-Oz Study. Proceedings of the ACM on Human-Computer Interaction 7(CSCW1), 1-33. https://doi.org/10.1145/3579618. Online publication date: 16-Apr-2023.
