DOI: 10.1145/3570945.3607326
Research article

IAVA: Interactive and Adaptive Virtual Agent

Published: 22 December 2023

Abstract

During an interaction, partners adapt their behaviors to each other. Adaptation serves several functions, such as signaling engagement and enhancing human users' interaction experience. Virtual agents acting as interaction partners should therefore continuously adapt their behaviors to those of their interlocutors in real time. This paper focuses on creating an interactive virtual agent capable of rendering real-time adaptive behaviors in response to its human interlocutor. The system addresses two aspects: generating real-time adaptive behavior and managing natural dialogue. We propose an adaptive virtual agent system and choose the e-health application of Cognitive Behavioral Therapy (CBT), a mental health treatment that restructures automatic thoughts into balanced thoughts, as a proof of concept to showcase the benefit of endowing the agent with behavior adaptation. Throughout the interaction, the virtual agent adapts to the user through the display of nonverbal behaviors, generated by a deep learning model, while acting as a therapist that helps human users detect their negative automatic thoughts.
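The perceive-adapt-render loop described above (sense the user's nonverbal signals each frame, feed them to a behavior model, render the agent's adapted response) can be caricatured with a toy sketch. Everything below is illustrative: the feature names (`smile`, `head_nod`) and the sliding-window averaging "model" are placeholders standing in for the paper's deep learning model, not the authors' actual architecture.

```python
from collections import deque


class AdaptiveBehaviorGenerator:
    """Toy stand-in for a real-time adaptive behavior model: it smooths
    the user's recent nonverbal features over a sliding window and
    mirrors them as the agent's next behavior."""

    def __init__(self, window=5):
        # Sliding window of the user's most recent feature frames
        self.history = deque(maxlen=window)

    def step(self, user_features):
        """Ingest one frame of user features, return the agent's
        adapted behavior for the next frame."""
        self.history.append(user_features)
        # Average each feature over the window: a crude form of adaptation,
        # so the agent's expression drifts toward the user's over time.
        return {
            key: sum(frame[key] for frame in self.history) / len(self.history)
            for key in user_features
        }


# Simulated interaction: the user's smile intensity ramps up over 3 frames.
agent = AdaptiveBehaviorGenerator(window=3)
behaviors = [agent.step({"smile": s, "head_nod": 0.2}) for s in (0.0, 0.6, 0.9)]
print(behaviors[-1]["smile"])  # mean of the windowed smile values, ~0.5
```

In the actual system, the `step` input would come from real-time feature extractors (e.g., facial action units and audio features) and the output would drive the agent's animation each frame; the averaging here is only a minimal placeholder for that learned mapping.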

Cited By

  • (2024) Adaptive virtual agent: Design and evaluation for real-time human-agent interaction. International Journal of Human-Computer Studies 190, Article 103321 (Oct 2024). https://doi.org/10.1016/j.ijhcs.2024.103321

Published In

IVA '23: Proceedings of the 23rd ACM International Conference on Intelligent Virtual Agents
September 2023
376 pages
ISBN:9781450399944
DOI:10.1145/3570945

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Virtual agent
  2. adaptation
  3. real-time system

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • ANR-JST-CREST TAPAS
  • IA ANR-DFG-JST Panorama

Conference

IVA '23

Acceptance Rates

Overall Acceptance Rate 53 of 196 submissions, 27%
