
Global-Local Modeling with Prompt-Based Knowledge Enhancement for Emotion Inference in Conversation

Renxi Wang, Shi Feng


Abstract
The ability to recognize emotions in conversations is necessary and important for online chatbots performing tasks such as empathetic response generation and emotional support. Existing research mainly focuses on recognizing emotions from a speaker’s own utterance, whereas emotion inference predicts the emotions of addressees from the previous utterances alone. Because the addressee’s utterance is unavailable, emotion inference is more challenging than emotion recognition. In this paper, we propose a global-local modeling method for emotion inference based on recurrent neural networks (RNNs) and pre-trained language models (PLMs), which exploits the sequence modeling ability of RNNs and the abundant knowledge in PLMs. Moreover, we feed the whole dialogue history to the PLM to generate knowledge by in-context learning. Experimental results show that our model with knowledge enhancement achieves state-of-the-art performance on all three datasets.
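The abstract describes the approach only at a high level. The sketch below is a minimal, hypothetical illustration of how a global-local model with prompt-generated knowledge might be wired together in PyTorch; the module names, dimensions, and fusion strategy are assumptions for illustration, not the authors' implementation (see the paper PDF linked below for the actual method).

```python
# Hedged sketch: a minimal global-local emotion-inference model, NOT the authors' code.
# Assumptions: each utterance is already encoded into a fixed-size vector (e.g. by a PLM),
# and "knowledge" is text generated by prompting a PLM with the whole dialogue history,
# then encoded into the same vector space.
import torch
import torch.nn as nn

class GlobalLocalEmotionInference(nn.Module):
    def __init__(self, hidden_dim: int = 768, num_emotions: int = 7):
        super().__init__()
        # Global view: a GRU runs over the whole dialogue history.
        self.gru = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        # Fusion: global state + most recent (local) utterance + knowledge vector.
        self.classifier = nn.Linear(3 * hidden_dim, num_emotions)

    def forward(self, utterance_vecs: torch.Tensor, knowledge_vec: torch.Tensor) -> torch.Tensor:
        # utterance_vecs: (batch, num_turns, hidden_dim) PLM embeddings of each turn
        # knowledge_vec:  (batch, hidden_dim) embedding of PLM-generated knowledge
        _, global_state = self.gru(utterance_vecs)      # (1, batch, hidden_dim)
        global_state = global_state.squeeze(0)          # (batch, hidden_dim)
        local_state = utterance_vecs[:, -1, :]          # most recent turn
        fused = torch.cat([global_state, local_state, knowledge_vec], dim=-1)
        return self.classifier(fused)                   # logits over emotion labels

# Toy usage with random vectors standing in for PLM encodings.
model = GlobalLocalEmotionInference()
turns = torch.randn(2, 5, 768)    # 2 dialogues, 5 turns each
knowledge = torch.randn(2, 768)   # encoded knowledge from in-context prompting
logits = model(turns, knowledge)  # shape: (2, 7)
```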
Anthology ID:
2023.findings-eacl.158
Volume:
Findings of the Association for Computational Linguistics: EACL 2023
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
Findings
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
2120–2127
Language:
URL:
https://aclanthology.org/2023.findings-eacl.158/
DOI:
10.18653/v1/2023.findings-eacl.158
Bibkey:
Cite (ACL):
Renxi Wang and Shi Feng. 2023. Global-Local Modeling with Prompt-Based Knowledge Enhancement for Emotion Inference in Conversation. In Findings of the Association for Computational Linguistics: EACL 2023, pages 2120–2127, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Global-Local Modeling with Prompt-Based Knowledge Enhancement for Emotion Inference in Conversation (Wang & Feng, Findings 2023)
PDF:
https://aclanthology.org/2023.findings-eacl.158.pdf
Dataset:
 2023.findings-eacl.158.dataset.zip
Video:
 https://aclanthology.org/2023.findings-eacl.158.mp4