
Continual Prompt Tuning for Dialog State Tracking

Qi Zhu, Bing Li, Fei Mi, Xiaoyan Zhu, Minlie Huang


Abstract
A desirable dialog system should be able to continually learn new skills without forgetting old ones, and thereby adapt to new domains or tasks in its life cycle. However, continually training a model often leads to a well-known catastrophic forgetting issue. In this paper, we present Continual Prompt Tuning, a parameter-efficient framework that not only avoids forgetting but also enables knowledge transfer between tasks. To avoid forgetting, we only learn and store a few prompt tokens’ embeddings for each task while freezing the backbone pre-trained model. To achieve bi-directional knowledge transfer among tasks, we propose several techniques (continual prompt initialization, query fusion, and memory replay) to transfer knowledge from preceding tasks and a memory-guided technique to transfer knowledge from subsequent tasks. Extensive experiments demonstrate the effectiveness and efficiency of our proposed method on continual learning for dialog state tracking, compared with state-of-the-art baselines.
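To make the core mechanism in the abstract concrete, here is a minimal, hypothetical PyTorch sketch of per-task soft-prompt tuning with a frozen backbone: one small set of trainable prompt embeddings is learned per task and prepended to the input embeddings, while the pre-trained model stays fixed; initializing a new task's prompt from a previous one loosely mirrors the paper's continual prompt initialization. The t5-small backbone, prompt length, learning rate, and example strings are illustrative assumptions, not the authors' configuration (see thu-coai/cpt4dst for the official code).

```python
# Hypothetical sketch of continual prompt tuning (not the authors' code).
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

model = T5ForConditionalGeneration.from_pretrained("t5-small")
tokenizer = T5Tokenizer.from_pretrained("t5-small")

for p in model.parameters():  # freeze the backbone pre-trained model
    p.requires_grad = False

PROMPT_LEN = 20  # assumed number of prompt tokens per task
embed = model.get_input_embeddings()

def new_task_prompt(init=None):
    """One trainable soft prompt per task; optionally initialized from a
    preceding task's prompt (cf. continual prompt initialization)."""
    if init is not None:
        return torch.nn.Parameter(init.detach().clone())
    return torch.nn.Parameter(
        torch.randn(PROMPT_LEN, embed.embedding_dim) * 0.02)

def forward_with_prompt(prompt, input_ids, labels):
    tok_emb = embed(input_ids)                               # (B, T, H)
    prompt_emb = prompt.unsqueeze(0).expand(tok_emb.size(0), -1, -1)
    inputs_embeds = torch.cat([prompt_emb, tok_emb], dim=1)  # prepend prompt
    return model(inputs_embeds=inputs_embeds, labels=labels)

# Train only the current task's prompt; storing each task's prompt after
# training means earlier tasks are never overwritten (no forgetting).
prompt = new_task_prompt()
optimizer = torch.optim.Adam([prompt], lr=0.3)  # assumed learning rate
batch = tokenizer(["belief state: hotel area ?"], return_tensors="pt")
labels = tokenizer(["centre"], return_tensors="pt").input_ids
loss = forward_with_prompt(prompt, batch.input_ids, labels).loss
loss.backward()   # gradients reach only the prompt parameters
optimizer.step()
```

Because only the prompt embeddings (PROMPT_LEN × hidden size per task) are learned and stored, the approach is parameter-efficient, and keeping each task's prompt separate sidesteps catastrophic forgetting by construction.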
Anthology ID:
2022.acl-long.80
Original:
2022.acl-long.80v1
Version 2:
2022.acl-long.80v2
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1124–1137
URL:
https://aclanthology.org/2022.acl-long.80
DOI:
10.18653/v1/2022.acl-long.80
Cite (ACL):
Qi Zhu, Bing Li, Fei Mi, Xiaoyan Zhu, and Minlie Huang. 2022. Continual Prompt Tuning for Dialog State Tracking. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1124–1137, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Continual Prompt Tuning for Dialog State Tracking (Zhu et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.80.pdf
Software:
2022.acl-long.80.software.zip
Code:
thu-coai/cpt4dst