
Distilling the Knowledge of Large-scale Generative Models into Retrieval Models for Efficient Open-domain Conversation

Beomsu Kim, Seokjun Seo, Seungju Han, Enkhbayar Erdenee, Buru Chang


Abstract
Despite the remarkable performance of large-scale generative models in open-domain conversation, they are known to be less practical for building real-time conversation systems due to high latency. On the other hand, retrieval models can return responses with much lower latency but show inferior performance to the large-scale generative models, since conversation quality is bounded by the pre-defined response set. To take advantage of both approaches, we propose a new training method called G2R (Generative-to-Retrieval distillation) that preserves the efficiency of a retrieval model while leveraging the conversational ability of a large-scale generative model by infusing the knowledge of the generative model into the retrieval model. G2R consists of two distinct distillation techniques: data-level G2R augments the dialogue dataset with additional responses generated by the large-scale generative model, and model-level G2R transfers the response quality scores assessed by the generative model into the scores of the retrieval model via a knowledge distillation loss. Through extensive experiments, including human evaluation, we demonstrate that our retrieval-based conversation system trained with G2R shows substantially improved performance compared to the baseline retrieval model while exhibiting significantly lower inference latency than the large-scale generative models.
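As a rough illustration of the two components described in the abstract, the sketch below shows one plausible way they could be written in PyTorch. All names here (the score tensors, generator.generate, the dataset format) are hypothetical placeholders chosen for clarity, not the authors' implementation; the paper and the linked hyperconnect/g2r repository are the authoritative references.

```python
# Minimal sketch of the two G2R ideas, assuming a PyTorch setup.
# This is an illustrative approximation, not the paper's actual code.
import torch
import torch.nn.functional as F


def model_level_g2r_loss(retrieval_scores, generative_scores, temperature=2.0):
    """Model-level G2R (sketch): a knowledge-distillation loss that pushes the
    retrieval model's distribution over candidate responses toward the
    (softened) quality scores assigned by the large-scale generative model.

    retrieval_scores, generative_scores: tensors of shape [batch, num_candidates]
    """
    teacher = F.softmax(generative_scores / temperature, dim=-1)
    student_log = F.log_softmax(retrieval_scores / temperature, dim=-1)
    # KL divergence between teacher and student distributions,
    # scaled by T^2 as is standard for temperature-based distillation.
    return F.kl_div(student_log, teacher, reduction="batchmean") * temperature ** 2


def data_level_g2r(dialogue_dataset, generator, num_augment=1):
    """Data-level G2R (sketch): augment each dialogue context with extra
    responses sampled from the large-scale generative model.

    `generator.generate(context)` is a placeholder for whatever decoding
    interface the generative model exposes.
    """
    augmented = []
    for context, response in dialogue_dataset:
        augmented.append((context, response))            # keep the original pair
        for _ in range(num_augment):
            augmented.append((context, generator.generate(context)))
    return augmented
```

In this reading, data-level G2R enlarges the response set the retrieval model can learn from, while model-level G2R shapes how the retrieval model ranks candidates, so the two losses/augmentations can be combined during training.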
Anthology ID: 2021.findings-emnlp.286
Volume: Findings of the Association for Computational Linguistics: EMNLP 2021
Month: November
Year: 2021
Address: Punta Cana, Dominican Republic
Editors: Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue: Findings
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 3357–3373
URL: https://aclanthology.org/2021.findings-emnlp.286
DOI: 10.18653/v1/2021.findings-emnlp.286
Cite (ACL): Beomsu Kim, Seokjun Seo, Seungju Han, Enkhbayar Erdenee, and Buru Chang. 2021. Distilling the Knowledge of Large-scale Generative Models into Retrieval Models for Efficient Open-domain Conversation. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 3357–3373, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal): Distilling the Knowledge of Large-scale Generative Models into Retrieval Models for Efficient Open-domain Conversation (Kim et al., Findings 2021)
PDF: https://aclanthology.org/2021.findings-emnlp.286.pdf
Video: https://aclanthology.org/2021.findings-emnlp.286.mp4
Code: hyperconnect/g2r