DOI: 10.1007/978-981-99-7254-8_55
Article

Knowledge-Grounded Dialogue Generation with Contrastive Knowledge Selection

Published: 25 October 2023

Abstract

Knowledge selection is the key component of knowledge-grounded dialogue: it aims to choose the correct knowledge from an external knowledge source for dialogue generation. The quality of knowledge selection depends on the knowledge representation method, yet learning effective knowledge representations remains challenging. We propose a knowledge-grounded dialogue model that incorporates a knowledge selection module and a dialogue generation module, aiming to choose the most appropriate knowledge and fuse it into response generation. In addition, a supervised contrastive learning signal is designed to obtain knowledge representations dynamically. Experiments on the FoCus dataset show that our model outperforms the baseline models, and the ablation study further demonstrates the effectiveness of each sub-module.
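The supervised contrastive signal mentioned above is not spelled out in this abstract; a common formulation is the supervised contrastive loss of Khosla et al. (2020), where knowledge candidates sharing a label (e.g. "selected" vs. "not selected") are pulled together in embedding space and all others pushed apart. The following is a minimal numpy sketch of that generic objective, not the paper's exact implementation; the function name, temperature value, and labeling scheme are illustrative assumptions.

```python
import numpy as np

def sup_con_loss(embeddings, labels, tau=0.1):
    """Supervised contrastive loss over a batch of candidate embeddings.

    For each anchor i, candidates with the same label are positives;
    the loss is the mean negative log-likelihood of each positive
    against all other candidates, at temperature tau.
    """
    # L2-normalize so dot products are cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = (z @ z.T) / tau
    n = len(labels)
    loss, anchors = 0.0, 0
    for i in range(n):
        positives = [p for p in range(n) if p != i and labels[p] == labels[i]]
        if not positives:
            continue  # anchors with no positive pair are skipped
        others = [a for a in range(n) if a != i]
        log_denom = np.log(np.sum(np.exp(sim[i, others])))
        loss += -np.mean([sim[i, p] - log_denom for p in positives])
        anchors += 1
    return loss / max(anchors, 1)
```

Under this objective, a batch whose same-label embeddings are already close yields a lower loss than the same embeddings with mismatched labels, which is the gradient signal that shapes the knowledge representations.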



Published In

Web Information Systems Engineering – WISE 2023: 24th International Conference, Melbourne, VIC, Australia, October 25–27, 2023, Proceedings
Oct 2023
952 pages
ISBN:978-981-99-7253-1
DOI:10.1007/978-981-99-7254-8
  • Editors:
  • Feng Zhang,
  • Hua Wang,
  • Mahmoud Barhamgi,
  • Lu Chen,
  • Rui Zhou

Publisher

Springer-Verlag

Berlin, Heidelberg


Author Tags

  1. Dialogue Generation
  2. Knowledge Selection
  3. Contrastive Knowledge Representation
  4. Knowledge-grounded Dialogue
  5. Encoder-decoder
