
Natural Answer Generation with Heterogeneous Memory

Yao Fu, Yansong Feng


Abstract
Memory-augmented encoder-decoder frameworks have achieved promising progress on natural language generation tasks. Such frameworks enable a decoder to retrieve from a memory during generation. However, less research has addressed memory contents drawn from different sources, which are often of heterogeneous formats. In this work, we propose a novel attention mechanism that encourages the decoder to actively interact with the memory by taking its heterogeneity into account. Our solution attends across the generated history and the memory to explicitly avoid repetition, and introduces related knowledge to enrich the generated sentences. Experiments on the answer sentence generation task show that our method can effectively exploit heterogeneous memory to produce readable and meaningful answer sentences while maintaining high coverage of the given answer information.
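To illustrate the core idea the abstract describes, here is a minimal, hypothetical sketch of attending jointly over the generated history and memory entries with a single softmax, so the decoder can weigh new memory content against what it has already produced. All names (`attend`, `history`, `memory`) and the dot-product scoring are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, history, memory):
    """Attend jointly over generated-history vectors and memory vectors.

    Scoring both sources inside one softmax yields a single distribution,
    so mass placed on already-generated tokens competes directly with
    mass placed on memory entries, which helps discourage repetition.
    """
    keys = np.vstack([history, memory])   # (T + M, d): stack both sources
    scores = keys @ query                 # dot-product relevance scores
    weights = softmax(scores)             # joint attention distribution
    context = weights @ keys              # weighted summary vector, shape (d,)
    return weights, context

# Toy usage with random embeddings.
rng = np.random.default_rng(0)
d = 8
query = rng.normal(size=d)                # current decoder state
history = rng.normal(size=(3, d))         # embeddings of tokens generated so far
memory = rng.normal(size=(5, d))          # embeddings of memory entries (e.g. KB facts)
weights, context = attend(query, history, memory)
```

In this sketch the attention weights sum to one across both sources combined; a real heterogeneous-memory model would additionally encode knowledge-base triples and text snippets differently before they enter the shared memory.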
Anthology ID:
N18-1017
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marilyn Walker, Heng Ji, Amanda Stent
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
185–195
URL:
https://aclanthology.org/N18-1017
DOI:
10.18653/v1/N18-1017
Cite (ACL):
Yao Fu and Yansong Feng. 2018. Natural Answer Generation with Heterogeneous Memory. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), pages 185–195, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Natural Answer Generation with Heterogeneous Memory (Fu & Feng, NAACL 2018)
PDF:
https://aclanthology.org/N18-1017.pdf
Data
WikiMovies