
Generating Highly Relevant Questions

Jiazuo Qiu, Deyi Xiong


Abstract
Neural seq2seq-based question generation (QG) is prone to producing generic, undiversified questions that are poorly relevant to the given passage and target answer. In this paper, we propose two methods to address this issue: (1) a partial copy mechanism that prioritizes words morphologically close to words in the input passage when generating questions; and (2) a QA-based reranker that selects, from the n-best list of question candidates, questions preferred by both the QA and QG models. Experiments and analyses demonstrate that the two proposed methods substantially improve the relevance of generated questions to passages and answers.
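To illustrate the QA-based reranking step described in the abstract, the sketch below shows one plausible way to combine QG and QA model scores over an n-best list of candidate questions. The scorer callables, the linear interpolation, and the weight alpha are illustrative assumptions for this sketch, not the authors' implementation.

from typing import Callable, List

def rerank_candidates(
    candidates: List[str],                       # n-best questions from the QG model
    passage: str,
    answer: str,
    qg_score: Callable[[str], float],            # hypothetical: QG model log-probability of a question
    qa_score: Callable[[str, str, str], float],  # hypothetical: QA model confidence of recovering the answer
    alpha: float = 0.5,                          # illustrative interpolation weight
) -> str:
    """Return the candidate question preferred by both the QG and QA models."""
    def combined(question: str) -> float:
        # Interpolate the two scores so a question must satisfy both models to rank highly.
        return alpha * qg_score(question) + (1 - alpha) * qa_score(question, passage, answer)
    return max(candidates, key=combined)

In use, qg_score would come from the generation model's own likelihood and qa_score from running a QA model on the candidate question against the passage and checking agreement with the target answer; any weighting scheme other than linear interpolation would fit the same interface.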
Anthology ID:
D19-1614
Volume:
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
Venues:
EMNLP | IJCNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
5983–5987
URL:
https://aclanthology.org/D19-1614
DOI:
10.18653/v1/D19-1614
Cite (ACL):
Jiazuo Qiu and Deyi Xiong. 2019. Generating Highly Relevant Questions. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 5983–5987, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Generating Highly Relevant Questions (Qiu & Xiong, EMNLP-IJCNLP 2019)
PDF:
https://aclanthology.org/D19-1614.pdf
Data
SQuAD