
A Multi-Attention based Neural Network with External Knowledge for Story Ending Predicting Task

Qian Li, Ziwei Li, Jin-Mao Wei, Yanhui Gu, Adam Jatowt, Zhenglu Yang


Abstract
Enabling a machine to understand a temporal story and predict its ending is an interesting problem that has attracted considerable attention, as in the case of the ROC Story Cloze Task (SCT). In this paper, we develop a multi-attention-based neural network (MANN) with well-designed optimizations, such as a Highway Network, and concatenate features with embedding representations in a hierarchical neural network model. Considering the particulars of this specific task, we extend MANN with external knowledge resources, clearly exceeding state-of-the-art results. Furthermore, we develop a thorough understanding of our model through a careful hand analysis of a subset of the stories. We identify which traits of MANN contribute to its strong performance and how external knowledge is exploited in this ending-prediction task.
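The abstract mentions a Highway Network as one of the model's optimizations. As a rough illustration only (not the authors' implementation, whose dimensions and nonlinearities are not given here), a single highway layer blends a learned transform H(x) with the unchanged input x through a sigmoid gate T(x):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def highway_layer(x, w_h, b_h, w_t, b_t):
    """One highway layer: y = T(x) * H(x) + (1 - T(x)) * x."""
    h = np.maximum(0.0, x @ w_h + b_h)   # candidate transform H(x), ReLU
    t = sigmoid(x @ w_t + b_t)           # transform gate T(x) in (0, 1)
    return t * h + (1.0 - t) * x         # ungated part of x is carried through

# Tiny demonstration on random data (all shapes and weights are illustrative).
rng = np.random.default_rng(0)
d = 4
x = rng.standard_normal((2, d))
w_h, b_h = rng.standard_normal((d, d)), np.zeros(d)
w_t, b_t = rng.standard_normal((d, d)), np.full(d, -1.0)  # negative bias favors carrying x
y = highway_layer(x, w_h, b_h, w_t, b_t)
print(y.shape)  # (2, 4)
```

A negative gate bias b_t is a common initialization: it pushes T(x) toward 0 early in training, so the layer initially behaves like an identity function and information flows through unimpeded.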
Anthology ID:
C18-1149
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
1754–1762
Language:
URL:
https://aclanthology.org/C18-1149
DOI:
Bibkey:
Cite (ACL):
Qian Li, Ziwei Li, Jin-Mao Wei, Yanhui Gu, Adam Jatowt, and Zhenglu Yang. 2018. A Multi-Attention based Neural Network with External Knowledge for Story Ending Predicting Task. In Proceedings of the 27th International Conference on Computational Linguistics, pages 1754–1762, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
A Multi-Attention based Neural Network with External Knowledge for Story Ending Predicting Task (Li et al., COLING 2018)
PDF:
https://aclanthology.org/C18-1149.pdf
Data
StoryCloze