Iterative alternating neural attention for machine reading
A Sordoni, P Bachman, A Trischler… - arXiv preprint arXiv:1606.02245, 2016 - arxiv.org
We propose a novel neural attention architecture to tackle machine comprehension tasks, such as answering Cloze-style queries with respect to a document. Unlike previous models, we do not collapse the query into a single vector; instead, we deploy an iterative alternating attention mechanism that allows a fine-grained exploration of both the query and the document. Our model outperforms state-of-the-art baselines on standard machine comprehension benchmarks such as CNN news articles and the Children's Book Test (CBT) dataset.
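To make the abstract's description concrete, here is a minimal PyTorch sketch of the general idea: bidirectional GRU encoders for the document and query, and an inference GRU whose state alternately attends over the query and then the document at each step, with the final document attention read as a pointer distribution over Cloze candidates. The class name `AlternatingAttentionReader`, the scorers `Wq`/`Wd`, and all dimensions and the step count are illustrative assumptions, not the paper's exact specification.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AlternatingAttentionReader(nn.Module):
    """Hypothetical sketch of an iterative alternating attention reader."""

    def __init__(self, vocab_size, emb_dim=128, hid_dim=128, n_steps=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Bidirectional GRU encoders give each document/query token a
        # contextual representation of size 2 * hid_dim.
        self.doc_enc = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.qry_enc = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.enc_dim = 2 * hid_dim
        # Bilinear-style scorers: project the search state, then dot with encodings.
        self.Wq = nn.Linear(self.enc_dim, self.enc_dim)
        self.Wd = nn.Linear(2 * self.enc_dim, self.enc_dim)
        # Inference GRU: its state decides where to attend at the next step.
        self.inference = nn.GRUCell(2 * self.enc_dim, self.enc_dim)
        self.n_steps = n_steps

    def forward(self, doc_ids, qry_ids):
        # doc_ids: (B, Ld) and qry_ids: (B, Lq) token indices.
        d_enc, _ = self.doc_enc(self.embed(doc_ids))  # (B, Ld, 2H)
        q_enc, _ = self.qry_enc(self.embed(qry_ids))  # (B, Lq, 2H)
        state = d_enc.new_zeros(doc_ids.size(0), self.enc_dim)
        for _ in range(self.n_steps):
            # 1) Query glimpse: attend over query terms given the current
            #    state, picking which part of the question to focus on now.
            q_att = F.softmax(
                torch.bmm(q_enc, self.Wq(state).unsqueeze(2)).squeeze(2), dim=1)
            q_glimpse = torch.bmm(q_att.unsqueeze(1), q_enc).squeeze(1)
            # 2) Document glimpse: attend over the document, conditioned on
            #    both the state and the query glimpse just formed.
            key = self.Wd(torch.cat([state, q_glimpse], dim=1))
            d_att = F.softmax(
                torch.bmm(d_enc, key.unsqueeze(2)).squeeze(2), dim=1)
            d_glimpse = torch.bmm(d_att.unsqueeze(1), d_enc).squeeze(1)
            # 3) Fold both glimpses into the inference state and repeat.
            state = self.inference(torch.cat([q_glimpse, d_glimpse], dim=1), state)
        # The last document attention is a pointer distribution over positions;
        # summing it over occurrences of a candidate token scores Cloze answers.
        return d_att


# Example: score document positions for a toy batch.
model = AlternatingAttentionReader(vocab_size=1000)
doc = torch.randint(0, 1000, (2, 50))   # two documents, 50 tokens each
qry = torch.randint(0, 1000, (2, 12))   # two queries, 12 tokens each
print(model(doc, qry).shape)            # torch.Size([2, 50])
```

For Cloze answering, the returned distribution would be marginalized over the document positions of each candidate token, in the attention-sum style common to this family of readers.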