Iterative Alternating Neural Attention for Machine Reading

About

We propose a novel neural attention architecture to tackle machine comprehension tasks, such as answering Cloze-style queries with respect to a document. Unlike previous models, we do not collapse the query into a single vector; instead, we deploy an iterative alternating attention mechanism that allows a fine-grained exploration of both the query and the document. Our model outperforms state-of-the-art baselines on standard machine comprehension benchmarks such as CNN news articles and the Children's Book Test (CBT) dataset.

Alessandro Sordoni, Philip Bachman, Adam Trischler, Yoshua Bengio • 2016
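To make the alternating mechanism concrete, here is a minimal PyTorch sketch of the inference loop the abstract describes: at each step the model first takes an attentive read of the query, then, conditioned on that glimpse, takes an attentive read of the document, and updates a recurrent inference state from both reads. All names (enc_dim, state_dim, n_steps) and the bilinear scoring details are illustrative assumptions, not the authors' exact formulation (the paper additionally uses a gated update and a learned stopping criterion).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AlternatingAttention(nn.Module):
    """Hedged sketch of iterative alternating attention (not the paper's exact model)."""

    def __init__(self, enc_dim: int, state_dim: int, n_steps: int = 8):
        super().__init__()
        self.n_steps = n_steps
        # Bilinear scoring of the inference state against token encodings.
        self.W_q = nn.Linear(state_dim, enc_dim, bias=False)
        self.W_d = nn.Linear(state_dim + enc_dim, enc_dim, bias=False)
        # The inference state evolves with a GRU fed both attentive reads.
        self.gru = nn.GRUCell(2 * enc_dim, state_dim)

    def forward(self, query_enc, doc_enc):
        # query_enc: (B, Lq, enc_dim); doc_enc: (B, Ld, enc_dim)
        B = query_enc.size(0)
        state = query_enc.new_zeros(B, self.gru.hidden_size)
        for _ in range(self.n_steps):
            # 1) Query attentive read: glimpse the query given the state,
            #    instead of collapsing it into a single fixed vector.
            q_scores = torch.bmm(query_enc, self.W_q(state).unsqueeze(2)).squeeze(2)
            q_read = torch.bmm(F.softmax(q_scores, dim=1).unsqueeze(1),
                               query_enc).squeeze(1)
            # 2) Document attentive read, conditioned on the query glimpse.
            key = self.W_d(torch.cat([state, q_read], dim=1))
            d_scores = torch.bmm(doc_enc, key.unsqueeze(2)).squeeze(2)
            d_attn = F.softmax(d_scores, dim=1)
            d_read = torch.bmm(d_attn.unsqueeze(1), doc_enc).squeeze(1)
            # 3) Update the inference state from both reads.
            state = self.gru(torch.cat([q_read, d_read], dim=1), state)
        # The final document attention can be used to score candidate answers.
        return d_attn
```

The key design point the abstract emphasizes is visible in the loop: the query is re-read at every step, so each document glimpse is conditioned on a fresh, state-dependent view of the query rather than a single static query vector.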

Related benchmarks

Task                  | Dataset                                            | Accuracy (%) | Rank
----------------------|----------------------------------------------------|--------------|-----
Machine Comprehension | CNN (val)                                          | 75.2         | 80
Machine Comprehension | CNN (test)                                         | 76.1         | 77
Machine Comprehension | CBT-CN (test)                                      | 71           | 56
Machine Comprehension | CBT-NE (test)                                      | 72           | 56
Machine Comprehension | CBT-CN (val)                                       | 74.1         | 37
Machine Comprehension | CBT-NE (val)                                       | 76.9         | 37
Question Answering    | CNN (test)                                         | 75.7         | 24
Reading Comprehension | Children's Book Test (CBT) Common Noun (CN) (dev)  | 74.1         | 12
Reading Comprehension | Children's Book Test (CBT) Named Entity (NE) (dev) | 76.9         | 12