
Bidirectional Attention Flow for Machine Comprehension

About

Machine comprehension (MC), answering a query about a given context paragraph, requires modeling complex interactions between the context and the query. Recently, attention mechanisms have been successfully extended to MC. Typically, these methods use attention to focus on a small portion of the context and summarize it with a fixed-size vector, couple attentions temporally, and/or form a uni-directional attention. In this paper we introduce the Bi-Directional Attention Flow (BIDAF) network, a multi-stage hierarchical process that represents the context at different levels of granularity and uses a bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization. Our experimental evaluations show that our model achieves state-of-the-art results on the Stanford Question Answering Dataset (SQuAD) and the CNN/DailyMail cloze test.
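The attention layer described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names, input shapes, and the single weight vector `w` for the similarity function are assumptions for the sake of the example. It computes a similarity matrix between context and query encodings, derives context-to-query and query-to-context attention from it, and concatenates the results instead of summarizing the context into a fixed-size vector.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidaf_attention(H, U, w):
    """Bidirectional attention flow (simplified, batch-free sketch).

    H: (T, d) context word encodings, U: (J, d) query word encodings,
    w: (3d,) weight vector for the similarity function
    alpha(h, u) = w^T [h; u; h*u].
    Returns G: (T, 4d) query-aware context representation.
    """
    T, d = H.shape
    J = U.shape[0]
    # Similarity matrix S[t, j] = w^T [h_t; u_j; h_t * u_j]
    Ht = H[:, None, :]                      # (T, 1, d)
    Uj = U[None, :, :]                      # (1, J, d)
    feats = np.concatenate(
        [np.broadcast_to(Ht, (T, J, d)),
         np.broadcast_to(Uj, (T, J, d)),
         Ht * Uj], axis=-1)                 # (T, J, 3d)
    S = feats @ w                           # (T, J)

    # Context-to-query: each context word attends over all query words.
    a = softmax(S, axis=1)                  # (T, J)
    U_tilde = a @ U                         # (T, d) attended query vectors

    # Query-to-context: attend over context words via the max over queries.
    b = softmax(S.max(axis=1))              # (T,)
    h_tilde = b @ H                         # (d,)
    H_tilde = np.tile(h_tilde, (T, 1))      # (T, d) tiled across time

    # Merge without early summarization: G = [H; U~; H*U~; H*H~]
    return np.concatenate([H, U_tilde, H * U_tilde, H * H_tilde], axis=1)
```

In the full model this per-timestep output `G` is then passed to the modeling and output layers; the point of the sketch is that attention flows in both directions and the context representation keeps one vector per context word.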

Minjoon Seo, Aniruddha Kembhavi, Ali Farhadi, Hannaneh Hajishirzi • 2016

Related benchmarks

Task | Dataset | Metric | Result | Rank
Question Answering | SQuAD v1.1 (dev) | F1 Score | 81.1 | 375
Question Answering | SQuAD v1.1 (test) | F1 Score | 81.525 | 260
Question Answering | SQuAD (test) | F1 | 80.33 | 111
Machine Comprehension | CNN (val) | Accuracy | 0.763 | 80
Machine Comprehension | CNN (test) | Accuracy | 76.9 | 77
Question Answering | SQuAD (dev) | F1 | 77.3 | 74
Question Answering | SQuAD v1.1 (val) | F1 Score | 77.3 | 70
Multi-hop Question Answering | HotpotQA fullwiki setting (test) | Answer F1 | 32.89 | 64
Reading Comprehension | DROP (dev) | F1 Score | 28.85 | 63
Reading Comprehension | DROP (test) | F1 Score | 27.49 | 61
Showing 10 of 59 benchmark results.

Other info

Code
