
Contextualized Word Representations for Reading Comprehension

About

Reading a document and extracting an answer to a question about its content has attracted substantial attention recently. While most work has focused on the interaction between the question and the document, in this work we evaluate the importance of context when the question and document are processed independently. We take a standard neural architecture for this task, and show that by providing rich contextualized word representations from a large pre-trained language model as well as allowing the model to choose between context-dependent and context-independent word representations, we can obtain dramatic improvements and reach performance comparable to state-of-the-art on the competitive SQuAD dataset.
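The core idea above is to let the model weigh a context-independent word embedding against a contextualized representation from a pre-trained language model. The sketch below is a minimal, hypothetical illustration of that choice mechanism using a learned per-token gate; it is not the authors' exact architecture, and all names (GatedWordRepresentation, word_dim, lm_dim) are assumptions for illustration.

```python
import torch
import torch.nn as nn


class GatedWordRepresentation(nn.Module):
    """Blend a context-independent embedding with a contextualized one
    via a learned, per-token scalar gate (illustrative sketch only)."""

    def __init__(self, word_dim: int, lm_dim: int):
        super().__init__()
        # Project the LM hidden state into the word-embedding space so the
        # two views can be mixed dimension-wise.
        self.project = nn.Linear(lm_dim, word_dim)
        # The gate decides, per token, how much contextual signal to use.
        self.gate = nn.Linear(word_dim + lm_dim, 1)

    def forward(self, word_emb: torch.Tensor, lm_state: torch.Tensor) -> torch.Tensor:
        # word_emb: (batch, seq_len, word_dim)  context-independent vectors
        # lm_state: (batch, seq_len, lm_dim)    pre-trained LM hidden states
        g = torch.sigmoid(self.gate(torch.cat([word_emb, lm_state], dim=-1)))
        contextual = self.project(lm_state)
        # Convex combination: g -> 1 prefers the contextualized view,
        # g -> 0 falls back to the static embedding.
        return g * contextual + (1.0 - g) * word_emb


if __name__ == "__main__":
    mix = GatedWordRepresentation(word_dim=300, lm_dim=1024)
    words = torch.randn(2, 20, 300)   # e.g. GloVe-style embeddings
    lm = torch.randn(2, 20, 1024)     # e.g. hidden states from a pre-trained LM
    print(mix(words, lm).shape)       # torch.Size([2, 20, 300])
```

The blended representations would then feed the question and document encoders of a standard reading-comprehension model, letting training decide per token whether context helps.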

Shimi Salant, Jonathan Berant • 2017

Related benchmarks

Task                           | Dataset                   | Metric   | Result | Rank
Question Answering             | SQuAD v1.1 (test)         | F1 Score | 84.2   | 260
Machine Reading Comprehension  | SQuAD 1.1 (dev)           | EM       | 77     | 48
Machine Reading Comprehension  | SQuAD 1.1 (test)          | EM       | 77.6   | 46
Machine Reading Comprehension  | AddSent (adversarial)     | F1 Score | 47     | 6
Machine Reading Comprehension  | AddOneSent (adversarial)  | F1 Score | 57     | 6

Other info

Code
