
Explicit Utilization of General Knowledge in Machine Reading Comprehension

About

Machine Reading Comprehension (MRC) models lag behind human beings mainly in their hunger for data and their robustness to noise. To bridge this gap, in this paper we explore how to integrate the neural networks of MRC models with the general knowledge of human beings. On the one hand, we propose a data enrichment method, which uses WordNet to extract inter-word semantic connections as general knowledge from each given passage-question pair. On the other hand, we propose an end-to-end MRC model named Knowledge Aided Reader (KAR), which explicitly uses the extracted general knowledge to assist its attention mechanisms. Based on the data enrichment method, KAR is comparable in performance with the state-of-the-art MRC models, and significantly more robust to noise than they are. When only a subset (20%-80%) of the training examples is available, KAR outperforms the state-of-the-art MRC models by a large margin, and is still reasonably robust to noise.
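The data enrichment idea described above can be sketched roughly as follows: a word is treated as semantically connected to another if the hypernym closure of its senses (within a small number of hops in WordNet) reaches a sense of the other word. This is a minimal toy sketch, not the paper's implementation: a tiny hand-coded hypernym graph stands in for WordNet, and plain words stand in for synsets; the function names (`extended_synsets`, `connected`, `enrich`) and the hop limit are illustrative assumptions.

```python
# Toy stand-in for WordNet's hypernym relation (word -> list of hypernyms).
HYPERNYMS = {
    "dog": ["canine"],
    "canine": ["mammal"],
    "cat": ["feline"],
    "feline": ["mammal"],
    "mammal": ["animal"],
}

def extended_synsets(word, hops):
    """A word's senses plus all hypernyms reachable within `hops` steps."""
    frontier, seen = {word}, {word}
    for _ in range(hops):
        frontier = {h for w in frontier for h in HYPERNYMS.get(w, [])} - seen
        seen |= frontier
    return seen

def connected(word_a, word_b, hops=2):
    """One word's extended senses must reach the other's plain senses."""
    return (word_b in extended_synsets(word_a, hops)
            or word_a in extended_synsets(word_b, hops))

def enrich(passage_words, question_words, hops=2):
    """For each passage position, list the connected question positions."""
    return {
        i: [j for j, q in enumerate(question_words) if connected(p, q, hops)]
        for i, p in enumerate(passage_words)
    }
```

For example, `enrich(["dog"], ["mammal", "cat"])` links the passage word "dog" to the question word "mammal" (reached via "canine" in two hops) but not to "cat", whose senses lie outside the closure. These position-level links are the kind of annotation the paper feeds to KAR's attention mechanisms.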

Chao Wang, Hui Jiang • 2018

Related benchmarks

Task                            Dataset                     Result     Rank
Machine Reading Comprehension   SQuAD 1.1 (dev)             EM 76.7    48
Machine Reading Comprehension   SQuAD 1.1 (test)            EM 76.1    46
Machine Reading Comprehension   AddSent (adversarial)       F1 60.1    6
Machine Reading Comprehension   AddOneSent (adversarial)    F1 72.3    6
