
Exploring Graph-structured Passage Representation for Multi-hop Reading Comprehension with Graph Neural Networks

About

Multi-hop reading comprehension focuses on one type of factoid question, where a system needs to properly integrate multiple pieces of evidence to correctly answer a question. Previous work approximates global evidence with local coreference information, encoding coreference chains with DAG-styled GRU layers within a gated-attention reader. However, coreference is limited in providing information for rich inference. We introduce a new method for better connecting global evidence, which forms more complex graphs compared to DAGs. To perform evidence integration on our graphs, we investigate two recent graph neural networks, namely graph convolutional network (GCN) and graph recurrent network (GRN). Experiments on two standard datasets show that richer global information leads to better answers. Our method performs better than all published results on these datasets.
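To make the evidence-integration step concrete, here is a minimal sketch of one graph convolutional (GCN) layer of the kind the paper evaluates, where passage mentions are nodes and edges connect related mentions. This is an illustrative NumPy implementation of the standard GCN propagation rule (normalized adjacency with self-loops), not the authors' code; all names are hypothetical.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(A_hat @ H @ W).

    A: (n, n) adjacency matrix over mention nodes
    H: (n, d_in) node representations
    W: (d_in, d_out) learned projection
    A_hat is the symmetrically normalized adjacency with self-loops,
    D^{-1/2} (A + I) D^{-1/2}, so each node mixes in its neighbors'
    evidence while keeping its own representation.
    """
    A_tilde = A + np.eye(A.shape[0])            # add self-loops
    d = A_tilde.sum(axis=1)                     # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))      # D^{-1/2}
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # normalized adjacency
    return np.maximum(0.0, A_hat @ H @ W)       # ReLU nonlinearity

# Toy example: 3 mention nodes on a chain graph, 2-dim features.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.ones((3, 2))
W = np.ones((2, 2))
H_next = gcn_layer(A, H, W)   # (3, 2) updated node representations
```

Stacking several such layers lets information travel multiple hops across the graph, which is what allows richer connections than coreference chains alone.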

Linfeng Song, Zhiguo Wang, Mo Yu, Yue Zhang, Radu Florian, Daniel Gildea · 2018

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Multi-hop Reading Comprehension | WikiHop unmasked (dev) | Accuracy | 62.8 | 11 |
| Multi-hop Reading Comprehension | WikiHop unmasked (test) | Accuracy | 65.4 | 9 |
| Multi-hop Question Answering | ComplexWebQuestions v1.1 (dev) | Accuracy | 33.2 | 7 |
| Multi-hop Question Answering | ComplexWebQuestions v1.1 (test) | Accuracy | 30.1 | 3 |
