
Graph Convolution over Pruned Dependency Trees Improves Relation Extraction

About

Dependency trees help relation extraction models capture long-range relations between words. However, existing dependency-based models either neglect crucial information (e.g., negation) by pruning the dependency trees too aggressively, or are computationally inefficient because it is difficult to parallelize over different tree structures. We propose an extension of graph convolutional networks that is tailored for relation extraction, which pools information over arbitrary dependency structures efficiently in parallel. To incorporate relevant information while maximally removing irrelevant content, we further apply a novel pruning strategy to the input trees by keeping words immediately around the shortest path between the two entities among which a relation might hold. The resulting model achieves state-of-the-art performance on the large-scale TACRED dataset, outperforming existing sequence and dependency-based neural models. We also show through detailed analysis that this model has complementary strengths to sequence models, and combining them further improves the state of the art.
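To make the core idea concrete, here is a minimal sketch of one graph-convolution layer applied over a dependency tree, where each word updates its representation by pooling over its syntactic neighbors. This is an illustrative NumPy sketch, not the authors' implementation; the function name, normalization choice, and toy inputs are assumptions.

```python
import numpy as np

def gcn_layer(H, A, W):
    """One graph-convolution layer over a dependency tree (illustrative sketch).

    H: (n, d_in)     word representations for an n-word sentence
    A: (n, n)        symmetric adjacency matrix of the dependency tree
    W: (d_in, d_out) learned weight matrix
    """
    A_hat = A + np.eye(A.shape[0])             # self-loops: each word keeps its own features
    deg = A_hat.sum(axis=1, keepdims=True)     # node degrees, used to normalize the sum
    return np.maximum(A_hat @ H @ W / deg, 0)  # ReLU of degree-normalized neighbor pooling

# Toy 4-word sentence whose dependency tree is the chain 0-1-2-3.
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1.0

rng = np.random.default_rng(0)
H = rng.standard_normal((4, 8))
W = rng.standard_normal((8, 8))
out = gcn_layer(H, A, W)
print(out.shape)  # (4, 8): one updated vector per word
```

Because the whole layer is a pair of matrix multiplications, it runs in parallel over arbitrary tree shapes; the path-centered pruning described above simply restricts which rows and columns of A survive before this layer is applied.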

Yuhao Zhang, Peng Qi, Christopher D. Manning · 2018

Related benchmarks

Task                                     Dataset                            Metric              Result   Rank
Relation Extraction                      TACRED (test)                      F1 Score            68.2     194
Relation Classification                  SemEval-2010 Task 8 (test)         F1 Score            84.8     128
Relation Extraction                      DocRED (test)                      F1 Score            54.6     121
Relation Extraction                      DocRED (dev)                       F1 Score            52.47    98
Relation Extraction                      TACRED                             Micro F1            66.4     97
Relation Extraction                      TACRED v1.0 (test)                 F1 Score            80.3     37
Relationship Extraction                  SemEval Task 8 2010 (test)         F1 Score            84.8     24
Relation Extraction                      TACRED Original (test)             F1 Score            66.7     23
Binary-class n-ary relation extraction   PubMed n-ary relation extraction   Accuracy (B-Cross)  83.7     12
Relation Extraction                      TACRED Original (dev)              F1 Score            67.2     12

(10 of 15 rows shown)
