
Entity Structure Within and Throughout: Modeling Mention Dependencies for Document-Level Relation Extraction

About

Entities, as the essential elements in relation extraction tasks, exhibit certain structure. In this work, we formulate such structure as distinctive dependencies between mention pairs. We then propose SSAN, which incorporates these structural dependencies within the standard self-attention mechanism and throughout the overall encoding stage. Specifically, we design two alternative transformation modules inside each self-attention building block to produce attentive biases so as to adaptively regularize its attention flow. Our experiments demonstrate the usefulness of the proposed entity structure and the effectiveness of SSAN. It significantly outperforms competitive baselines, achieving new state-of-the-art results on three popular document-level relation extraction datasets. We further provide ablation and visualization to show how the entity structure guides the model for better relation extraction. Our code is publicly available.
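The core idea, incorporating structural dependencies as attentive biases inside self-attention, can be sketched as follows. This is a minimal single-head illustration, not the paper's implementation: SSAN's transformation modules compute the bias from the query/key states themselves, whereas here each dependency category is mapped to a learnable scalar bias for simplicity; all names (`dep_ids`, `bias_table`) are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def structured_self_attention(x, Wq, Wk, Wv, dep_ids, bias_table):
    """Single-head self-attention with an additive structural bias.

    dep_ids[i, j] indexes the dependency category between tokens i and j
    (e.g. intra-sentence coreference vs. inter-sentence relate);
    bias_table maps each category to a scalar added to the attention
    logits, regularizing the attention flow before the softmax.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d = q.shape[-1]
    logits = q @ k.T / np.sqrt(d)          # (n, n) raw attention scores
    logits = logits + bias_table[dep_ids]  # inject entity-structure bias
    attn = softmax(logits, axis=-1)        # rows sum to 1
    return attn @ v
```

Because the bias is added to the logits of every self-attention block, the structural prior shapes the encoding throughout the network rather than only at a final scoring layer.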

Benfeng Xu, Quan Wang, Yajuan Lyu, Yong Zhu, Zhendong Mao • 2021

Related benchmarks

Task | Dataset | Metric | Result | Rank
Document-level Relation Extraction | DocRED (dev) | F1 | 60.18 | 231
Document-level Relation Extraction | DocRED (test) | F1 | 59.45 | 179
Relation Extraction | DocRED (test) | F1 | 65.92 | 121
Relation Extraction | DocRED (dev) | F1 | 65.69 | 98
Relation Extraction | CDR (test) | F1 | 68.7 | 92
Relation Extraction | DocRED v1 (test) | F1 | 65.92 | 66
Relation Extraction | DocRED v1 (dev) | F1 | 65.69 | 65
Relation Extraction | GDA (test) | F1 | 83.9 | 65
Document-level Relation Extraction | DocRED 1.0 (test) | F1 | 61.42 | 51
Relation Extraction | DocRED official (test) | F1 | 65.92 | 45

Showing 10 of 21 rows.

Other info

Code
