
Document-Level Relation Extraction with Adaptive Focal Loss and Knowledge Distillation

About

Document-level Relation Extraction (DocRE) is more challenging than its sentence-level counterpart, as it aims to extract relations across multiple sentences at once. In this paper, we propose a semi-supervised framework for DocRE with three novel components. First, we use an axial attention module to learn the interdependency among entity pairs, which improves performance on two-hop relations. Second, we propose an adaptive focal loss to tackle the class imbalance problem of DocRE. Third, we use knowledge distillation to bridge the gap between human-annotated data and distantly supervised data. We conducted experiments on two DocRE datasets. Our model consistently outperforms strong baselines, and its performance exceeds the previous state of the art by 1.36 F1 and 1.46 Ign F1 on the DocRED leaderboard. Our code and data will be released at https://github.com/tonytan48/KD-DocRE.
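The adaptive focal loss mentioned above targets class imbalance by down-weighting easy, well-classified examples. As a rough illustration (this is the standard binary focal loss of Lin et al., 2017, which the paper's adaptive variant builds on, not the paper's exact formulation; the `gamma` and `alpha` values are illustrative defaults):

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Standard binary focal loss for a single relation label.

    p:     predicted probability of the positive class (0 < p < 1)
    y:     gold label, 0 or 1
    gamma: focusing parameter; larger values down-weight easy examples more
    alpha: class-balance weight for the positive class

    The (1 - p_t) ** gamma modulating factor shrinks the loss of
    confidently correct predictions, so training focuses on hard,
    often minority-class, examples.
    """
    p_t = p if y == 1 else 1.0 - p
    alpha_t = alpha if y == 1 else 1.0 - alpha
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)

# An easy positive (p = 0.9) contributes far less loss than a hard one (p = 0.3)
easy = focal_loss(0.9, 1)
hard = focal_loss(0.3, 1)
```

With `gamma = 0` this reduces to alpha-weighted cross-entropy; the focusing effect comes entirely from the modulating factor.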

Qingyu Tan, Ruidan He, Lidong Bing, Hwee Tou Ng · 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Document-level Relation Extraction | DocRED (dev) | F1 | 62.03 | 231 |
| Document-level Relation Extraction | DocRED (test) | F1 | 64.03 | 179 |
| Relation Extraction | DocRED v1 (test) | F1 | 67.28 | 66 |
| Relation Extraction | DocRED v1 (dev) | F1 | 67.12 | 65 |
| Relation Extraction | Re-DocRED (test) | Ign F1 | 80.32 | 56 |
| Relation Extraction | DocRED official (test) | RE | 67.28 | 45 |
| Document-level Relation Extraction | Re-DocRED (test) | Ign F1 | 80.32 | 38 |
| Relation Extraction | DocRED official (dev) | F1 | 67.12 | 38 |
| Document-level Relation Extraction | Re-DocRED 1.0 (test) | Overall F1 | 78.28 | 20 |
| Document-level Relation Extraction | Re-DocRED 1.0 (dev) | F1 | 78.51 | 17 |

Showing 10 of 15 rows.

Other info

Code: https://github.com/tonytan48/KD-DocRE