
Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision

About

Contrastive pre-training on distant supervision has shown remarkable effectiveness in improving supervised relation extraction tasks. However, existing methods ignore the intrinsic noise of distant supervision during the pre-training stage. In this paper, we propose a weighted contrastive learning method that leverages supervised data to estimate the reliability of pre-training instances and explicitly reduce the effect of noise. Experimental results on three supervised datasets demonstrate the advantages of our proposed weighted contrastive learning approach compared to two state-of-the-art non-weighted baselines. Our code and models are available at: https://github.com/YukinoWan/WCL
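The core idea above — scaling each pre-training instance's contrastive loss by an estimated reliability score so that noisy distant-supervision pairs contribute less — can be sketched as a weighted InfoNCE objective. This is a minimal illustration, not the paper's exact formulation; the function name, the use of in-batch negatives, and the assumption that weights come from some reliability estimator trained on supervised data are all simplifications for clarity.

```python
import numpy as np

def weighted_contrastive_loss(anchors, positives, weights, tau=0.1):
    """Weighted InfoNCE over a batch of N instance pairs (illustrative sketch).

    anchors, positives: (N, d) L2-normalized embeddings; pair i is a positive,
        all other in-batch positives act as negatives for anchor i.
    weights: (N,) reliability scores in (0, 1], e.g. produced by a classifier
        trained on supervised data (hypothetical estimator, not shown here).
    tau: softmax temperature.
    """
    logits = anchors @ positives.T / tau           # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    per_instance = -np.diag(log_prob)              # InfoNCE loss per pair
    # Down-weight likely-noisy distant-supervision pairs, then average.
    return float((weights * per_instance).sum() / weights.sum())
```

With all weights equal to 1 this reduces to the standard (unweighted) mean InfoNCE loss, so the weighting is a strict generalization of the non-weighted baselines.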

Zhen Wan, Fei Cheng, Qianying Liu, Zhuoyuan Mao, Haiyue Song, Sadao Kurohashi • 2022

Related benchmarks

Task                               | Dataset                       | Result            | Rank
Relation Extraction                | TACRED (test)                 | --                | 194
Relation Extraction                | DocRED (test)                 | --                | 121
Relation Extraction                | DocRED human-annotated (test) | Micro F1: 58.5    | 36
Sentence-level Relation Extraction | SemEval (test)                | F1 (micro): 0.883 | 24
