Relation Extraction with Weighted Contrastive Pre-training on Distant Supervision
About
Contrastive pre-training on distant supervision has shown remarkable effectiveness in improving supervised relation extraction tasks. However, existing methods ignore the intrinsic noise of distant supervision during the pre-training stage. In this paper, we propose a weighted contrastive learning method that leverages supervised data to estimate the reliability of pre-training instances and explicitly reduce the effect of noise. Experimental results on three supervised datasets demonstrate the advantages of our proposed weighted contrastive learning approach over two state-of-the-art non-weighted baselines. Our code and models are available at: https://github.com/YukinoWan/WCL
Zhen Wan, Fei Cheng, Qianying Liu, Zhuoyuan Mao, Haiyue Song, Sadao Kurohashi • 2022
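The core idea above — scaling each instance's contrastive loss by an estimated reliability so noisy distantly supervised pairs contribute less — can be sketched as a weighted InfoNCE-style loss. This is a minimal illustrative sketch in NumPy, not the paper's exact formulation: the function name, the cosine-similarity setup, and the example reliability weights are all assumptions for demonstration.

```python
import numpy as np

def weighted_contrastive_loss(anchors, positives, weights, temperature=0.1):
    """Illustrative weighted InfoNCE-style contrastive loss.

    Each anchor's loss term is scaled by a reliability weight, so
    instances suspected to be distant-supervision noise (low weight)
    contribute less to the final loss than trusted instances.
    """
    # L2-normalize so the dot product is cosine similarity.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature  # (N, N) similarity matrix

    # InfoNCE: each anchor's true positive sits on the diagonal;
    # all other rows' positives act as in-batch negatives.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    per_instance = -np.diag(log_probs)  # standard per-anchor loss

    # Reliability-weighted average instead of a uniform mean.
    return float((weights * per_instance).sum() / weights.sum())

# Toy usage: 4 instance embeddings with slightly perturbed positives.
rng = np.random.default_rng(0)
emb = rng.normal(size=(4, 8))
noise = rng.normal(scale=0.1, size=(4, 8))
# Hypothetical reliability weights: the last two instances are
# treated as likely distant-supervision noise and down-weighted.
w = np.array([1.0, 1.0, 0.2, 0.2])
loss = weighted_contrastive_loss(emb, emb + noise, w)
```

With uniform weights this reduces to the ordinary mean InfoNCE loss, so the weighting is a strict generalization of non-weighted contrastive pre-training.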
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Relation Extraction | TACRED (test) | -- | -- | 194 |
| Relation Extraction | DocRED (test) | -- | -- | 121 |
| Relation Extraction | DocRED human-annotated (test) | Micro F1 | 58.5 | 36 |
| Sentence-level Relation Extraction | SemEval (test) | F1 (micro) | 0.883 | 24 |