
Enriching Pre-trained Language Model with Entity Information for Relation Classification

About

Relation classification is an important NLP task for extracting relations between entities. The state-of-the-art methods for relation classification are primarily based on convolutional or recurrent neural networks. Recently, the pre-trained BERT model has achieved very successful results on many NLP classification and sequence-labeling tasks. Relation classification differs from those tasks in that it relies on information about both the sentence and the two target entities. In this paper, we propose a model that both leverages the pre-trained BERT language model and incorporates information from the target entities to tackle the relation classification task. We locate the target entities, transfer the information through the pre-trained architecture, and incorporate the corresponding encodings of the two entities. We achieve significant improvement over the state-of-the-art method on the SemEval-2010 Task 8 relation classification dataset.
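The approach described above can be sketched in two steps: mark the two target entities in the token sequence so the encoder can locate them, then combine the sentence-level encoding with pooled encodings of the two entity spans for classification. The NumPy sketch below is a minimal, hypothetical illustration of that idea (marker symbols, averaging over entity spans, and the weight shapes are assumptions for illustration, not the authors' exact implementation):

```python
import numpy as np

def mark_entities(tokens, e1_span, e2_span):
    # Insert marker tokens around the two target entities so the encoder
    # can locate them. Marker choice ('$' for entity 1, '#' for entity 2)
    # and half-open [start, end) spans are assumptions for this sketch;
    # entity 1 is assumed to precede entity 2.
    s1, t1 = e1_span
    s2, t2 = e2_span
    return (tokens[:s1] + ["$"] + tokens[s1:t1] + ["$"]
            + tokens[t1:s2] + ["#"] + tokens[s2:t2] + ["#"] + tokens[t2:])

def relation_logits(hidden, cls_vec, e1_span, e2_span, W0, W1, W2, Wc):
    # hidden:  (seq_len, d) encoder output vectors for each token
    # cls_vec: (d,) sentence-level encoding (e.g. the [CLS] vector)
    # Average the hidden states over each entity span, pass the sentence
    # vector and each entity vector through tanh + a linear map, then
    # concatenate and project to relation logits.
    e1 = hidden[e1_span[0]:e1_span[1]].mean(axis=0)
    e2 = hidden[e2_span[0]:e2_span[1]].mean(axis=0)
    h0 = np.tanh(cls_vec) @ W0
    h1 = np.tanh(e1) @ W1
    h2 = np.tanh(e2) @ W2
    return np.concatenate([h0, h1, h2]) @ Wc
```

In practice `hidden` and `cls_vec` would come from the pre-trained BERT encoder run over the marked sequence, and the final logits would be fed to a softmax over the relation labels.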

Shanchan Wu, Yifan He • 2019

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Relation Classification | SemEval-2010 Task 8 (test) | F1 Score | 89.25 | 128 |
| Relationship Extraction | SemEval-2010 Task 8 (test) | F1 Score | 89.25 | 24 |
| Relation Classification | Wiki-ZSL (test) | Precision (%) | 39.22 | 22 |
| Relation Classification | FewRel (test) | Precision | 0.4219 | 22 |
| Relation Extraction | TACRED v1.0 (full) | Micro F1 | 69.1 | 16 |
| Relation Extraction | SemEval-2010 Task 8 (test) | Macro F1 | 89.3 | 8 |
| Zero-shot Relation Extraction | Wiki-ZSL m=5 (test) | Precision | 39.22 | 7 |
| Zero-shot Relation Extraction | Wiki-ZSL m=10 (test) | Precision | 26.18 | 7 |
| Zero-shot Relation Extraction | Wiki-ZSL m=15 (test) | Precision (%) | 17.31 | 7 |
| Zero-shot Relation Extraction | FewRel m=5 (test) | Precision | 42.19 | 7 |

Showing 10 of 12 rows.
