
KG-BERT: BERT for Knowledge Graph Completion

About

Knowledge graphs are important resources for many artificial intelligence tasks but often suffer from incompleteness. In this work, we propose to use pre-trained language models for knowledge graph completion. We treat triples in knowledge graphs as textual sequences and propose a novel framework named Knowledge Graph Bidirectional Encoder Representations from Transformer (KG-BERT) to model these triples. Our method takes the entity and relation descriptions of a triple as input and computes the scoring function of the triple with the KG-BERT language model. Experimental results on multiple benchmark knowledge graphs show that our method can achieve state-of-the-art performance in triple classification, link prediction, and relation prediction tasks.

Liang Yao, Chengsheng Mao, Yuan Luo • 2019
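The core idea above — packing a triple's textual descriptions into a single BERT-style input sequence and scoring it with a classifier — can be sketched as follows. This is not the authors' code; the example descriptions and the helper function name are illustrative, and the actual model would tokenize the sequence and feed it to a fine-tuned BERT whose [CLS] output is scored.

```python
def build_triple_sequence(head_desc: str, relation: str, tail_desc: str) -> str:
    """Pack a (head, relation, tail) triple into one BERT-style input
    sequence: KG-BERT treats the triple as a textual sequence with the
    three parts separated by [SEP] tokens."""
    return f"[CLS] {head_desc} [SEP] {relation} [SEP] {tail_desc} [SEP]"

# Illustrative triple (descriptions invented for this sketch):
seq = build_triple_sequence(
    "Steven Paul Jobs was an American business magnate",
    "founded",
    "Apple Inc. is an American technology company",
)
print(seq)
```

In KG-BERT the plausibility score of the triple is then the classifier's probability on this sequence's [CLS] representation; negative triples for training are built by corrupting heads or tails.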

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Link Prediction | FB15k-237 (test) | Hits@10 | 42 | 419 |
| Link Prediction | WN18RR (test) | Hits@10 | 52.4 | 380 |
| Link Prediction | FB15k-237 | MRR | 24.5 | 280 |
| Link Prediction | WN18RR | Hits@10 | 52.4 | 175 |
| Knowledge Graph Completion | WN18RR | Hits@1 | 4.1 | 165 |
| Link Prediction | FB15K (test) | -- | -- | 164 |
| Knowledge Graph Completion | FB15k-237 | Hits@10 | 0.42 | 108 |
| Link Prediction | FB15k-237 filtered (test) | Hits@10 | 0.42 | 60 |
| Link Prediction | WN18RR filtered (test) | Hits@10 | 0.524 | 57 |
| Link Prediction | UMLS | Hits@10 | 99 | 56 |

Showing 10 of 43 rows
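The metrics in the table are standard link-prediction measures: Hits@k is the fraction of test triples whose correct entity is ranked within the top k candidates, and MRR is the mean reciprocal rank of the correct entity. A minimal sketch of both (the example ranks are invented for illustration):

```python
def hits_at_k(ranks: list[int], k: int) -> float:
    """Fraction of test triples whose correct entity ranks in the top k."""
    return sum(r <= k for r in ranks) / len(ranks)

def mrr(ranks: list[int]) -> float:
    """Mean reciprocal rank of the correct entity over all test triples."""
    return sum(1.0 / r for r in ranks) / len(ranks)

# Hypothetical ranks of the correct entity for four test triples:
ranks = [1, 3, 10, 50]
print(hits_at_k(ranks, 10))  # 0.75 (three of four ranks are <= 10)
print(mrr(ranks))
```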
