KG-BERT: BERT for Knowledge Graph Completion
About
Knowledge graphs are important resources for many artificial intelligence tasks but often suffer from incompleteness. In this work, we propose to use pre-trained language models for knowledge graph completion. We treat triples in knowledge graphs as textual sequences and propose a novel framework named Knowledge Graph Bidirectional Encoder Representations from Transformer (KG-BERT) to model these triples. Our method takes the entity and relation descriptions of a triple as input and computes the scoring function of the triple with the KG-BERT language model. Experimental results on multiple benchmark knowledge graphs show that our method can achieve state-of-the-art performance in triple classification, link prediction, and relation prediction tasks.
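As a rough illustration of the "triples as textual sequences" idea, the sketch below packs a triple's entity and relation descriptions into a single BERT-style input sequence; the `[CLS]` representation of such a sequence is then fed to a classifier that scores the triple's plausibility. The function name and sample descriptions are illustrative assumptions, not taken from the released KG-BERT code.

```python
def build_triple_sequence(head_desc: str, relation_desc: str, tail_desc: str) -> str:
    """Concatenate head, relation, and tail descriptions with BERT
    special tokens, forming one input sequence for the encoder.
    (Illustrative sketch; tokenizers normally insert these tokens.)"""
    return f"[CLS] {head_desc} [SEP] {relation_desc} [SEP] {tail_desc} [SEP]"


# Hypothetical example triple (Steve Jobs, founded, Apple Inc.)
seq = build_triple_sequence(
    "Steve Jobs, co-founder of Apple Inc.",
    "founded",
    "Apple Inc., an American technology company",
)
print(seq)
```

In the paper's setup, the same sequence format is reused across tasks: triple classification scores (h, r, t) directly, while link prediction ranks candidate entities by scoring each candidate sequence.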
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Link Prediction | FB15k-237 (test) | Hits@10 | 42 | 419 |
| Link Prediction | WN18RR (test) | Hits@10 | 52.4 | 380 |
| Link Prediction | FB15k-237 | MRR | 24.5 | 280 |
| Link Prediction | WN18RR | Hits@10 | 52.4 | 175 |
| Knowledge Graph Completion | WN18RR | Hits@1 | 4.1 | 165 |
| Link Prediction | FB15K (test) | -- | -- | 164 |
| Knowledge Graph Completion | FB15k-237 | Hits@10 | 0.42 | 108 |
| Link Prediction | FB15k-237 filtered (test) | Hits@10 | 0.42 | 60 |
| Link Prediction | WN18RR filtered (test) | Hits@10 | 0.524 | 57 |
| Link Prediction | UMLS | Hits@10 | 99 | 56 |