
Reasoning Through Memorization: Nearest Neighbor Knowledge Graph Embeddings

About

Previous knowledge graph embedding approaches usually map entities to representations and utilize score functions to predict the target entities, yet they typically struggle to reason about rare or emerging unseen entities. In this paper, we propose kNN-KGE, a new knowledge graph embedding approach with pre-trained language models, which linearly interpolates the entity distribution with k-nearest neighbors. We compute the nearest neighbors based on distances in the entity embedding space from the knowledge store. Our approach allows rare or emerging entities to be memorized explicitly rather than implicitly in model parameters. Experimental results demonstrate that our approach improves both inductive and transductive link prediction results and yields better performance in low-resource settings with only a few triples, which may be easier to reason about via explicit memory. Code is available at https://github.com/zjunlp/KNN-KG.
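The interpolation described above (in the spirit of kNN-LM) can be sketched as follows. This is an illustrative sketch only: the distance metric (L2), the softmax temperature, and the interpolation weight `lam` are assumptions for demonstration, not the paper's exact implementation.

```python
import numpy as np

def knn_interpolate(query_emb, model_probs, store_keys, store_entity_ids,
                    num_entities, k=4, temperature=1.0, lam=0.5):
    """Interpolate a model's entity distribution with a kNN distribution
    retrieved from a knowledge store of entity embeddings (illustrative)."""
    # L2 distance from the query embedding to every key in the store
    dists = np.linalg.norm(store_keys - query_emb, axis=1)
    nn_idx = np.argsort(dists)[:k]  # indices of the k nearest neighbors
    # Softmax over negative distances -> weights for the retrieved entries
    weights = np.exp(-dists[nn_idx] / temperature)
    weights /= weights.sum()
    # Scatter neighbor weights into a distribution over all entities
    knn_probs = np.zeros(num_entities)
    for idx, w in zip(nn_idx, weights):
        knn_probs[store_entity_ids[idx]] += w
    # Linear interpolation of the kNN and model distributions
    return lam * knn_probs + (1.0 - lam) * model_probs

# Toy usage: a 3-entity vocabulary, a tiny store, and a uniform model prior.
store_keys = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
store_entity_ids = [0, 1, 2]
probs = knn_interpolate(np.array([0.0, 0.0]), np.ones(3) / 3,
                        store_keys, store_entity_ids, num_entities=3, k=2)
```

Because retrieval is non-parametric, adding a new entity to the store makes it immediately reachable at inference time without retraining, which is the mechanism behind the inductive gains reported above.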

Peng Wang, Xin Xie, Xiaohan Wang, Ningyu Zhang• 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Knowledge Graph Completion | WN18RR (test) | MRR | 0.579 | 177 |
| Inductive Link Prediction | FB15k-237 inductive (test) | Hits@10 | 0.293 | 37 |
| Link Prediction | WN18RR transductive (test) | MRR | 0.579 | 30 |
| Inductive Link Prediction | WN18RR inductive (test) | MRR | 0.294 | 30 |
| Link Prediction | FB15K-237 transductive (test) | Hits@10 | 55 | 16 |
