
Learning Attention-based Embeddings for Relation Prediction in Knowledge Graphs

About

The recent proliferation of knowledge graphs (KGs), coupled with incomplete or partial information in the form of missing relations (links) between entities, has fueled much research on knowledge base completion (also known as relation prediction). Several recent works suggest that convolutional neural network (CNN) based models generate richer and more expressive feature embeddings and hence also perform well on relation prediction. However, we observe that these KG embeddings treat triples independently and thus fail to capture the complex and hidden information inherently implicit in the local neighborhood surrounding a triple. To this end, our paper proposes a novel attention-based feature embedding that captures both entity and relation features in any given entity's neighborhood. Additionally, we also encapsulate relation clusters and multi-hop relations in our model. Our empirical study offers insights into the efficacy of our attention-based model, and we show marked performance gains in comparison to state-of-the-art methods on all datasets.
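The core idea — attending over a triple's local neighborhood rather than treating triples independently — can be illustrated with a minimal sketch. The snippet below is not the paper's implementation; it assumes a simplified single-head attention layer in which each neighbor triple is encoded as a linear map over the concatenated central-entity, neighbor-entity, and relation embeddings, and the new entity embedding is an attention-weighted sum of those triple features. The function and variable names (`attend_neighborhood`, `W`, `a`) are illustrative, not from the paper.

```python
import numpy as np

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend_neighborhood(h_i, neighbors, W, a):
    """Aggregate one entity's neighborhood with attention.

    h_i:       (d,) embedding of the central entity
    neighbors: list of (h_j, g_k) pairs -- neighbor entity and
               connecting relation embeddings, each shape (d,)
    W:         (d_out, 3*d) linear map over concatenated triple features
    a:         (d_out,) attention vector
    """
    # Triple feature: a linear map over [h_i ; h_j ; g_k]
    c = np.stack([W @ np.concatenate([h_i, h_j, g_k])
                  for h_j, g_k in neighbors])           # (n, d_out)
    # Attention logits via LeakyReLU, normalized across the neighborhood
    alpha = softmax(leaky_relu(c @ a))                  # (n,)
    # New embedding: attention-weighted sum of triple features
    return alpha @ c                                    # (d_out,)

rng = np.random.default_rng(0)
d, d_out = 4, 6
W = rng.normal(size=(d_out, 3 * d))
a = rng.normal(size=d_out)
h_i = rng.normal(size=d)
neighbors = [(rng.normal(size=d), rng.normal(size=d)) for _ in range(3)]
h_new = attend_neighborhood(h_i, neighbors, W, a)
print(h_new.shape)  # (6,)
```

Because relation embeddings enter the attention computation alongside entity embeddings, the learned weights can favor neighbors connected by informative relations; stacking such layers is what would propagate multi-hop information.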

Deepak Nathani, Jatin Chauhan, Charu Sharma, Manohar Kaul • 2019

Related benchmarks

| Task                       | Dataset           | Metric         | Result | Rank |
|----------------------------|-------------------|----------------|--------|------|
| Link Prediction            | FB15k-237 (test)  | Hits@10        | 62.6   | 419  |
| Link Prediction            | WN18RR (test)     | Hits@10        | 58.1   | 380  |
| Link Prediction            | FB15k-237         | MRR            | 15.7   | 280  |
| Knowledge Graph Completion | FB15k-237 (test)  | MRR            | 0.35   | 179  |
| Knowledge Graph Completion | WN18RR (test)     | MRR            | 0.464  | 177  |
| Link Prediction            | WN18RR            | Hits@10        | 58.1   | 175  |
| Link Prediction            | Kinship (test)    | MRR            | 0.904  | 11   |
| Relation Prediction        | NELL-995 (test)   | Mean Rank (MR) | 965    | 7    |
| Knowledge Graph Completion | NELL-995 (test)   | MRR            | 37.4   | 5    |
