
Orthogonal Relation Transforms with Graph Context Modeling for Knowledge Graph Embedding

About

Translational distance-based knowledge graph embedding has shown progressive improvement on the link prediction task, from TransE to the state-of-the-art RotatE. However, N-1, 1-N and N-N predictions remain challenging. In this work, we propose a novel translational distance-based approach for knowledge graph link prediction. The proposed method is two-fold: first, we extend RotatE from the 2D complex domain to a high-dimensional space with orthogonal transforms, giving relations greater modeling capacity. Second, graph context is explicitly modeled via two directed context representations, which are used as part of the distance scoring function to measure the plausibility of triples during training and inference. The proposed approach effectively improves prediction accuracy on the difficult N-1, 1-N and N-N cases of the link prediction task. Experimental results show that it outperforms the RotatE baseline on two benchmark data sets, especially on FB15k-237, which contains many nodes with high in-degree.
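To make the core idea concrete, the sketch below contrasts a RotatE-style score (per-dimension rotation in the complex plane) with the high-dimensional orthogonal-transform score described above. This is a minimal illustration assuming NumPy; the function names and the QR-based orthogonal parameterization are illustrative choices, not the authors' implementation, and the graph-context terms of the full scoring function are omitted.

```python
import numpy as np

def rotate_score(h, r_phase, t):
    # RotatE: each relation dimension is a rotation in the 2D complex
    # plane; the score is the negative distance || h * r - t ||,
    # where r = exp(i * phase) has unit modulus.
    r = np.exp(1j * r_phase)
    return -np.linalg.norm(h * r - t)

def orthogonal_score(h, R, t):
    # High-dimensional extension: the relation is a full orthogonal
    # matrix R (R @ R.T = I), so the transform is not restricted to
    # independent 2D rotations. Orthogonality preserves vector norms,
    # the key property RotatE's unit-modulus rotations also have.
    return -np.linalg.norm(R @ h - t)

def random_orthogonal(d, rng):
    # One simple way to obtain an orthogonal matrix: QR decomposition
    # of a random Gaussian matrix (illustrative only).
    q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    return q
```

A perfect match (t exactly equal to the transformed head) yields a score of zero, the maximum; training pushes true triples toward it and corrupted triples away from it.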

Yun Tang, Jing Huang, Guangtao Wang, Xiaodong He, Bowen Zhou• 2019

Related benchmarks

Task             Dataset            Metric    Result   Rank
Link Prediction  FB15k-237 (test)   Hits@10   55       419
Link Prediction  WN18RR (test)      Hits@10   58.7     380
Link Prediction  FB15k-237          MRR       36.1     280
Link Prediction  WN18RR             Hits@10   58.3     175
