
Relphormer: Relational Graph Transformer for Knowledge Graph Representations

About

Transformers have achieved remarkable performance across a wide range of fields, including natural language processing, computer vision, and graph mining. However, vanilla Transformer architectures have not yielded promising improvements for Knowledge Graph (KG) representations, an area still dominated by the translational distance paradigm. Vanilla Transformer architectures struggle to capture the intrinsically heterogeneous structural and semantic information of knowledge graphs. To this end, we propose a new variant of the Transformer for knowledge graph representations, dubbed Relphormer. Specifically, we introduce Triple2Seq, which dynamically samples contextualized sub-graph sequences as input to alleviate the heterogeneity issue. We propose a novel structure-enhanced self-attention mechanism to encode relational information while preserving the semantic information of entities and relations. Moreover, we utilize masked knowledge modeling for general knowledge graph representation learning, which can be applied to various KG-based tasks, including knowledge graph completion, question answering, and recommendation. Experimental results on six datasets show that Relphormer obtains better performance than baselines. Code is available at https://github.com/zjunlp/Relphormer.
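
For readers who want a concrete picture of the three components named in the abstract, below is a minimal, self-contained PyTorch sketch. It is not the authors' implementation: the sampler, the scalar transform of the structural bias, the toy model, the mask position and id, and all tensor shapes are illustrative assumptions; the real model and training code live in the repository linked above.

import random
import torch
import torch.nn as nn
import torch.nn.functional as F

def triple2seq(triple, context_nodes, k=4):
    # Hypothetical sketch of Triple2Seq: pair the center triple with k
    # context nodes sampled from its neighborhood. Re-sampling at each
    # training step yields the dynamically sampled contextualized
    # sub-graph sequences described in the abstract.
    h, r, t = triple
    sampled = random.sample(context_nodes, min(k, len(context_nodes)))
    return [h, r, t] + sampled

class StructureEnhancedSelfAttention(nn.Module):
    # Scaled dot-product attention whose scores are shifted by a learned
    # transform of a structural bias (e.g. a normalized adjacency of the
    # sampled sub-graph), so attention can respect graph structure while
    # token embeddings keep carrying the semantic information.
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.scale = dim ** -0.5
        self.phi = nn.Linear(1, 1)  # per-pair bias transform (an assumption)

    def forward(self, x, struct_bias):
        # x: (batch, seq, dim); struct_bias: (batch, seq, seq)
        scores = self.q(x) @ self.k(x).transpose(-2, -1) * self.scale
        scores = scores + self.phi(struct_bias.unsqueeze(-1)).squeeze(-1)
        return F.softmax(scores, dim=-1) @ self.v(x)

class ToyRelphormer(nn.Module):
    # Embeds a sequence of entity/relation ids (triple plus sampled
    # context) and applies a single structure-enhanced attention layer.
    def __init__(self, vocab_size, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.attn = StructureEnhancedSelfAttention(dim)
        self.head = nn.Linear(dim, vocab_size)  # scores over all ids

    def forward(self, ids, struct_bias):
        return self.head(self.attn(self.embed(ids), struct_bias))

def masked_knowledge_modeling_loss(model, ids, struct_bias, mask_pos, mask_id):
    # Mask one element of the sequence and train the model to recover it,
    # analogous to masked language modeling, but over sub-graph sequences.
    target = ids[:, mask_pos].clone()
    masked = ids.clone()
    masked[:, mask_pos] = mask_id
    logits = model(masked, struct_bias)[:, mask_pos]
    return F.cross_entropy(logits, target)

A usage example under the same assumptions: sample a sequence for a toy triple, build a random structural bias, and take one masked-prediction step.

seq = torch.tensor([triple2seq((1, 2, 3), [4, 5, 6, 7], k=2)])  # (1, 5)
bias = torch.rand(1, seq.size(1), seq.size(1))
model = ToyRelphormer(vocab_size=100)
loss = masked_knowledge_modeling_loss(model, seq, bias, mask_pos=2, mask_id=0)
loss.backward()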

Zhen Bi, Siyuan Cheng, Jing Chen, Xiaozhuan Liang, Feiyu Xiong, Ningyu Zhang • 2022

Related benchmarks

Task             Dataset            Result         Rank
Link Prediction  FB15k-237 (test)   --             419
Link Prediction  WN18RR (test)      --             380
Link Prediction  FB15k-237          MRR: 37.1      280
Link Prediction  WN18RR             Hits@10: 59.1  175
Link Prediction  UMLS               Hits@10: 99.2  56

Other info

Code: https://github.com/zjunlp/Relphormer
