
Jointly Learning Knowledge Embedding and Neighborhood Consensus with Relational Knowledge Distillation for Entity Alignment

About

Entity alignment aims to integrate heterogeneous knowledge from different knowledge graphs. Recent studies employ embedding-based methods that first learn representations of the knowledge graphs and then perform entity alignment by measuring the similarity between entity embeddings. However, these methods fail to make good use of relation semantic information due to the trade-off between the different objectives of learning knowledge embeddings and neighborhood consensus. To address this problem, we propose Relational Knowledge Distillation for Entity Alignment (RKDEA), a Graph Convolutional Network (GCN) based model equipped with knowledge distillation for entity alignment. We adopt GCN-based models to learn entity representations from the graph structure and incorporate relation semantic information into the GCN via knowledge distillation. We then introduce a novel adaptive mechanism for transferring relational knowledge so as to jointly learn entity embeddings and neighborhood consensus. Experimental results on several benchmark datasets demonstrate the effectiveness of our proposed model.
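To make the evaluation protocol concrete, here is a minimal sketch of the embedding-similarity alignment step and the Hits@k metric reported in the benchmarks below. This is an illustration of the general embedding-based alignment pipeline, not the paper's actual code; the function names, array shapes, and the toy data are our own assumptions.

```python
import numpy as np

def hits_at_k(src_emb, tgt_emb, gold, k=1):
    """Hits@k for entity alignment via cosine similarity.

    src_emb: (n, d) embeddings of source-KG entities
    tgt_emb: (m, d) embeddings of target-KG entities
    gold: gold[i] is the index in tgt_emb aligned with source entity i
    """
    # L2-normalize so that the dot product equals cosine similarity
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    sim = src @ tgt.T                      # (n, m) similarity matrix
    ranks = np.argsort(-sim, axis=1)       # candidates, most similar first
    # fraction of source entities whose gold counterpart is in the top-k
    return float(np.mean([gold[i] in ranks[i, :k] for i in range(len(gold))]))

# Toy usage: identical embeddings on both sides give a perfect score.
src = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
tgt = src.copy()
print(hits_at_k(src, tgt, gold=[0, 1, 2], k=1))  # 1.0
```

In practice the two embedding matrices come from the trained GCN encoders of the two knowledge graphs, and Hits@1 measures how often the most similar target entity is the correct counterpart.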

Xinhang Li, Yong Zhang, Chunxiao Xing · 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Entity Alignment | DBP15K FR-EN | Hits@1 | 0.622 | 158 |
| Entity Alignment | DBP15K ZH-EN | Hits@1 | 60.3 | 143 |
| Entity Alignment | DBP15K JA-EN | Hits@1 | 0.597 | 126 |
| Entity Alignment | DWY100K DBP-YG | Hits@1 | 82.3 | 51 |
| Entity Alignment | DWY100K wd | Hits@1 | 75.6 | 15 |
