Subgraph-Aware Training of Language Models for Knowledge Graph Completion Using Structure-Aware Contrastive Learning

About

Fine-tuning pre-trained language models (PLMs) has recently shown potential to improve knowledge graph completion (KGC). However, most PLM-based methods encode only textual information, neglecting the long-tailed nature of knowledge graphs and their varied topological structures, e.g., subgraphs, shortest paths, and degrees. We claim this is a major obstacle to achieving higher accuracy of PLMs for KGC. To this end, we propose a Subgraph-Aware Training framework for KGC (SATKGC) with two ideas: (i) subgraph-aware mini-batching to encourage hard negative sampling and to mitigate the imbalance in entity occurrence frequency during training, and (ii) a new contrastive learning scheme that focuses more on harder in-batch negative triples and harder positive triples in terms of the structural properties of the knowledge graph. To the best of our knowledge, this is the first study to comprehensively incorporate the structural inductive bias of the knowledge graph into fine-tuning PLMs. Extensive experiments on three KGC benchmarks demonstrate the superiority of SATKGC. Our code is available.
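The second idea, weighting in-batch negatives by structural hardness, can be sketched as a weighted InfoNCE-style loss. The sketch below is illustrative only, assuming a generic PyTorch setup: `structure_weighted_infonce`, its arguments, and the hardness-weighting scheme are hypothetical stand-ins, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def structure_weighted_infonce(query_emb, entity_emb, pos_idx, neg_weight, tau=0.05):
    """Contrastive loss over in-batch negatives, where each negative is
    up-weighted by a structural hardness score (e.g., derived from subgraph
    proximity or node degree in the knowledge graph).

    query_emb:  (B, d) PLM embeddings of (head, relation) queries
    entity_emb: (B, d) PLM embeddings of candidate tail entities
    pos_idx:    (B,)   index of the positive entity for each query
    neg_weight: (B, B) structural hardness weights for in-batch negatives
    """
    sim = query_emb @ entity_emb.t() / tau         # (B, B) similarity logits
    logits = sim + torch.log(neg_weight + 1e-9)    # harder negatives get larger logits
    # keep the positive logit unweighted
    b = torch.arange(sim.size(0))
    logits[b, pos_idx] = sim[b, pos_idx]
    return F.cross_entropy(logits, pos_idx)
```

With uniform weights this reduces to plain in-batch InfoNCE; larger weights on structurally close negatives increase their contribution to the softmax denominator, so the model is penalized more for confusing them with the true tail entity.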

Youmin Ko, Hyemin Yang, Taeuk Kim, Hyunjoon Kim • 2024

Related benchmarks

Task                        Dataset         Result      Rank
Knowledge Base Completion   WebQSP 30% KB   MRR 0.491   16
Knowledge Base Completion   WebQSP 50% KB   MRR 51      16
Knowledge Base Completion   CWQ 30% KB      MRR 47.6    16
Knowledge Base Completion   CWQ 50% KB      MRR 50.3    16
