
Dipping PLMs Sauce: Bridging Structure and Text for Effective Knowledge Graph Completion via Conditional Soft Prompting

About

Knowledge Graph Completion (KGC) often requires both KG structural and textual information to be effective. Pre-trained Language Models (PLMs) have been used to learn the textual information, usually under the fine-tuning paradigm for the KGC task. However, fine-tuned PLMs often overwhelmingly focus on the textual information and overlook structural knowledge. To tackle this issue, this paper proposes CSProm-KG (Conditional Soft Prompts for KGC), which maintains a balance between structural information and textual knowledge. CSProm-KG only tunes the parameters of Conditional Soft Prompts, which are generated from the entity and relation representations. We verify the effectiveness of CSProm-KG on three popular static KGC benchmarks, WN18RR, FB15K-237 and Wikidata5M, and two temporal KGC benchmarks, ICEWS14 and ICEWS05-15. CSProm-KG outperforms competitive baseline models and sets a new state-of-the-art on these benchmarks. We conduct further analysis to show (i) the effectiveness of our proposed components, (ii) the efficiency of CSProm-KG, and (iii) the flexibility of CSProm-KG.

Chen Chen, Yufei Wang, Aixin Sun, Bing Li, Kwok-Yan Lam • 2023
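
The key idea is parameter-efficient: the PLM stays frozen, and only a small prompt generator (plus the KG entity and relation embeddings) is trained, so the structural signal conditions what the PLM encodes. Below is a minimal PyTorch sketch of this idea, assuming a HuggingFace-style encoder; the names (PromptGenerator, CSPromKGSketch) are illustrative assumptions, not the authors' actual code, which additionally feeds the output to a structural KGC scorer.

```python
import torch
import torch.nn as nn

class PromptGenerator(nn.Module):
    """Maps (entity, relation) embeddings to a sequence of soft prompt vectors."""
    def __init__(self, kg_dim: int, plm_dim: int, prompt_len: int):
        super().__init__()
        self.prompt_len = prompt_len
        self.plm_dim = plm_dim
        self.proj = nn.Linear(2 * kg_dim, prompt_len * plm_dim)

    def forward(self, ent_emb, rel_emb):
        cond = torch.cat([ent_emb, rel_emb], dim=-1)        # (batch, 2*kg_dim)
        return self.proj(cond).view(-1, self.prompt_len, self.plm_dim)

class CSPromKGSketch(nn.Module):
    def __init__(self, plm, num_ents, num_rels, kg_dim=200, prompt_len=10):
        super().__init__()
        self.plm = plm
        for p in self.plm.parameters():                     # keep the PLM frozen
            p.requires_grad = False
        self.ent_emb = nn.Embedding(num_ents, kg_dim)
        self.rel_emb = nn.Embedding(num_rels, kg_dim)
        self.prompt_gen = PromptGenerator(kg_dim, plm.config.hidden_size, prompt_len)

    def forward(self, ent_ids, rel_ids, input_ids, attention_mask):
        # Soft prompts conditioned on the query's (entity, relation) pair.
        prompts = self.prompt_gen(self.ent_emb(ent_ids), self.rel_emb(rel_ids))
        # Word embeddings of the textual descriptions; position embeddings
        # are added inside the PLM when inputs_embeds is used.
        tok_emb = self.plm.get_input_embeddings()(input_ids)
        inputs = torch.cat([prompts, tok_emb], dim=1)       # prepend the prompts
        prompt_mask = torch.ones(prompts.shape[:2], dtype=attention_mask.dtype,
                                 device=attention_mask.device)
        mask = torch.cat([prompt_mask, attention_mask], dim=1)
        out = self.plm(inputs_embeds=inputs, attention_mask=mask).last_hidden_state
        # Pool the prompt positions; a structural scorer (e.g. ConvE-style)
        # would rank candidate tail entities from this representation.
        return out[:, :prompts.size(1)].mean(dim=1)
```

Since gradients flow only through the prompt generator and the KG embeddings, training touches a small fraction of the total parameters while the frozen PLM still contributes its textual knowledge.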

Related benchmarks

Task                       | Dataset       | Metric  | Result | Rank
Link Prediction            | WN18RR (test) | Hits@10 | 67.8   | 380
Link Prediction            | FB15k-237     | MRR     | 35.8   | 280
Knowledge Graph Completion | WN18RR        | Hits@1  | 52.2   | 165
Knowledge Graph Completion | FB15k-237     | Hits@10 | 0.538  | 108
Link Prediction            | YAGO3-10      | MRR     | 0.488  | 33
Link Prediction            | PrimeKG       | MR      | 157    | 8
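For reference, the metrics in this table are the standard KGC ranking measures: MR (mean rank, lower is better), MRR (mean reciprocal rank) and Hits@K (the fraction of test triples whose gold entity ranks in the top K). A minimal sketch, assuming `ranks` holds the gold entity's rank (1 = best) for each test triple:

```python
# Standard KGC ranking metrics; kgc_metrics is an illustrative helper,
# not part of any particular library.
def kgc_metrics(ranks: list[int], ks=(1, 3, 10)) -> dict[str, float]:
    n = len(ranks)
    metrics = {
        "MR": sum(ranks) / n,                    # mean rank (lower is better)
        "MRR": sum(1.0 / r for r in ranks) / n,  # mean reciprocal rank
    }
    for k in ks:
        metrics[f"Hits@{k}"] = sum(r <= k for r in ranks) / n
    return metrics

print(kgc_metrics([1, 2, 5, 12]))
# {'MR': 5.0, 'MRR': 0.4458..., 'Hits@1': 0.25, 'Hits@3': 0.5, 'Hits@10': 0.75}
```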
