
LLM as Prompter: Low-resource Inductive Reasoning on Arbitrary Knowledge Graphs

About

Knowledge Graph (KG) inductive reasoning, which aims to infer missing facts from new KGs not seen during training, has been widely adopted in various applications. One critical challenge of KG inductive reasoning is handling low-resource scenarios that are scarce in both textual and structural information. In this paper, we attempt to address this challenge with Large Language Models (LLMs). In particular, we utilize state-of-the-art LLMs to generate a graph-structural prompt that enhances pre-trained Graph Neural Networks (GNNs), which brings new methodological insights into KG inductive reasoning methods as well as high generalizability in practice. On the methodological side, we introduce a novel pretraining and prompting framework, ProLINK, designed for low-resource inductive reasoning across arbitrary KGs without requiring additional training. On the practical side, we experimentally evaluate our approach on 36 low-resource KG datasets and find that ProLINK outperforms previous methods in three-shot, one-shot, and zero-shot reasoning tasks, with average performance improvements of 20%, 45%, and 147%, respectively. Furthermore, ProLINK demonstrates strong robustness across various LLM prompts as well as in full-shot scenarios.
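The paper does not reproduce ProLINK's actual prompt format here, but the core idea of turning the few support triples available for an unseen relation into an LLM query can be illustrated with a generic, hypothetical sketch (the function name, prompt wording, and example relations below are illustrative, not from the paper):

```python
def build_relation_prompt(relation, support_triples, known_relations):
    """Hypothetical low-resource prompt: describe a new relation via its
    k support triples and ask the LLM to relate it to known relations."""
    examples = "\n".join(f"({h}, {relation}, {t})" for h, t in support_triples)
    return (
        f"A new knowledge graph contains the relation '{relation}' "
        f"with these example facts:\n{examples}\n"
        f"Which of these known relations behave similarly: "
        f"{', '.join(known_relations)}?"
    )

# One-shot vs. three-shot simply changes how many support triples are given.
prompt = build_relation_prompt(
    "team_plays_in_city",
    [("Lakers", "Los_Angeles"), ("Knicks", "New_York")],
    ["located_in", "works_for", "born_in"],
)
print(prompt)
```

In ProLINK the LLM's answer is distilled into a graph-structural prompt for the pre-trained GNN rather than used as a final prediction; this sketch only shows the text-side input under those assumptions.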

Kai Wang, Yuwei Xu, Zhiyong Wu, Siqiang Luo • 2024

Related benchmarks

Task                        Dataset                    Metric   Result   Rank
Inductive Link Prediction   NELL V2                    Hit@10   78.7     11
Inductive Link Prediction   FB v2                      Hit@10   74.5     11
Inductive Link Prediction   FB v3                      Hit@10   68.3     11
Inductive Link Prediction   NELL V3                    Hit@10   0.762    11
Inductive Link Prediction   NELL V4                    Hit@10   76.9     11
Inductive Link Prediction   FB v1                      Hit@10   0.692    11
Inductive Link Prediction   WN v4                      Hit@10   73.3     11
Knowledge Graph Reasoning   IndE 12 datasets (test)    Hit@10   73.3     11
Inductive Link Prediction   NELL V1                    Hit@10   88.3     11
Inductive Link Prediction   WN v1                      Hit@10   78.8     11

Showing 10 of 35 rows.
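All results above report Hit@10 (Hits@10), the standard link-prediction metric: the fraction of test triples for which the correct entity is ranked among the top 10 candidates. A minimal sketch of the computation (the function and example ranks are illustrative, not the paper's evaluation code):

```python
def hits_at_k(ranks, k=10):
    """Fraction of queries whose true entity is ranked within the top k.

    `ranks` holds, for each test triple, the rank of the correct entity
    among all candidate entities (1 = best).
    """
    return sum(1 for r in ranks if r <= k) / len(ranks)

# Example: ranks of the true tail entity for 5 hypothetical queries.
ranks = [1, 4, 12, 3, 25]
print(hits_at_k(ranks))  # 0.6, i.e. 60.0 when reported as a percentage
```

This explains why the table mixes scales: some rows report the raw fraction (e.g. 0.762) while others report the percentage form (e.g. 78.7).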
