
A Cross-graph Tuning-free GNN Prompting Framework

About

GNN prompting aims to adapt models across tasks and graphs without extensive retraining. However, most existing graph prompt methods still require task-specific parameter updates and struggle to generalize across graphs, limiting their performance and undermining the core promise of prompting. In this work, we introduce a Cross-graph Tuning-free Prompting Framework (CTP), which supports both homogeneous and heterogeneous graphs and can be deployed directly to unseen graphs without further parameter tuning, thus enabling a plug-and-play GNN inference engine. Extensive experiments on few-shot prediction tasks show that, compared to state-of-the-art methods, CTP achieves an average accuracy gain of 30.8% and a maximum gain of 54%, confirming its effectiveness and offering a new perspective on graph prompt learning.

Yaqi Chen, Shixun Huang, Ryan Twemlow, Lei Wang, John Le, Sheng Wang, Willy Susilo, Jun Yan, Jun Shen • 2026

Related benchmarks

| Task                | Dataset            | Result          | Rank |
|---------------------|--------------------|-----------------|------|
| Link Prediction     | FB15k-237 (test)   | --              | 419  |
| Node Classification | arXiv              | Accuracy: 66.09 | 219  |
| Link Prediction     | NELL (test)        | Accuracy: 84.32 | 40   |
| Link Prediction     | ConceptNet (test)  | Accuracy: 46.3  | 10   |
