
LoRAP: Low-Rank Aggregation Prompting for Quantized Graph Neural Networks Training

About

Graph Neural Networks (GNNs) are neural networks designed to process graph-structured data, capturing the relationships and interactions between nodes through the message-passing mechanism. GNN quantization has emerged as a promising approach for reducing model size and accelerating inference in resource-constrained environments. Unlike quantization in LLMs, where the focus is largely on weights, GNN quantization places greater emphasis on quantizing graph features. Motivated by this, we propose to leverage prompt learning, which manipulates the input data rather than the model parameters, to improve the performance of quantization-aware training (QAT) for GNNs. Because prompting the node features alone can optimize only part of the quantized aggregation result, we introduce Low-Rank Aggregation Prompting (LoRAP), which injects lightweight, input-dependent prompts into each aggregated feature to optimize the outputs of quantized aggregations. Extensive evaluations on 4 leading QAT frameworks over 9 graph datasets demonstrate that LoRAP consistently enhances the performance of low-bit quantized GNNs while introducing minimal computational overhead.
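The core idea described above can be illustrated with a minimal sketch: fake-quantize the node features, aggregate them over the graph, and then add a lightweight low-rank, input-dependent prompt to each aggregated feature. All names, shapes, and the specific quantizer below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def fake_quantize(x, num_bits=4):
    # Uniform symmetric fake-quantization (quantize, then dequantize),
    # a common stand-in for low-bit QAT; the paper's quantizer may differ.
    scale = np.max(np.abs(x)) / (2 ** (num_bits - 1) - 1) + 1e-12
    return np.round(x / scale) * scale

class LoRAPLayer:
    """Hypothetical sketch of Low-Rank Aggregation Prompting: a low-rank,
    input-dependent prompt is added to each quantized aggregated feature."""

    def __init__(self, dim, rank, seed=0):
        rng = np.random.default_rng(seed)
        # Low-rank factorization: a d x r "down" map and an r x d "up" map,
        # so the prompt adds only 2*d*r parameters per layer.
        self.down = rng.standard_normal((dim, rank)) * 0.1
        self.up = rng.standard_normal((rank, dim)) * 0.1

    def __call__(self, adj, x, num_bits=4):
        xq = fake_quantize(x, num_bits)       # low-bit node features
        agg = adj @ xq                        # quantized sum aggregation
        prompt = (agg @ self.down) @ self.up  # input-dependent, rank-r prompt
        return agg + prompt                   # prompted aggregation result

# Toy graph: 3 nodes with 8-dimensional features.
adj = np.array([[0., 1., 1.],
                [1., 0., 0.],
                [1., 0., 0.]])
x = np.random.default_rng(1).standard_normal((3, 8))
layer = LoRAPLayer(dim=8, rank=2)
out = layer(adj, x)
print(out.shape)  # (3, 8)
```

In a real QAT setup, `down` and `up` would be trained jointly with the quantized GNN; the low rank keeps the added compute and parameter count small, matching the paper's claim of minimal overhead.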

Chenyu Liu, Haige Li, Luca Rossi • 2026

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Node Classification | Ogbn-arxiv | Accuracy 73.6 | 191 |
| Graph Classification | CIFAR10 | Accuracy 62.9 | 108 |
| Graph Classification | REDDIT BINARY | Accuracy 76.6 | 107 |
| Graph Classification | MNIST | Accuracy 96.4 | 95 |
| Graph Regression | ZINC | MSE 0.361 | 49 |
| Graph Classification | REDDIT-B (test) | Accuracy 94.6 | 32 |
| Node Classification | Cora | Accuracy 79.1 | 18 |
| Node Classification | ogbn-mag | Accuracy 34.2 | 11 |
| Node Classification | Ogbn-arxiv | Accuracy 73.6 | 9 |
| Graph Classification | MNIST | Accuracy 96.4 | 5 |
Showing 10 of 11 rows
