
Graph Retention Networks for Dynamic Graphs

About

In this paper, we propose Graph Retention Networks (GRNs) as a unified architecture for deep learning on dynamic graphs. The GRN extends the concept of retention to dynamic graph data as graph retention, equipping the model with three key computational paradigms: parallelizable training, low-cost $\mathcal{O}(1)$ inference, and long-term chunkwise training. This architecture achieves a favorable balance between efficiency, effectiveness, and scalability. Extensive experiments on benchmark datasets demonstrate strong performance on both edge-level prediction and node-level classification tasks, with significantly reduced training latency, lower GPU memory overhead, and inference throughput improved by up to 86.7x over SOTA baselines. The proposed GRN architecture achieves competitive performance across diverse dynamic graph benchmarks, demonstrating its adaptability to a wide range of tasks.
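The three computational paradigms named in the abstract come from the retention mechanism that GRNs build on: the same sequence operator can be evaluated in a fully parallel form (for training), a recurrent form with constant per-step state (for $\mathcal{O}(1)$ inference), and a chunkwise form that mixes the two (for long sequences). The paper does not include code here, so the following is a minimal pure-Python sketch of standard retention with scalar queries, keys, and values, showing that all three forms compute the same outputs; it is illustrative only and not the authors' implementation.

```python
import random

# Toy scalar retention: o_n = sum_{m<=n} gamma^(n-m) * q_n * k_m * v_m
random.seed(0)
T = 6            # sequence length
gamma = 0.9      # retention decay factor
q = [random.random() for _ in range(T)]
k = [random.random() for _ in range(T)]
v = [random.random() for _ in range(T)]

# 1) Parallel form (training): every output computed independently.
o_parallel = [
    sum(gamma ** (n - m) * q[n] * k[m] * v[m] for m in range(n + 1))
    for n in range(T)
]

# 2) Recurrent form (inference): one running state s, O(1) work per step.
#    s_n = gamma * s_{n-1} + k_n * v_n ;  o_n = q_n * s_n
s = 0.0
o_recurrent = []
for t in range(T):
    s = gamma * s + k[t] * v[t]
    o_recurrent.append(q[t] * s)

# 3) Chunkwise form (long sequences): parallel within each chunk,
#    recurrent state s_prev carried across chunk boundaries.
B = 3  # chunk size (assumes T % B == 0 for simplicity)
o_chunk = []
s_prev = 0.0
for c in range(0, T, B):
    for n in range(c, c + B):
        inner = sum(gamma ** (n - m) * k[m] * v[m] for m in range(c, n + 1))
        o_chunk.append(q[n] * (gamma ** (n - c + 1) * s_prev + inner))
    # Decay the carried state over the chunk and absorb this chunk's pairs.
    s_prev = gamma ** B * s_prev + sum(
        gamma ** (c + B - 1 - m) * k[m] * v[m] for m in range(c, c + B)
    )

assert all(abs(a - b) < 1e-9 for a, b in zip(o_parallel, o_recurrent))
assert all(abs(a - b) < 1e-9 for a, b in zip(o_chunk, o_recurrent))
```

In the real mechanism the scalars become $d$-dimensional projections and the state a $d \times d$ matrix, but the algebra is the same: the decay $\gamma^{n-m}$ factors so that history can be summarized in a single fixed-size state.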

Qian Chang, Xia Li, Xiufeng Cheng, Runsong Jia, Jinqing Yang, Guoping Hu, Ciprian Doru Giurcaneanu · 2024

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Node Classification | REDDIT | -- | 192 |
| Link Prediction | Reddit (inductive) | AP 98.57 | 81 |
| Link Prediction | Enron (inductive) | AP 81.39 | 66 |
| Inductive dynamic link prediction | Reddit (inductive) | AUC-ROC (%) 98.9 | 65 |
| Link Prediction | Enron (transductive) | -- | 49 |
| Dynamic Link Prediction | UN Vote | AP 83.09 | 37 |
| Dynamic Link Prediction | UN Trade | AP 71.54 | 37 |
| Dynamic Link Prediction | US Legislature | AP 79.21 | 37 |
| Dynamic Link Prediction | Contact | AP 89.34 | 37 |
| Link Prediction | LastFM (inductive) | AP 83.44 | 35 |

Showing 10 of 51 rows
