
Overcoming Catastrophic Forgetting in Graph Neural Networks with Experience Replay

About

Graph Neural Networks (GNNs) have recently received significant research attention due to their superior performance on a variety of graph-related learning tasks. Most current works focus on either static or dynamic graph settings, addressing a single particular task, e.g., node/graph classification or link prediction. In this work, we investigate the question: can GNNs be applied to continually learning a sequence of tasks? Towards that, we explore the Continual Graph Learning (CGL) paradigm and present ER-GNN, an Experience Replay based framework for CGL that alleviates the catastrophic forgetting problem in existing GNNs. ER-GNN stores knowledge from previous tasks as experiences and replays them when learning new tasks to mitigate forgetting. We propose three experience node selection strategies: mean of feature, coverage maximization, and influence maximization, to guide the process of selecting experience nodes. Extensive experiments on three benchmark datasets demonstrate the effectiveness of our ER-GNN and shed light on incremental learning over graph (non-Euclidean) structures.
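The simplest of the three selection strategies, mean of feature, can be sketched as follows: for each class seen in the current task, compute the class's mean feature vector (its prototype) and keep the nodes whose features lie closest to it as experience nodes for replay. This is an illustrative sketch, not the authors' code; the function name and interface are assumptions.

```python
import numpy as np

def select_experience_nodes(features, labels, budget_per_class=1):
    """Mean-of-feature selection (sketch): for each class, keep the
    nodes whose feature vectors lie closest to the class mean.

    features: (num_nodes, feat_dim) array of node features
    labels:   (num_nodes,) array of class labels
    Returns a list of selected node indices (the experience buffer).
    """
    selected = []
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        prototype = features[idx].mean(axis=0)
        # distance of each class member to the class prototype
        dists = np.linalg.norm(features[idx] - prototype, axis=1)
        # keep the budget_per_class nodes nearest to the prototype
        nearest = idx[np.argsort(dists)[:budget_per_class]]
        selected.extend(nearest.tolist())
    return selected
```

When a new task arrives, the buffered experience nodes are mixed into its training batches, so the loss is computed over both new-task nodes and replayed ones.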

Fan Zhou, Chengtai Cao • 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Node Classification | Reddit (test) | – | – | 137 |
| Node Classification | ACM | – | – | 126 |
| Node Classification | Kindle | F1 (AP) | 78.64 | 45 |
| Node Classification | DBLP | F1 (AP) | 78.02 | 45 |
| Node Classification | CoraFull (test) | Final AP | 44.19 | 33 |
| Graph Continual Learning | CoraFull (test) | AA | 34.5 | 28 |
| Graph Continual Learning | arXiv (test) | AA | 21.5 | 16 |
| Graph Continual Learning | Products (test) | AA | 48.3 | 16 |
| Graph Continual Learning | Reddit (test) | AA | 82.7 | 16 |
| Node Classification | Kindle (test) | F1 (AP) | 73.67 | 15 |

(10 of 27 rows shown.)
