Overcoming Catastrophic Forgetting in Graph Neural Networks with Experience Replay
About
Graph Neural Networks (GNNs) have recently received significant research attention due to their superior performance on a variety of graph-related learning tasks. Most current works focus on either static or dynamic graph settings and address a single particular task, e.g., node/graph classification or link prediction. In this work, we investigate the question: can GNNs be applied to continually learning a sequence of tasks? Towards that, we explore the Continual Graph Learning (CGL) paradigm and present ER-GNN, an Experience Replay based framework for CGL that alleviates the catastrophic forgetting problem in existing GNNs. ER-GNN stores knowledge from previous tasks as experiences and replays them when learning new tasks to mitigate catastrophic forgetting. We propose three experience node selection strategies: mean of feature, coverage maximization, and influence maximization, to guide the process of selecting experience nodes. Extensive experiments on three benchmark datasets demonstrate the effectiveness of our ER-GNN and shed light on incremental learning over graph (non-Euclidean) structures.
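The mean-of-feature strategy described above can be sketched as follows. This is a hedged illustration, not the authors' implementation: it assumes node features are given as a NumPy array and selects, per class, the nodes whose feature vectors lie closest to that class's mean (the paper may apply the same idea to hidden embeddings instead).

```python
import numpy as np

def mean_of_feature_select(features, labels, budget_per_class):
    """Sketch of a mean-of-feature experience selection strategy.

    For each class, compute the mean feature vector (a class prototype)
    and keep the `budget_per_class` nodes closest to it in L2 distance.
    Returns the indices of the selected experience nodes.
    """
    selected = []
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        prototype = features[idx].mean(axis=0)            # class prototype
        dists = np.linalg.norm(features[idx] - prototype, axis=1)
        order = np.argsort(dists)[:budget_per_class]      # closest first
        selected.extend(idx[order].tolist())
    return selected
```

The selected indices would then be appended to the experience buffer and mixed into the loss when training on subsequent tasks.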
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Node Classification | Reddit (test) | -- | 137 |
| Node Classification | ACM | -- | 126 |
| Node Classification | Kindle | F1 (AP): 78.64 | 45 |
| Node Classification | DBLP | F1 (AP): 78.02 | 45 |
| Node Classification | CoraFull (test) | Final AP: 44.19 | 33 |
| Graph Continual Learning | CoraFull (test) | AA: 34.5 | 28 |
| Graph Continual Learning | arXiv (test) | AA: 21.5 | 16 |
| Graph Continual Learning | Products (test) | AA: 48.3 | 16 |
| Graph Continual Learning | Reddit (test) | AA: 82.7 | 16 |
| Node Classification | Kindle (test) | F1 (AP): 73.67 | 15 |