# Pre-Training on Dynamic Graph Neural Networks

## About
Pre-training a graph neural network with self-supervised methods lets the model learn general features of large-scale networks, or of networks of the same type, so that it remains usable even when node labels are missing. However, existing pre-training methods do not take network evolution into consideration. This paper proposes a pre-training method for dynamic graph neural networks (PT-DGNN), which uses a dynamic attributed graph generation task to jointly learn the structural, semantic, and evolutionary features of the graph. The method consists of two steps: 1) dynamic sub-graph sampling, and 2) pre-training with the dynamic attributed graph generation task. Comparative experiments on three real-world dynamic network datasets show that the proposed method achieves the best results on the link-prediction fine-tuning task.
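Step 1, dynamic sub-graph sampling, can be pictured as drawing edges with a bias toward recent interactions, so the sampled sub-graph reflects how the network has evolved. The sketch below is illustrative only: the function name `sample_dynamic_subgraph` and the recency weighting are assumptions, not the paper's exact procedure.

```python
import random

def sample_dynamic_subgraph(edges, num_samples, seed=0):
    """Time-aware sub-graph sampling (illustrative sketch).

    `edges` is a list of (u, v, t) triples. Newer edges (larger t)
    receive larger weights, so the sampled sub-graph emphasizes the
    network's most recent evolution. The weighting scheme here is a
    hypothetical stand-in for the paper's sampling strategy.
    """
    rng = random.Random(seed)
    t_min = min(t for _, _, t in edges)
    # Weight each edge by its recency (the newest edges weigh the most).
    weighted = [((u, v, t), t - t_min + 1) for u, v, t in edges]
    # Weighted sampling without replacement (Efraimidis-Spirakis keys):
    # each edge gets key u^(1/w); the largest keys form the sample.
    keyed = sorted(weighted,
                   key=lambda ew: rng.random() ** (1.0 / ew[1]),
                   reverse=True)
    return [edge for edge, _ in keyed[:num_samples]]

# Toy temporal edge list: (source, target, timestamp).
edges = [("a", "b", 1), ("b", "c", 2), ("c", "d", 3), ("a", "d", 4)]
subgraph = sample_dynamic_subgraph(edges, num_samples=2, seed=42)
```

In step 2, a model would then be pre-trained to reconstruct the attributes and (masked) edges of such sub-graphs before being fine-tuned on link prediction.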
## Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Temporal Link Prediction | ICEWS1819 transductive | ROC-AUC | 0.9083 | 8 |
| Temporal Link Prediction | ICEWS1819 inductive | ROC-AUC (%) | 71.06 | 8 |
| Temporal Link Prediction | Googlemap CT transductive | ROC-AUC | 0.6699 | 8 |
| Temporal Link Prediction | Googlemap CT inductive | ROC-AUC (%) | 51.2 | 8 |
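The benchmark results above are reported as ROC-AUC, which for link prediction measures how often a true (positive) edge is scored above a non-edge (negative). As a reference, ROC-AUC can be computed from score ranks; the helper below is a minimal pure-Python sketch (not from the paper's code) that assumes no tied scores for brevity.

```python
def roc_auc(pos_scores, neg_scores):
    """Rank-based ROC-AUC for link prediction (assumes no tied scores).

    Equivalent to the probability that a randomly chosen positive
    edge is scored higher than a randomly chosen negative edge.
    """
    # Rank all scores ascending, 1-based.
    all_scores = sorted(pos_scores + neg_scores)
    rank = {s: i + 1 for i, s in enumerate(all_scores)}
    rank_sum = sum(rank[s] for s in pos_scores)
    n_pos, n_neg = len(pos_scores), len(neg_scores)
    # Mann-Whitney U statistic normalized to [0, 1].
    return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Perfect separation of positives from negatives gives AUC = 1.0.
print(roc_auc([0.9, 0.8], [0.1, 0.2]))  # → 1.0
# One mis-ranked pair out of four gives AUC = 0.75.
print(roc_auc([0.8, 0.3], [0.5, 0.1]))  # → 0.75
```

Note that the table mixes two conventions for the same metric: a fraction in [0, 1] for the transductive rows and a percentage for the inductive rows.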