
Pre-Training on Dynamic Graph Neural Networks

About

Pre-training a graph neural network model can learn general features of large-scale networks, or of networks of the same type, through self-supervised methods, which allows the model to work even when node labels are missing. However, existing pre-training methods do not take network evolution into consideration. This paper proposes a pre-training method on dynamic graph neural networks (PT-DGNN), which uses a dynamic attributed graph generation task to simultaneously learn the structural, semantic, and evolution features of the graph. The method comprises two steps: 1) dynamic sub-graph sampling, and 2) pre-training with the dynamic attributed graph generation task. Comparative experiments on three realistic dynamic network datasets show that the proposed method achieves the best results on the link prediction fine-tuning task.
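The first step, dynamic sub-graph sampling, is not detailed in the abstract, but the general idea of time-aware sampling on a temporal edge list can be sketched as below. This is a minimal illustrative sketch, not the paper's exact algorithm: the recency-biased acceptance probability, the `alpha` parameter, and the edge `budget` are all assumptions introduced for illustration.

```python
import random
from collections import defaultdict

def dynamic_subgraph_sample(edges, seed_node, budget, alpha=1.0, rng=None):
    """Sample a temporal sub-graph around `seed_node`.

    `edges` is a list of (u, v, t) tuples. Newer edges (larger t) are
    accepted with higher probability, loosely mirroring the idea that a
    dynamic pre-training task should emphasize recent interactions.
    NOTE: this recency weighting is a hypothetical stand-in, not the
    sampling scheme used by PT-DGNN.
    """
    rng = rng or random.Random(0)

    # Undirected adjacency list that keeps edge timestamps.
    adj = defaultdict(list)
    for u, v, t in edges:
        adj[u].append((v, t))
        adj[v].append((u, t))

    t_max = max(t for _, _, t in edges)
    visited, frontier = set(), [seed_node]
    sub_edges = []

    while frontier and len(sub_edges) < budget:
        node = frontier.pop()
        visited.add(node)
        for nbr, t in adj[node]:
            # Recency-biased acceptance: edges near t_max survive more often.
            p = (1.0 + t) / (1.0 + t_max)
            if rng.random() < p ** (1.0 / alpha):
                sub_edges.append((node, nbr, t))
                if nbr not in visited:
                    frontier.append(nbr)
            if len(sub_edges) >= budget:
                break
    return sub_edges
```

The sampled edge list could then feed a graph generation objective (e.g. predicting masked attributes and held-out future edges) during pre-training.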

Ke-jia Chen, Jiajun Zhang, Linpu Jiang, Yunyun Wang, Yuxuan Dai • 2021

Related benchmarks

Task                      Dataset                       Metric       Result   Rank
Temporal Link Prediction  ICEWS1819 (transductive)      ROC-AUC      0.9083   8
Temporal Link Prediction  ICEWS1819 (inductive)         ROC-AUC      71.06    8
Temporal Link Prediction  Googlemap CT (transductive)   ROC-AUC      0.6699   8
Temporal Link Prediction  Googlemap CT (inductive)      ROC-AUC (%)  51.2     8
