
DyG2Vec: Efficient Representation Learning for Dynamic Graphs

About

Temporal graph neural networks have shown promising results in learning inductive representations by automatically extracting temporal patterns. However, previous works often rely on complex memory modules or inefficient random walk methods to construct temporal representations. To address these limitations, we present an efficient yet effective attention-based encoder that leverages temporal edge encodings and window-based subgraph sampling to generate task-agnostic embeddings. Moreover, we propose a joint-embedding architecture using non-contrastive SSL to learn rich temporal embeddings without labels. Experimental results on 7 benchmark datasets indicate that, on average, our model outperforms SoTA baselines on the future link prediction task by 4.23% in the transductive setting and 3.30% in the inductive setting, while requiring 5-10x less training/inference time. Lastly, different aspects of the proposed framework are investigated through experimental analysis and ablation studies. The code is publicly available at https://github.com/huawei-noah/noah-research/tree/master/graph_atlas.
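The window-based subgraph sampling mentioned in the abstract can be illustrated with a minimal sketch: given a temporal edge list sorted by timestamp, select only the edges whose timestamps fall inside a fixed-length window ending at the prediction time. This is an assumption-laden illustration of the general technique, not the authors' actual sampler; the function name, window semantics, and data layout here are hypothetical.

```python
from bisect import bisect_left

def window_subgraph(edges, t_end, window):
    """Return edges whose timestamps fall in [t_end - window, t_end).

    edges: list of (src, dst, timestamp) tuples sorted by timestamp.
    A hedged sketch of fixed-window temporal subgraph sampling; the
    DyG2Vec sampler's exact interface and semantics may differ.
    """
    times = [t for _, _, t in edges]
    lo = bisect_left(times, t_end - window)   # first edge inside the window
    hi = bisect_left(times, t_end)            # first edge at/after t_end
    return edges[lo:hi]
```

Because the edge list is sorted by time, the two binary searches make each sample O(log E) plus the size of the returned slice, which is one way such sampling avoids the cost of per-node random walks.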

Mohammad Ali Alomrani, Mahdi Biparva, Yingxue Zhang, Mark Coates • 2022

Related benchmarks

Task                                 | Dataset              | Metric | Result | Rank
Link Prediction                      | Reddit (inductive)   | AP     | 99.1   | 52
Link Prediction                      | Enron (inductive)    | AP     | 97.6   | 37
Link Prediction                      | Reddit (transductive)| AP     | 99.6   | 30
Link Prediction                      | Enron (transductive) | AP     | 99.1   | 28
Link Prediction                      | LastFM (transductive)| AP     | 97.9   | 28
Link Prediction                      | UCI (inductive)      | AP     | 97.9   | 26
Link Prediction                      | LastFM (inductive)   | AP     | 98.7   | 26
Future Link Prediction               | MOOC (inductive)     | AP     | 93.8   | 19
Transductive future link prediction  | Wikipedia            | AP     | 99.5   | 9
Transductive future link prediction  | UCI                  | AP     | 98.8   | 9

Showing 10 of 18 rows
