
Deep Graph Contrastive Representation Learning

About

Graph representation learning has become fundamental in analyzing graph-structured data. Inspired by the recent success of contrastive methods, in this paper we propose a novel framework for unsupervised graph representation learning that leverages a contrastive objective at the node level. Specifically, we generate two graph views by corruption and learn node representations by maximizing the agreement of node representations across these two views. To provide diverse node contexts for the contrastive objective, we propose a hybrid scheme that generates graph views at both the structure and attribute levels. In addition, we provide theoretical justification for our motivation from two perspectives: mutual information and the classical triplet loss. We perform empirical experiments on both transductive and inductive learning tasks using a variety of real-world datasets. Experimental results demonstrate that despite its simplicity, our proposed method consistently outperforms existing state-of-the-art methods by large margins. Moreover, our unsupervised method even surpasses its supervised counterparts on transductive tasks, demonstrating its great potential in real-world applications.
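The pipeline described in the abstract (corrupt the graph into two views at the structure and attribute levels, encode each view, then pull matched nodes together with a node-level contrastive objective) can be sketched as below. This is a simplified illustration under stated assumptions, not the authors' implementation: the GNN encoder is stubbed out, and the loss shown is a plain InfoNCE-style objective that omits the intra-view negative pairs used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def drop_edges(edges, p):
    """Structure-level corruption: randomly remove a fraction p of edges."""
    keep = rng.random(len(edges)) >= p
    return edges[keep]

def mask_features(x, p):
    """Attribute-level corruption: zero out a random subset of feature dims."""
    keep = rng.random(x.shape[1]) >= p
    return x * keep

def nt_xent(z1, z2, tau=0.5):
    """Node-level contrastive loss: each node's embedding in view 1 should
    agree with the same node's embedding in view 2 (simplified; the paper's
    loss also counts intra-view negatives)."""
    z1 = z1 / (np.linalg.norm(z1, axis=1, keepdims=True) + 1e-8)
    z2 = z2 / (np.linalg.norm(z2, axis=1, keepdims=True) + 1e-8)
    sim = np.exp(z1 @ z2.T / tau)           # cross-view pairwise similarities
    pos = np.diag(sim)                      # same node across the two views
    return float(-np.log(pos / sim.sum(axis=1)).mean())

# Toy example: 6 nodes, 4-dim features, a small edge list.
x = rng.standard_normal((6, 4))
edges = np.array([[0, 1], [1, 2], [2, 3], [3, 4], [4, 5]])

# Two corrupted views of the same graph.
x1, e1 = mask_features(x, 0.3), drop_edges(edges, 0.2)
x2, e2 = mask_features(x, 0.3), drop_edges(edges, 0.2)

# A shared GNN encoder would map each view to node embeddings; here the
# corrupted features stand in for embeddings to keep the sketch runnable.
loss = nt_xent(x1, x2)
```

In the full method, the loss is minimized with respect to the encoder's parameters, so that each node's representation becomes invariant to the structure- and attribute-level corruptions while remaining distinguishable from other nodes.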

Yanqiao Zhu, Yichen Xu, Feng Yu, Qiang Liu, Shu Wu, Liang Wang• 2020

Related benchmarks

Task                 Dataset             Metric          Result   Rank
Node Classification  Cora                Accuracy        84.79    1215
Node Classification  Citeseer            Accuracy        71.72    931
Node Classification  Cora (test)         Mean Accuracy   88.56    861
Node Classification  Citeseer (test)     Accuracy        73.79    824
Node Classification  PubMed              Accuracy        87.04    819
Node Classification  Chameleon           Accuracy        68.25    640
Node Classification  Squirrel            Accuracy        53.15    591
Node Classification  PubMed (test)       Accuracy        86.21    546
Node Classification  ogbn-arxiv (test)   Accuracy        65.1     433
Node Classification  PubMed              Accuracy        80.6     396

Showing 10 of 128 rows
