
From Canonical Correlation Analysis to Self-supervised Graph Neural Networks

About

We introduce a conceptually simple yet effective model for self-supervised representation learning with graph data. Like previous methods, it generates two views of an input graph through data augmentation. However, unlike contrastive methods that focus on instance-level discrimination, we optimize an innovative feature-level objective inspired by classical Canonical Correlation Analysis. Compared with other works, our approach requires no parameterized mutual information estimator, no additional projector, no asymmetric structures, and, most importantly, no negative samples, which can be costly. We show that the new objective essentially 1) aims at discarding augmentation-variant information by learning invariant representations, and 2) prevents degenerate solutions by decorrelating features across dimensions. Our theoretical analysis further provides an understanding of the new objective, which can be equivalently seen as an instantiation of the Information Bottleneck Principle under the self-supervised setting. Despite its simplicity, our method performs competitively on seven public graph datasets. The code is available at: https://github.com/hengruizhang98/CCA-SSG.
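The two ingredients the abstract names, an invariance term that aligns the two augmented views and a decorrelation term that pushes each view's feature covariance toward the identity, can be sketched as a loss function. This is a minimal NumPy illustration of that idea, not the authors' implementation; the standardization step (zero-mean, scaled unit-variance columns so that Z^T Z acts as a correlation matrix) and the weight `lam` are assumptions about how such an objective is typically set up.

```python
import numpy as np

def standardize(z):
    # Column-wise standardization: each feature dimension gets zero mean,
    # and the scaling by sqrt(N) makes z.T @ z a correlation-like matrix.
    z = z - z.mean(axis=0)
    return z / (z.std(axis=0) * np.sqrt(z.shape[0]) + 1e-8)

def cca_ssg_style_loss(z1, z2, lam=1e-3):
    """Feature-level objective on two views' node embeddings (N x D)."""
    z1, z2 = standardize(z1), standardize(z2)
    # Invariance term: pull the two views' representations together,
    # discarding augmentation-variant information.
    invariance = np.sum((z1 - z2) ** 2)
    # Decorrelation term: penalize off-identity feature covariance in
    # each view, preventing degenerate (collapsed) solutions.
    d = z1.shape[1]
    c1 = z1.T @ z1 - np.eye(d)
    c2 = z2.T @ z2 - np.eye(d)
    decorrelation = np.sum(c1 ** 2) + np.sum(c2 ** 2)
    return invariance + lam * decorrelation
```

With identical views the invariance term vanishes and only the decorrelation penalty remains, which is why no negative samples or asymmetric branches are needed to avoid collapse.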

Hengrui Zhang, Qitian Wu, Junchi Yan, David Wipf, Philip S. Yu · 2021

Related benchmarks

| Task                | Dataset          | Metric        | Result (%) | Rank |
|---------------------|------------------|---------------|------------|------|
| Node Classification | Cora             | Accuracy      | 84.0       | 1215 |
| Node Classification | Citeseer         | Accuracy      | 73.1       | 931  |
| Node Classification | Cora (test)      | Mean Accuracy | 87.39      | 861  |
| Node Classification | Citeseer (test)  | Accuracy      | 79.6       | 824  |
| Node Classification | Pubmed           | Accuracy      | 81.1       | 819  |
| Node Classification | Chameleon        | Accuracy      | 39.46      | 640  |
| Node Classification | Wisconsin        | Accuracy      | 58.46      | 627  |
| Node Classification | Texas            | Accuracy      | 59.89      | 616  |
| Node Classification | Squirrel         | Accuracy      | 41.23      | 591  |
| Node Classification | Cornell          | Accuracy      | 52.17      | 582  |

Showing 10 of 55 rows.
