
Why Does Dropping Edges Usually Outperform Adding Edges in Graph Contrastive Learning?

About

Graph contrastive learning (GCL) has been widely used as an effective self-supervised method for graph representation learning. However, how to apply adequate and stable graph augmentations to generate proper views for contrastive learning remains an essential problem. Dropping edges is a primary augmentation in GCL, while adding edges is uncommon due to its unstable performance. To the best of our knowledge, there is no theoretical analysis of why dropping edges usually outperforms adding edges. To answer this question, we introduce a new metric, the Error Passing Rate (EPR), to quantify how well a graph fits the network. Inspired by our theoretical conclusions and the idea of positive-incentive noise, we propose a novel GCL algorithm, Error-PAssing-based Graph Contrastive Learning (EPAGCL), which uses both edge adding and edge dropping as augmentations. Specifically, we generate views by adding and dropping edges according to weights derived from the EPR. Extensive experiments on various real-world datasets validate the correctness of our theoretical analysis and the effectiveness of the proposed algorithm. Our code is available at: https://github.com/hyzhang98/EPAGCL.
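The abstract's view-generation step, dropping (or adding) edges with probabilities driven by per-edge weights, can be sketched as below. This is a minimal illustration, not the authors' implementation: the actual EPR computation is defined in the paper, so `edge_weights` here is a hypothetical stand-in for EPR-derived scores, and `generate_view` is an illustrative name.

```python
import numpy as np

def generate_view(edge_index, edge_weights, drop_ratio=0.2, rng=None):
    """Drop edges with probability proportional to their weights.

    edge_index:   (2, E) array of source/target node indices.
    edge_weights: (E,) nonnegative scores; higher score = more likely dropped.
    Returns the edge_index of the augmented view.
    """
    if rng is None:
        rng = np.random.default_rng()
    E = edge_index.shape[1]
    p = edge_weights / edge_weights.sum()   # normalize scores to a distribution
    n_drop = int(drop_ratio * E)
    # Sample the edges to remove, weighted by the (assumed EPR-based) scores.
    drop = rng.choice(E, size=n_drop, replace=False, p=p)
    keep = np.setdiff1d(np.arange(E), drop)
    return edge_index[:, keep]
```

Calling this twice with independent random states yields two correlated views of the same graph, which a contrastive loss can then pull together in embedding space; edge adding would follow the same pattern, sampling candidate non-edges by weight instead.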

Yanchen Xu, Siqi Huang, Hongyuan Zhang, Xuelong Li • 2024

Related benchmarks

Task                 Dataset          Result                 Rank
Node Classification  Citeseer         Accuracy: 71.94        931
Node Classification  Cora (test)      Mean Accuracy: 82.1    861
Node Classification  Citeseer (test)  Accuracy: 71.94        824
Node Classification  Pubmed           Accuracy: 81.28        819
Node Classification  Chameleon        Accuracy: 35.43        640
Node Classification  Wisconsin        Accuracy: 63.73        627
Node Classification  Texas            Accuracy: 73.51        616
Node Classification  Squirrel         Accuracy: 40.28        591
Node Classification  Cornell          Accuracy: 55.68        582
Node Classification  Actor            Accuracy: 30.05        397

(Showing 10 of 20 rows.)
