
Hypergraph Self-supervised Learning with Sampling-efficient Signals

About

Self-supervised learning (SSL) provides a promising alternative for representation learning on hypergraphs without costly labels. However, existing hypergraph SSL models are mostly contrastive methods built on an instance-level discrimination strategy, which suffers from two significant limitations: (1) They select negative samples arbitrarily, which is unreliable for deciding similar and dissimilar pairs and causes training bias. (2) They often require a large number of negative samples, resulting in expensive computational costs. To address these issues, we propose SE-HSSL, a hypergraph SSL framework with three sampling-efficient self-supervised signals. Specifically, we introduce two sampling-free objectives leveraging canonical correlation analysis as the node-level and group-level self-supervised signals. Additionally, we develop a novel hierarchical membership-level contrast objective motivated by the cascading overlap relationship in hypergraphs, which further reduces membership sampling bias and improves the efficiency of sample utilization. Through comprehensive experiments on 7 real-world hypergraphs, we demonstrate the superiority of our approach over state-of-the-art methods in terms of both effectiveness and efficiency.
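To make the sampling-free idea concrete, here is a minimal sketch of a canonical-correlation-style SSL objective of the kind the abstract describes (in the spirit of CCA-based methods such as CCA-SSG). This is an illustrative assumption, not the paper's actual implementation: the function name `cca_ssl_loss`, the weight `lam`, and the plain NumPy formulation are all hypothetical. An invariance term aligns two augmented views, while decorrelation terms push each view's feature covariance toward the identity, removing the need for negative samples entirely.

```python
import numpy as np

def cca_ssl_loss(z1, z2, lam=1e-3):
    """CCA-style sampling-free SSL loss (illustrative sketch, not SE-HSSL's code).

    z1, z2: (N, D) embeddings of two augmented views of the same N nodes.
    lam:    hypothetical trade-off weight for the decorrelation terms.
    """
    n, d = z1.shape
    # Standardize each feature dimension across the batch.
    z1 = (z1 - z1.mean(0)) / (z1.std(0) + 1e-8)
    z2 = (z2 - z2.mean(0)) / (z2.std(0) + 1e-8)
    # Invariance term: pull the two views of each node together.
    inv = ((z1 - z2) ** 2).sum() / n
    # Decorrelation terms: drive each view's covariance toward identity,
    # which prevents collapse without any negative sampling.
    c1 = (z1.T @ z1) / n
    c2 = (z2.T @ z2) / n
    eye = np.eye(d)
    dec = ((c1 - eye) ** 2).sum() + ((c2 - eye) ** 2).sum()
    return inv + lam * dec
```

Because the loss touches only the two (N, D) view matrices, its cost is O(N·D²) rather than the O(N²·D) of pairwise contrastive objectives, which is the efficiency argument the abstract makes.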

Fan Li, Xiaoyang Wang, Dawei Cheng, Wenjie Zhang, Ying Zhang, Xuemin Lin (2024)

Related benchmarks

Task             Dataset       Metric           Result  Rank
Node Clustering  Citeseer      NMI              43.8    130
Node Clustering  Cora-C        Accuracy (ACC)   72      7
Node Clustering  Cora-A        Accuracy (ACC)   61      7
Node Clustering  NTU 2012      Accuracy (ACC)   70.5    7
Node Clustering  Mushroom      Accuracy (ACC)   78.3    6
Node Clustering  20News W100   Accuracy (ACC)   70.2    4
