
Total Variation Graph Neural Networks

About

Recently proposed Graph Neural Networks (GNNs) for vertex clustering are trained with an unsupervised minimum cut objective, approximated by a Spectral Clustering (SC) relaxation. However, the SC relaxation is loose and, while it offers a closed-form solution, it also yields overly smooth cluster assignments that poorly separate the vertices. In this paper, we propose a GNN model that computes cluster assignments by optimizing a tighter relaxation of the minimum cut based on graph total variation (GTV). The cluster assignments can be used directly to perform vertex clustering or to implement graph pooling in a graph classification framework. Our model consists of two core components: i) a message-passing layer that minimizes the $\ell_1$ distance in the features of adjacent vertices, which is key to achieving sharp transitions between clusters; ii) an unsupervised loss function that minimizes the GTV of the cluster assignments while ensuring balanced partitions. Experimental results show that our model outperforms other GNNs for vertex clustering and graph classification.
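To make the central quantity concrete, here is a minimal sketch of the graph total variation term that the abstract says the loss minimizes: the sum over edges of the $\ell_1$ distance between the cluster-assignment vectors of adjacent vertices. The function name and array layout are assumptions for illustration; this is only the GTV term, not the authors' full balanced loss.

```python
import numpy as np

def graph_total_variation(S, edge_index, edge_weight=None):
    """Graph total variation of soft cluster assignments (illustrative sketch).

    GTV(S) = sum over edges (i, j) of w_ij * ||S_i - S_j||_1,
    i.e., the l1 distance between assignment vectors of adjacent vertices.
    Sharp (near one-hot) assignments that change only across cluster
    boundaries give a small GTV; smooth assignments give a large one.

    S           : (N, K) soft cluster-assignment matrix (rows sum to 1).
    edge_index  : (2, E) integer array of edge endpoints.
    edge_weight : (E,) optional edge weights (defaults to all ones).
    """
    src, dst = edge_index
    # l1 distance between the assignment rows of each edge's endpoints
    diffs = np.abs(S[src] - S[dst]).sum(axis=1)
    if edge_weight is None:
        edge_weight = np.ones(len(diffs))
    return float((edge_weight * diffs).sum())
```

For example, on a 3-vertex path with vertices 0 and 1 in one cluster and vertex 2 in the other, only the boundary edge (1, 2) contributes, with $\ell_1$ distance 2.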

Jonas Berg Hansen, Filippo Maria Bianchi • 2022

Related benchmarks

Task                  Dataset          Metric    Result  Rank
Graph Classification  NCI1             Accuracy  78       501
Node Clustering       Cora             Accuracy  53.24    133
Node Clustering       Citeseer         NMI       27       130
Graph Classification  MolHIV           ROC AUC   75        88
Graph Classification  REDDIT-B         Accuracy  91        84
Graph Regression      Peptides-struct  MAE       0.29      76
Graph Classification  Peptides-func    AP        75        41
Node Clustering       DBLP             NMI       26        39
Graph Classification  Multipartite     Accuracy  0.62      30
Clustering            DBLP             Accuracy  50.67     30
Showing 10 of 19 rows
