
Total Variation Graph Neural Networks

About

Recently proposed Graph Neural Networks (GNNs) for vertex clustering are trained with an unsupervised minimum cut objective, approximated by a Spectral Clustering (SC) relaxation. However, the SC relaxation is loose and, while it offers a closed-form solution, it also yields overly smooth cluster assignments that poorly separate the vertices. In this paper, we propose a GNN model that computes cluster assignments by optimizing a tighter relaxation of the minimum cut based on graph total variation (GTV). The cluster assignments can be used directly to perform vertex clustering or to implement graph pooling in a graph classification framework. Our model consists of two core components: i) a message-passing layer that minimizes the $\ell_1$ distance in the features of adjacent vertices, which is key to achieving sharp transitions between clusters; ii) an unsupervised loss function that minimizes the GTV of the cluster assignments while ensuring balanced partitions. Experimental results show that our model outperforms other GNNs for vertex clustering and graph classification.
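The central quantity in the loss described above is the graph total variation of the cluster assignments, i.e. the sum over edges of the $\ell_1$ distance between the assignment vectors of adjacent vertices. As a minimal illustrative sketch (not the paper's full loss, which also includes a balance term), the function name and example graph below are invented for illustration:

```python
import numpy as np

def graph_total_variation(S, edges):
    """Graph total variation of a soft cluster assignment matrix.

    S:     (N, K) array; row i holds vertex i's cluster probabilities.
    edges: iterable of (i, j) vertex index pairs.

    Returns sum over edges of ||S[i] - S[j]||_1. Sharp, piecewise-constant
    assignments that only change across cluster boundaries give a low GTV.
    """
    return sum(np.abs(S[i] - S[j]).sum() for i, j in edges)

# Toy path graph 0-1-2-3 split into two clusters: the only contribution
# comes from the single boundary edge (1, 2).
S = np.array([[1.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0],
              [0.0, 1.0]])
edges = [(0, 1), (1, 2), (2, 3)]
gtv = graph_total_variation(S, edges)  # -> 2.0
```

Minimizing this quantity (subject to a balance constraint) encourages exactly the sharp transitions between clusters that the looser spectral relaxation fails to produce.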

Jonas Berg Hansen, Filippo Maria Bianchi • 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Graph Classification | NCI1 | Accuracy | 78 | 460 |
| Node Clustering | Cora | Accuracy | 53.24 | 115 |
| Node Clustering | Citeseer | NMI | 27 | 110 |
| Graph Classification | MolHIV | ROC AUC | 75 | 82 |
| Graph Classification | REDDIT-B | Accuracy | 91 | 71 |
| Graph Regression | Peptides-struct | MAE | 0.29 | 51 |
| Node Clustering | DBLP | NMI | 26 | 39 |
| Clustering | DBLP | Accuracy | 50.67 | 27 |
| Graph Classification | Peptides-func | AP | 75 | 22 |
| Graph Classification | Multipartite | Accuracy | 0.62 | 17 |

Showing 10 of 19 rows.
