
Edge Contraction Pooling for Graph Neural Networks

About

Graph Neural Network (GNN) research has concentrated on improving convolutional layers, with little attention paid to developing graph pooling layers. Yet pooling layers can enable GNNs to reason over abstracted groups of nodes instead of single nodes. To close this gap, we propose a graph pooling layer relying on the notion of edge contraction: EdgePool learns a localized and sparse hard pooling transform. We show that EdgePool outperforms alternative pooling methods, can be easily integrated into most GNN models, and improves performance on both node and graph classification.
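The edge-contraction idea can be illustrated with a minimal sketch: score each edge from its endpoint features, then greedily contract non-overlapping edges in order of descending score, so every node participates in at most one merge. This is a simplified illustration in plain NumPy, not the paper's reference implementation; the scoring layer (`w`, `b`), the sigmoid gating, and the `edge_pool` helper itself are assumptions for exposition.

```python
import numpy as np

def edge_pool(x, edges, w, b):
    """Hypothetical EdgePool-style sketch: contract high-scoring edges.

    x     : (num_nodes, d) node feature matrix
    edges : list of (u, v) index pairs
    w, b  : parameters of a linear scoring layer on concatenated
            endpoint features (shape (2*d,) and scalar)
    """
    # Raw edge scores from a linear layer on concatenated endpoint features.
    scores = np.array([np.concatenate([x[u], x[v]]) @ w + b for u, v in edges])
    order = np.argsort(-scores)      # highest-scoring edges first
    merged = set()                   # nodes already contracted
    cluster = {}                     # old node index -> new cluster id
    new_feats = {}
    next_id = 0
    for idx in order:
        u, v = edges[idx]
        if u in merged or v in merged:
            continue                 # each node contracts at most once
        merged.update((u, v))
        gate = 1.0 / (1.0 + np.exp(-scores[idx]))   # sigmoid gating
        new_feats[next_id] = gate * (x[u] + x[v])   # merged node feature
        cluster[u] = cluster[v] = next_id
        next_id += 1
    # Unmatched nodes survive as singleton clusters.
    for n in range(len(x)):
        if n not in merged:
            new_feats[next_id] = x[n]
            cluster[n] = next_id
            next_id += 1
    pooled = np.stack([new_feats[i] for i in range(next_id)])
    return pooled, cluster

# Toy chain graph with 4 nodes and 3 edges: at most two disjoint
# edges can be contracted, halving the node count.
x = np.arange(8, dtype=float).reshape(4, 2)
pooled, cluster = edge_pool(x, [(0, 1), (1, 2), (2, 3)], np.ones(4), 0.0)
```

Because the contraction is a hard, localized assignment (each new node covers exactly one contracted edge or one leftover node), the pooled graph keeps a sparse structure, which matches the paper's description of EdgePool as a "localized and sparse hard pooling transform."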

Frederik Diehl • 2019

Related benchmarks

Task | Dataset | Metric | Result | Rank
Graph Classification | NCI1 | Accuracy | 76 | 460
Graph Classification | MUTAG (test) | Accuracy | 81.41 | 217
Graph Classification | MUTAG (10-fold cross-validation) | Accuracy | 74.17 | 206
Graph Classification | PROTEINS (10-fold cross-validation) | Accuracy | 75.12 | 197
Graph Classification | PROTEINS (test) | Accuracy | 82.38 | 180
Graph Classification | NCI1 (test) | Accuracy | 76.56 | 174
Graph Classification | MolHIV | ROC AUC | 75 | 82
Graph Classification | ENZYMES (test) | Accuracy | 65.33 | 77
Graph Classification | REDDIT-B | Accuracy | 90 | 71
Graph Classification | NCI109 (test) | Accuracy | 79.02 | 64
Showing 10 of 25 rows
