
Edge Contraction Pooling for Graph Neural Networks

About

Graph Neural Network (GNN) research has concentrated on improving convolutional layers, with little attention paid to developing graph pooling layers. Yet pooling layers can enable GNNs to reason over abstracted groups of nodes instead of single nodes. To close this gap, we propose a graph pooling layer relying on the notion of edge contraction: EdgePool learns a localized and sparse hard pooling transform. We show that EdgePool outperforms alternative pooling methods, can be easily integrated into most GNN models, and improves performance on both node and graph classification.

Frederik Diehl • 2019
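The core idea above can be sketched in a few lines: score every edge from its endpoint features, greedily contract the highest-scoring non-overlapping edges, and gate the merged features by the edge score so the pooling stays differentiable. The sketch below is an illustrative NumPy version, not the paper's exact implementation — the scoring weights `w`, `b`, the sigmoid gating, and the greedy tie-breaking are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def edge_pool(x, edges, w, b):
    """One EdgePool-style step (sketch): score edges, greedily
    contract the highest-scoring edges, merge node features.

    x: (N, F) node features; edges: list of (i, j) pairs;
    w: (2F,) score weights, b: scalar bias (learned in practice).
    """
    # Score each edge from the concatenated endpoint features.
    scores = [float(np.concatenate([x[i], x[j]]) @ w + b) for i, j in edges]

    # Greedily pick non-overlapping edges in descending score order.
    order = np.argsort(scores)[::-1]
    merged, chosen = set(), []
    for k in order:
        i, j = edges[k]
        if i not in merged and j not in merged:
            chosen.append((i, j, scores[k]))
            merged.update((i, j))

    # Each chosen pair becomes one new node; unmatched nodes
    # are kept as singleton clusters.
    cluster, new_feats = {}, []
    for i, j, s in chosen:
        cluster[i] = cluster[j] = len(new_feats)
        # Gate merged features by a sigmoid of the edge score
        # (an assumed stand-in for the paper's learned gating).
        gate = 1.0 / (1.0 + np.exp(-s))
        new_feats.append(gate * (x[i] + x[j]))
    for n in range(x.shape[0]):
        if n not in cluster:
            cluster[n] = len(new_feats)
            new_feats.append(x[n])

    # Re-map surviving edges between clusters, dropping self-loops.
    new_edges = {(cluster[i], cluster[j]) for i, j in edges
                 if cluster[i] != cluster[j]}
    return np.stack(new_feats), sorted(new_edges), cluster

# Toy graph: a 4-node path 0-1-2-3, pooled to 2 or 3 nodes
# depending on which edge scores highest.
x = rng.normal(size=(4, 3))
edges = [(0, 1), (1, 2), (2, 3)]
w, b = rng.normal(size=6), 0.0
px, pe, assign = edge_pool(x, edges, w, b)
print(px.shape, pe)
```

Because each contraction removes one node, repeated EdgePool layers roughly halve the graph, which is what lets later GNN layers reason over abstracted groups of nodes.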

Related benchmarks

Task                  Dataset                                Result          Rank
Graph Classification  PROTEINS                               Accuracy 74     994
Graph Classification  MUTAG                                  Accuracy 84     862
Graph Classification  NCI1                                   Accuracy 77     501
Graph Classification  COLLAB                                 Accuracy 72     422
Graph Classification  ENZYMES                                Accuracy 35     318
Graph Classification  DD                                     Accuracy 73     273
Graph Classification  MUTAG (10-fold cross-validation)       Accuracy 74.17  219
Graph Classification  MUTAG (test)                           Accuracy 81.41  217
Graph Classification  PROTEINS (10-fold cross-validation)    Accuracy 75.12  214
Graph Classification  PROTEINS (test)                        Accuracy 82.38  180

Showing 10 of 34 rows
