
Self-Attention Graph Pooling

About

Advanced methods for applying deep learning to structured data such as graphs have been proposed in recent years. In particular, studies have focused on generalizing convolutional neural networks to graph data, which involves redefining the convolution and downsampling (pooling) operations for graphs. Generalizing the convolution operation to graphs has been shown to improve performance and is widely used. However, downsampling on graphs remains difficult and leaves room for improvement. In this paper, we propose a graph pooling method based on self-attention. Self-attention using graph convolution allows our pooling method to consider both node features and graph topology. To ensure a fair comparison, the same training procedures and model architectures were used for the existing pooling methods and our method. The experimental results demonstrate that our method achieves superior graph classification performance on the benchmark datasets using a reasonable number of parameters.
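The pooling idea in the abstract can be sketched in a few lines: a graph convolution produces a self-attention score per node (so the score depends on both node features and topology), the top-ranked nodes are kept, and their features are gated by the scores. The following is a minimal NumPy sketch under stated assumptions; the function name `sagpool`, the weight `W_att`, and the top-k details are illustrative, not the authors' released implementation.

```python
import numpy as np

def sagpool(X, A, W_att, ratio=0.5):
    """Sketch of one self-attention graph pooling step.

    X     : (N, F) node feature matrix
    A     : (N, N) symmetric adjacency matrix
    W_att : (F, 1) attention weight (hypothetical, would be learned)
    ratio : fraction of nodes to keep
    """
    N = X.shape[0]
    # Symmetrically normalized adjacency with self-loops: D^-1/2 (A+I) D^-1/2
    A_hat = A + np.eye(N)
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    # Self-attention score per node from a single-channel graph convolution
    scores = np.tanh(A_norm @ X @ W_att).ravel()
    # Keep the ceil(ratio * N) highest-scoring nodes
    k = int(np.ceil(ratio * N))
    idx = np.argsort(scores)[-k:]
    # Gate retained features by their scores; restrict adjacency to kept nodes
    X_pool = X[idx] * scores[idx, None]
    A_pool = A[np.ix_(idx, idx)]
    return X_pool, A_pool, idx
```

Because the score comes from a graph convolution rather than a plain projection of node features, two nodes with identical features can still receive different scores if their neighborhoods differ, which is the property the abstract highlights.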

Junhyun Lee, Inyeop Lee, Jaewoo Kang • 2019

Related benchmarks

Task                 | Dataset                             | Accuracy | Rank
---------------------|-------------------------------------|----------|-----
Graph Classification | PROTEINS                            | 72.02    | 742
Graph Classification | MUTAG                               | 76.78    | 697
Graph Classification | NCI1                                | 72       | 460
Graph Classification | COLLAB                              | 78.85    | 329
Graph Classification | IMDB-B                              | 71.86    | 322
Graph Classification | NCI109                              | 67.86    | 223
Graph Classification | Mutag (test)                        | 79.72    | 217
Graph Classification | MUTAG (10-fold cross-validation)    | 73.67    | 206
Graph Classification | PROTEINS (10-fold cross-validation) | 71.56    | 197
Graph Classification | PROTEINS (test)                     | 81.72    | 180

Showing 10 of 53 rows.

