
Understanding and Resolving Performance Degradation in Graph Convolutional Networks

About

A Graph Convolutional Network (GCN) stacks several layers, each of which performs a PROPagation operation (PROP) and a TRANsformation operation (TRAN) to learn node representations over graph-structured data. Though powerful, GCNs tend to suffer a performance drop as they get deeper. Previous works focus on PROPs to study and mitigate this issue, but the role of TRANs is barely investigated. In this work, we study the performance degradation of GCNs by experimentally examining how stacking only TRANs or only PROPs behaves. We find that TRANs contribute significantly, and even more than PROPs, to the declining performance, and moreover that they tend to amplify node-wise feature variance in GCNs, causing a variance inflammation that we identify as a key factor in the performance drop. Motivated by these observations, we propose a variance-controlling technique termed Node Normalization (NodeNorm), which scales each node's features using its own standard deviation. Experimental results validate the effectiveness of NodeNorm in addressing the performance degradation of GCNs. Specifically, it enables deep GCNs to outperform shallow ones in cases where deep models are needed, and to achieve results comparable with shallow ones on 6 benchmark datasets. NodeNorm is a generic plug-in and generalizes well to other GNN architectures. Code is publicly available at https://github.com/miafei/NodeNorm.
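The core idea of NodeNorm, as described above, is to rescale each node's feature vector by that node's own standard deviation, keeping node-wise variance under control as layers stack. A minimal NumPy sketch of this per-node scaling follows; the `eps` stabilizer is an assumption for numerical safety, and the released implementation (see the repository above) may differ in details such as using a root of the standard deviation.

```python
import numpy as np

def node_norm(h, eps=1e-6):
    """Scale each node's feature vector by its own standard deviation.

    h: array of shape (num_nodes, feature_dim), one row per node.
    Returns an array of the same shape in which every node's features
    have (approximately) unit standard deviation.
    """
    # Per-node std, computed across the feature dimension.
    std = h.std(axis=1, keepdims=True)
    # eps guards against division by zero for constant rows (an assumption,
    # not necessarily part of the paper's formulation).
    return h / (std + eps)

# Usage: normalize a batch of node embeddings between GCN layers.
rng = np.random.default_rng(0)
h = rng.normal(scale=3.0, size=(5, 16))  # 5 nodes, 16-dim features
h_normed = node_norm(h)
```

Because the operation is per-node and parameter-free, it can be dropped in after any layer of a GNN without changing the rest of the architecture, which is what makes it a generic plug-in.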

Kuangqi Zhou, Yanfei Dong, Kaixin Wang, Wee Sun Lee, Bryan Hooi, Huan Xu, Jiashi Feng • 2020

Related benchmarks

Task                            Dataset           Metric    Result   Rank
Graph Regression                ZINC 12K (test)   MAE       0.2119   164
Graph Classification            MolHIV            ROC-AUC   75.5     82
Graph Classification            MOLTOX21          ROC-AUC   0.7327   38
Molecular property prediction   MOLESOL           RMSE      1.068    37
Graph Classification            MOLBACE           ROC-AUC   0.7567   31
