
GraphNorm: A Principled Approach to Accelerating Graph Neural Network Training

About

Normalization is known to help the optimization of deep neural networks. Curiously, different architectures require specialized normalization methods. In this paper, we study what normalization is effective for Graph Neural Networks (GNNs). First, we adapt and evaluate the existing methods from other domains to GNNs. Faster convergence is achieved with InstanceNorm compared to BatchNorm and LayerNorm. We provide an explanation by showing that InstanceNorm serves as a preconditioner for GNNs, but such preconditioning effect is weaker with BatchNorm due to the heavy batch noise in graph datasets. Second, we show that the shift operation in InstanceNorm results in an expressiveness degradation of GNNs for highly regular graphs. We address this issue by proposing GraphNorm with a learnable shift. Empirically, GNNs with GraphNorm converge faster compared to GNNs using other normalization. GraphNorm also improves the generalization of GNNs, achieving better performance on graph classification benchmarks.
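The normalization described above can be sketched in a few lines. The following is a minimal NumPy illustration of GraphNorm applied to one graph's node features; the function name and parameter layout are illustrative, not the authors' reference implementation. The key difference from InstanceNorm is the learnable shift `alpha`: the mean is scaled by `alpha` before being subtracted, so the network can learn how much mean information to discard (InstanceNorm corresponds to fixing `alpha` at 1).

```python
import numpy as np

def graph_norm(h, alpha, gamma, beta, eps=1e-5):
    """GraphNorm over a single graph's node features (illustrative sketch).

    h:     (num_nodes, d) node feature matrix for one graph
    alpha: (d,) learnable shift -- the part that distinguishes
           GraphNorm from InstanceNorm (which fixes alpha = 1)
    gamma: (d,) learnable scale
    beta:  (d,) learnable bias
    """
    mu = h.mean(axis=0)                            # per-feature mean over nodes
    shifted = h - alpha * mu                       # learnable shift of the mean
    sigma = np.sqrt((shifted ** 2).mean(axis=0) + eps)
    return gamma * shifted / sigma + beta
```

In a GNN, this would be applied after each message-passing layer, normalizing each graph in the batch independently (unlike BatchNorm, which pools statistics across graphs and so suffers from the batch noise discussed above).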

Tianle Cai, Shengjie Luo, Keyulu Xu, Di He, Tie-Yan Liu, Liwei Wang • 2020

Related benchmarks

Task                 | Dataset               | Metric   | Result | Rank
Graph Classification | MUTAG (test)          | Accuracy | 91.6   | 217
Graph Classification | PROTEINS (test)       | Accuracy | 77.4   | 180
Graph Classification | NCI1 (test)           | Accuracy | 0.814  | 174
Graph Regression     | ZINC 12K (test)       | MAE      | 0.3104 | 164
Graph Classification | IMDB-B (test)         | Accuracy | 76     | 134
Graph Classification | COLLAB (test)         | Accuracy | 80.2   | 96
Graph Classification | OGBG-MOLHIV v1 (test) | ROC-AUC  | 0.7883 | 88
Graph Classification | MolHIV                | ROC-AUC  | 78.08  | 82
Graph Classification | PTC (test)            | Accuracy | 64.9   | 49
Graph Classification | MOLTOX21              | ROC-AUC  | 0.7354 | 38

(Showing 10 of 16 rows)
