
Graph-Adaptive Activation Functions for Graph Neural Networks

About

Activation functions are crucial in graph neural networks (GNNs) because they define a nonlinear family of functions that captures the relationship between the input graph data and their representations. This paper proposes activation functions for GNNs that not only incorporate the graph into the nonlinearity but are also distributable. To extend the feature-topology coupling to all GNN components, nodal features are nonlinearized and combined with a set of trainable parameters in a form akin to graph convolutions. This yields a graph-adaptive, trainable nonlinear component of the GNN that can be implemented directly or via kernel transformations, thereby enriching the class of functions available to represent the network data. We show that permutation equivariance is always preserved, whether in the direct or the kernel form. We also prove that the subclass of graph-adaptive max activation functions is Lipschitz stable to input perturbations. Numerical experiments with distributed source localization, finite-time consensus, distributed regression, and recommender systems corroborate our findings and show improved performance compared with pointwise as well as state-of-the-art localized nonlinearities.
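The graph-adaptive max activation mentioned in the abstract can be sketched as a weighted combination of neighborhood maxima. The following is a minimal NumPy illustration under one plausible reading: the function name, the scalar-feature setting, and the use of k-hop reachability computed from the support of the shift operator are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def graph_adaptive_max_activation(x, S, theta):
    """Hedged sketch: each node combines the maxima of its k-hop
    neighborhoods with trainable weights theta, in a form akin to a
    graph convolution.

    x:     (N,) node features
    S:     (N, N) graph shift operator (e.g. adjacency matrix)
    theta: (K+1,) trainable coefficients, one per neighborhood radius
    """
    N = x.shape[0]
    hop1 = (S != 0).astype(int)
    np.fill_diagonal(hop1, 1)          # self-loops so neighborhoods nest
    reach = np.eye(N, dtype=int)       # 0-hop reachability
    out = theta[0] * x                 # k = 0 term: the node itself
    for k in range(1, len(theta)):
        # grow reachability to the k-hop neighborhood
        reach = (reach @ hop1 > 0).astype(int)
        # max over each node's k-hop neighborhood
        nbr_max = np.array([x[reach[i].astype(bool)].max() for i in range(N)])
        out = out + theta[k] * nbr_max
    return out
```

Because each term depends only on the set of values in a neighborhood, relabeling the nodes permutes the output in the same way, which is the permutation-equivariance property the abstract refers to; the computation is also local per node, hence distributable.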

Bianca Iancu, Luana Ruiz, Alejandro Ribeiro, Elvin Isufi • 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Graph Classification | PROTEINS | Accuracy | 75.9 | 742 |
| Graph Classification | MUTAG | Accuracy | 92 | 697 |
| Graph Classification | NCI1 | Accuracy | 83.6 | 460 |
| Graph Classification | NCI109 | Accuracy | 82.8 | 223 |
| Graph Classification | PTC | Accuracy | 67.7 | 167 |
| Graph Classification | MOLTOX21 | ROC-AUC | 0.755 | 38 |
| Graph Classification | MOLBACE | ROC-AUC | 0.7726 | 31 |
| Regression | molesol OGB | RMSE | 1.049 | 26 |
| Regression | ZINC 12K (test) | MAE | 0.1661 | 15 |
| Graph Classification | OGB-MOLHIV | ROC-AUC | 0.7344 | 15 |
