
Stochastic Training of Graph Convolutional Networks with Variance Reduction

About

Graph convolutional networks (GCNs) are powerful deep neural networks for graph-structured data. However, a GCN computes the representation of a node recursively from its neighbors, so the receptive field size grows exponentially with the number of layers. Previous attempts to reduce the receptive field size by subsampling neighbors lack a convergence guarantee, and their receptive field size per node is still on the order of hundreds. In this paper, we develop control variate based algorithms that allow sampling an arbitrarily small number of neighbors. Furthermore, we prove a new theoretical guarantee that our algorithms converge to a local optimum of GCN. Empirical results show that our algorithms achieve convergence similar to the exact algorithm using only two neighbors per node. The runtime of our algorithms on a large Reddit dataset is only one-seventh that of previous neighbor sampling algorithms.
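The core idea can be illustrated with a short sketch. This is not the authors' implementation; it is a minimal numpy illustration of a control-variate estimator of the aggregation `P @ H` (normalized adjacency times activations), where a few sampled neighbors correct a cheap baseline built from stale "historical" activations `H_hist`. The function name `cv_aggregate` and the dense-matrix setup are illustrative assumptions.

```python
import numpy as np

def cv_aggregate(P, H, H_hist, sample_size, rng):
    """Control-variate estimate of P @ H.

    P:       (n, n) normalized adjacency matrix.
    H:       (n, d) current (fresh) activations.
    H_hist:  (n, d) stale activations saved from earlier iterations.
    For each node, only `sample_size` neighbors are touched with fresh
    activations; the rest are covered by the historical baseline.
    """
    # Baseline term: exact aggregation over cheap historical activations.
    out = P @ H_hist
    n = P.shape[0]
    for v in range(n):
        nbrs = np.nonzero(P[v])[0]
        if len(nbrs) == 0:
            continue
        k = min(sample_size, len(nbrs))
        sampled = rng.choice(nbrs, size=k, replace=False)
        scale = len(nbrs) / k  # importance weight for uniform sampling
        # Monte Carlo correction: sampled fresh-minus-stale differences.
        out[v] += scale * (P[v, sampled] @ (H[sampled] - H_hist[sampled]))
    return out
```

Because each neighbor is sampled with probability `k / |N(v)|` and weighted by `|N(v)| / k`, the correction term has expectation `P @ (H - H_hist)`, so the estimator is unbiased: its expectation is exactly `P @ H`. As training progresses and `H_hist` approaches `H`, the variance of the correction shrinks toward zero, which is why very small neighbor samples suffice.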

Jianfei Chen, Jun Zhu, Le Song • 2017

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Node Classification | Cora (test) | - | - | 687 |
| Node Classification | PubMed (test) | - | - | 500 |
| Node Classification | Reddit (test) | Accuracy | 96.3 | 134 |
| Node Classification | PPI (test) | F1 (micro) | 74.4 | 126 |
| Node Classification | Flickr (test) | Micro F1 | 50.1 | 57 |
| Node Classification | Reddit inductive (test) | Micro F1 | 96.4 | 29 |
| Node Classification | Yelp (test) | Micro F1 | 64 | 26 |
| Inductive Node Classification | PPI (test) | Micro F1 Score | 97.8 | 19 |
| Node Classification | PPI (inductive) | Micro F1 Score | 96.3 | 12 |
| Node Classification | Flickr (inductive) | Micro F1 | 48.2 | 12 |

Showing 10 of 12 rows.
