
Learning Graph-Level Representations with Recurrent Neural Networks

About

Recently, a variety of methods have been developed to encode graphs into low-dimensional vectors that can be easily exploited by machine learning algorithms. The majority of these methods start by embedding the graph nodes into a low-dimensional vector space, and then aggregate the node embeddings with some pooling scheme. In this work, we develop a new approach to learn graph-level representations that combines unsupervised and supervised learning components. We start by learning a set of node representations in an unsupervised fashion. Graph nodes are then mapped into node sequences sampled by random walks, where the discrete sampling step is approximated by the Gumbel-Softmax distribution. Recurrent neural network (RNN) units are modified to accommodate both the node representations and their neighborhood information. Experiments on standard graph classification benchmarks demonstrate that our proposed approach achieves superior or comparable performance relative to state-of-the-art algorithms in terms of convergence speed and classification accuracy. We further illustrate the effectiveness of the different components used by our approach.
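The abstract's key trick is replacing the non-differentiable "pick the next node" step of a random walk with a Gumbel-Softmax relaxation, so the sampling can sit inside a trainable pipeline. The sketch below is an illustrative NumPy implementation of that idea, not the authors' code; the function names and the hard-argmax read-out are assumptions for the example.

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Soft sample from a categorical distribution via the
    Gumbel-Softmax (Concrete) relaxation with temperature tau."""
    rng = rng or np.random.default_rng()
    # Gumbel(0, 1) noise makes argmax(logits + noise) an exact categorical sample
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + gumbel) / tau
    y = np.exp(y - y.max())          # numerically stable softmax
    return y / y.sum()

def soft_random_walk(adj, start, length, tau=0.5, rng=None):
    """Sample a node sequence of the given length from adjacency matrix
    `adj`. At each step the transition logits are the log edge weights
    of the current node's neighbors; Gumbel-Softmax yields a soft
    one-hot choice, and argmax reads out the next node."""
    rng = rng or np.random.default_rng(0)
    walk = [start]
    for _ in range(length - 1):
        weights = adj[walk[-1]].astype(float)
        # mask out non-neighbors with a very negative logit
        logits = np.where(weights > 0, np.log(np.maximum(weights, 1e-12)), -1e9)
        soft_choice = gumbel_softmax(logits, tau, rng)
        walk.append(int(soft_choice.argmax()))
    return walk
```

In a full model, the soft one-hot vector (rather than the argmax) would be multiplied against the node-embedding matrix, keeping the walk differentiable end to end; the argmax here only serves to display a concrete node sequence.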

Yu Jin, Joseph F. JaJa · 2018

Related benchmarks

Task                  Dataset                                 Accuracy (%)  Rank
Graph Classification  IMDB-B (10-fold cross-validation)       73.8          148
Graph Classification  IMDB-M (10-fold cross-validation)       51.19         84
Graph Classification  COLLAB (10-fold cross-validation)       81.75         26
Graph Classification  REDDIT-5K (10-fold cross-validation)    52.28         11
Graph Classification  REDDIT-B (10-fold cross-validation)     86.5          9
Graph Classification  REDDIT-12K (10-fold cross-validation)   42.47         9
