
Learning to Make Predictions on Graphs with Autoencoders

About

We examine two fundamental tasks associated with graph representation learning: link prediction and semi-supervised node classification. We present a novel autoencoder architecture capable of learning a joint representation of both local graph structure and available node features for the multi-task learning of link prediction and node classification. Our autoencoder architecture is efficiently trained end-to-end in a single learning stage to simultaneously perform link prediction and node classification, whereas previous related methods require multiple training steps that are difficult to optimize. We provide a comprehensive empirical evaluation of our models on nine benchmark graph-structured datasets and demonstrate significant improvement over related methods for graph representation learning. Reference code and data are available at https://github.com/vuptran/graph-representation-learning
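To make the multi-task setup concrete, below is a minimal forward-pass sketch of a graph autoencoder with a shared encoder and two heads: an inner-product decoder that reconstructs the adjacency matrix for link prediction, and a softmax head for node classification. The layer sizes, activations, and weight initialization here are illustrative assumptions, not the authors' exact architecture; see the linked repository for the reference implementation.

```python
import numpy as np

def normalize_adj(A):
    """Symmetrically normalize A + I, a common preprocessing step
    for graph encoders (assumed here, not paper-specific)."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
n_nodes, n_feats, n_hidden, n_classes = 5, 4, 3, 2

# Toy 5-node ring graph with random node features.
A = np.zeros((n_nodes, n_nodes))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]:
    A[i, j] = A[j, i] = 1.0
X = rng.normal(size=(n_nodes, n_feats))

# Randomly initialized (untrained) weights for illustration only.
W1 = 0.1 * rng.normal(size=(n_feats, n_hidden))
Wc = 0.1 * rng.normal(size=(n_hidden, n_classes))

A_norm = normalize_adj(A)
Z = relu(A_norm @ X @ W1)          # shared node embeddings (joint representation)
A_pred = sigmoid(Z @ Z.T)          # link-prediction head: inner-product decoder
Y_pred = softmax(A_norm @ Z @ Wc)  # node-classification head

print(A_pred.shape, Y_pred.shape)  # (5, 5) (5, 2)
```

Both heads read from the same embeddings `Z`, which is what allows a single end-to-end training stage: in practice one would sum a reconstruction loss on `A_pred` and a cross-entropy loss on the labeled rows of `Y_pred` and backpropagate through the shared encoder.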

Phi Vu Tran • 2018

Related benchmarks

Task                | Dataset                  | Result        | Rank
Node Classification | PubMed (test)            | Accuracy 79.4 | 500
Link Prediction     | Citeseer                 | AUC 95.6      | 146
Node Classification | Cora standard (test)     | Accuracy 78.3 | 130
Link Prediction     | PubMed                   | AUC 96        | 123
Node Classification | Citeseer standard (test) | Accuracy 71.6 | 121
Link Prediction     | Cora                     | AUC 0.943     | 116
Link Prediction     | PROTEIN                  | AUC 0.861     | 4
Link Prediction     | Metabolic                | AUC 0.75      | 4
Link Prediction     | Conflict                 | AUC 69.9      | 4
Link Prediction     | PowerGrid                | AUC 0.781     | 2
