
Graph Condensation for Graph Neural Networks

About

Given the prevalence of large-scale graphs in real-world applications, the storage and time for training neural models have raised increasing concerns. To alleviate the concerns, we propose and study the problem of graph condensation for graph neural networks (GNNs). Specifically, we aim to condense the large, original graph into a small, synthetic and highly-informative graph, such that GNNs trained on the small graph and large graph have comparable performance. We approach the condensation problem by imitating the GNN training trajectory on the original graph through the optimization of a gradient matching loss, and design a strategy to condense node features and structural information simultaneously. Extensive experiments have demonstrated the effectiveness of the proposed framework in condensing different graph datasets into informative smaller graphs. In particular, we are able to approximate the original test accuracy by 95.3% on Reddit, 99.8% on Flickr and 99.0% on Citeseer, while reducing their graph size by more than 99.9%, and the condensed graphs can be used to train various GNN architectures. Code is released at https://github.com/ChandlerBang/GCond.
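The core idea of gradient matching can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: GCond trains a GNN with cross-entropy and also learns the synthetic adjacency as a function of the synthetic features, whereas the toy below uses a linear model with squared loss, ignores graph structure, and updates the synthetic features by finite-difference descent on the matching distance. All variable names (`X`, `Xs`, `match_dist`, etc.) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "original" data: 100 nodes, 3 features, 2-class one-hot labels.
n, d, c = 100, 3, 2
X = rng.normal(size=(n, d))
Y = np.eye(c)[(X[:, 0] > 0).astype(int)]

# Synthetic condensed set: 4 nodes (graph structure omitted in this sketch).
m = 4
Xs = rng.normal(size=(m, d))
Ys = np.eye(c)[np.array([0, 0, 1, 1])]

def grad_W(W, X, Y):
    """Gradient of the mean-squared error of a linear model X @ W w.r.t. W."""
    return 2.0 / len(X) * X.T @ (X @ W - Y)

def match_dist(ga, gb):
    """Sum over output columns of (1 - cosine similarity) between gradients."""
    num = (ga * gb).sum(axis=0)
    den = np.linalg.norm(ga, axis=0) * np.linalg.norm(gb, axis=0) + 1e-12
    return float((1.0 - num / den).sum())

W = rng.normal(size=(d, c))
g_real = grad_W(W, X, Y)          # gradient on the large original data

def loss(x_flat):
    """Matching distance between real and synthetic gradients at W."""
    return match_dist(g_real, grad_W(W, x_flat.reshape(m, d), Ys))

# Crude finite-difference descent on the synthetic features, with a
# greedy accept-if-improved guard so the distance never increases.
x, eps, lr = Xs.ravel().copy(), 1e-5, 0.05
before = loss(x)
for _ in range(300):
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (loss(x + e) - loss(x - e)) / (2 * eps)
    cand = x - lr * g
    if loss(cand) < loss(x):
        x = cand
after = loss(x)
print(before, after)  # the matching distance shrinks as Xs is condensed
```

In the actual method, this matching step is nested inside a loop over model initializations and training steps, so the synthetic graph imitates the whole training trajectory rather than the gradient at a single `W`.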

Wei Jin, Lingxiao Zhao, Shichang Zhang, Yozen Liu, Jiliang Tang, Neil Shah • 2021

Related benchmarks

| Task                | Dataset           | Metric        | Result | Rank |
|---------------------|-------------------|---------------|--------|------|
| Node Classification | Cora              | Accuracy      | 83.61  | 885  |
| Node Classification | Citeseer          | Accuracy      | 73     | 804  |
| Node Classification | Pubmed            | Accuracy      | 77.9   | 742  |
| Node Classification | Citeseer (test)   | Accuracy      | 0.706  | 729  |
| Node Classification | Cora (test)       | Mean Accuracy | 80.1   | 687  |
| Node Classification | ogbn-arxiv (test) | Accuracy      | 64     | 382  |
| Node Classification | Citeseer          | Accuracy      | 72.03  | 275  |
| Node Classification | wikiCS            | Accuracy      | 79.33  | 198  |
| Node Classification | Physics           | Accuracy      | 87.6   | 145  |
| Node Classification | amazon-ratings    | Accuracy      | 43.03  | 138  |

Showing 10 of 28 rows
