
Rethinking Federated Graph Learning: A Data Condensation Perspective

About

Federated graph learning (FGL) is a widely recognized technique that enables collaborative training of graph neural networks (GNNs) across multi-client graphs. However, existing approaches rely heavily on communicating model parameters or gradients for federated optimization, and they fail to adequately address the data heterogeneity introduced by intricate and diverse graph distributions. Although some methods attempt to share additional messages between the server and clients to improve federated convergence, they introduce significant privacy risks and extra communication overhead. To address these issues, we introduce the condensed graph as a novel optimization carrier for tackling FGL data heterogeneity and propose a new FGL paradigm called FedGM. Specifically, we use a generalized condensed-graph consensus to aggregate comprehensive knowledge from distributed graphs, while minimizing communication cost and privacy risk through a single transmission of the condensed data. Extensive experiments on six public datasets consistently demonstrate the superiority of FedGM over state-of-the-art baselines, highlighting its potential as a novel FGL paradigm.
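The abstract describes a protocol in which each client condenses its local graph and transmits the condensed data to the server exactly once, rather than exchanging gradients every round. Below is a minimal sketch of that communication pattern only; the condenser used here is a hypothetical per-class feature average, not the paper's actual condensation objective:

```python
# Toy sketch of FedGM's one-shot communication pattern: each client
# condenses its local graph into a few synthetic nodes and uploads them
# once. The condenser is a hypothetical per-class mean, standing in for
# a real graph-condensation method -- it only illustrates the protocol.

def condense(features, labels):
    """Collapse each class into one synthetic node (class-mean features)."""
    by_class = {}
    for x, y in zip(features, labels):
        by_class.setdefault(y, []).append(x)
    synth_x, synth_y = [], []
    for y in sorted(by_class):
        xs = by_class[y]
        dim = len(xs[0])
        synth_x.append([sum(x[d] for x in xs) / len(xs) for d in range(dim)])
        synth_y.append(y)
    return synth_x, synth_y

# Two hypothetical clients, each holding a tiny node-feature matrix
# and per-node labels (raw graphs never leave the client).
clients = [
    ([[1.0, 0.0], [3.0, 0.0], [0.0, 2.0]], [0, 0, 1]),
    ([[0.0, 4.0], [2.0, 2.0]], [1, 2]),
]

# Single transmission per client: the server pools only condensed nodes.
server_nodes, server_labels = [], []
for feats, labs in clients:
    cx, cy = condense(feats, labs)
    server_nodes.extend(cx)
    server_labels.extend(cy)

print(len(server_nodes), server_labels)  # 4 [0, 1, 1, 2]
```

In the paper's setting the server would then train a GNN on the pooled condensed graph; here the pooling step alone shows why communication cost and exposure of raw data are limited to one small upload per client.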

Hao Zhang, Xunkai Li, Yinlin Zhu, Lianglin Hu • 2025

Related benchmarks

| Task | Dataset | Result (Accuracy) | Rank |
| --- | --- | --- | --- |
| Node Classification | Cora | 84.15 | 885 |
| Node Classification | Citeseer | 73.82 | 275 |
| Node Classification | wikiCS | 79.92 | 198 |
| Node Classification | amazon-ratings | 44.63 | 138 |
| Node Classification | REDDIT | 65.73 | 66 |
| Node Classification | arXiv | 71.72 | 41 |
| Node Classification | Instagram | 64.25 | 23 |
| Node Classification | Children | 47.36 | 19 |
