
Heterogeneity-Aware Client Sampling for Optimal and Efficient Federated Learning

About

Federated learning (FL) commonly involves clients with diverse communication and computational capabilities. Such heterogeneity can significantly distort the optimization dynamics and lead to objective inconsistency, where the global model converges to an incorrect stationary point potentially far from the pursued optimum. Despite its critical impact, the joint effect of communication and computation heterogeneity has remained largely unexplored, due to the intrinsic complexity of their interaction. In this paper, we reveal the fundamentally distinct mechanisms through which heterogeneous communication and computation drive inconsistency in FL. To the best of our knowledge, this is the first unified theoretical analysis of general heterogeneous FL, offering a principled understanding of how these two forms of heterogeneity jointly distort the optimization trajectory under arbitrary choices of local solvers. Motivated by these insights, we propose Federated Heterogeneity-Aware Client Sampling, FedACS, a universal method to eliminate all types of objective inconsistency. We theoretically prove that FedACS converges to the correct optimum at a rate of $O(1/\sqrt{R})$, even in dynamic heterogeneous environments. Extensive experiments across multiple datasets show that FedACS outperforms state-of-the-art and category-specific baselines by 4.3%-36%, while reducing communication costs by 22%-89% and computation loads by 14%-105%, respectively.
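The abstract does not spell out how FedACS works internally, but the core idea behind heterogeneity-aware client sampling with unbiased aggregation can be illustrated generically: draw clients with non-uniform probabilities (e.g., proportional to some capability score) and reweight each sampled update by the inverse of its sampling probability, so the aggregate remains an unbiased estimate of the uniform client average. The sketch below is an illustrative importance-sampling construction under these assumptions, not the paper's actual algorithm; the names `sample_and_aggregate` and `scores` are hypothetical.

```python
import numpy as np

def sample_and_aggregate(grads, scores, m, rng):
    """One communication round of heterogeneity-aware sampling (illustrative sketch,
    NOT the FedACS algorithm from the paper).

    Clients are drawn with probability proportional to `scores` (a hypothetical
    proxy for communication/computation capability). Each sampled gradient is
    reweighted by 1 / (n * p_j), which makes the estimate unbiased for the
    uniform average (1/n) * sum_i grads[i] regardless of the sampling skew --
    the standard importance-sampling fix for objective inconsistency.
    """
    n = len(grads)
    p = np.asarray(scores, dtype=float)
    p = p / p.sum()                      # sampling distribution over clients
    idx = rng.choice(n, size=m, replace=True, p=p)  # sample m clients
    return np.mean([grads[j] / (n * p[j]) for j in idx])
```

Averaged over many rounds, this estimator converges to the plain mean of all client gradients even when the sampling probabilities are highly skewed, which is why such reweighting can remove the bias that heterogeneous participation would otherwise introduce.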

Shudi Weng, Chao Ren, Ming Xiao, Mikael Skoglund • 2025

Related benchmarks

Task                                          Dataset           Result                     Rank
Federated Learning Communication Efficiency   CIFAR10 (test)    Communication Rounds: 100  50
Federated Learning Efficiency                 MNIST (test)      Rounds: 68                 6
Federated Learning Efficiency                 CINIC-10 (test)   Communication Rounds: 55   6
