
Communication-Efficient Adaptive Federated Learning

About

Federated learning is a machine learning paradigm that enables clients to jointly train models without sharing their local data. In practice, however, federated learning still faces several challenges, such as the large communication overhead caused by repeated server-client synchronization and the lack of adaptivity in SGD-based model updates. Although various methods have been proposed to reduce communication cost through gradient compression or quantization, and federated versions of adaptive optimizers such as FedAdam have been proposed to add adaptivity, current federated learning frameworks still cannot address all of these challenges at once. In this paper, we propose a novel communication-efficient adaptive federated learning method (FedCAMS) with theoretical convergence guarantees. We show that in the nonconvex stochastic optimization setting, FedCAMS achieves the same convergence rate of $O(\frac{1}{\sqrt{TKm}})$ as its non-compressed counterparts. Extensive experiments on various benchmarks verify our theoretical analysis.
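The abstract combines two ingredients: compressing client updates to cut communication, and an AMSGrad-style adaptive update on the server. A minimal sketch of how such pieces can fit together, assuming top-k sparsification with error feedback on the clients and a plain AMSGrad server step (the class names, hyperparameters, and training loop here are illustrative assumptions, not the exact FedCAMS algorithm from the paper):

```python
# Sketch: communication-efficient adaptive federated optimization.
# Clients send top-k sparsified gradients with error feedback; the
# server aggregates them and applies an AMSGrad-style adaptive step.
import math

def topk_compress(vec, k):
    """Keep the k largest-magnitude entries of vec, zero the rest."""
    idx = sorted(range(len(vec)), key=lambda i: abs(vec[i]), reverse=True)[:k]
    out = [0.0] * len(vec)
    for i in idx:
        out[i] = vec[i]
    return out

class Client:
    def __init__(self, dim):
        self.error = [0.0] * dim  # error-feedback memory across rounds

    def compressed_update(self, grad, k):
        # Add back the residual dropped in earlier rounds, then compress;
        # whatever is dropped now is remembered for the next round.
        corrected = [g + e for g, e in zip(grad, self.error)]
        msg = topk_compress(corrected, k)
        self.error = [c - m for c, m in zip(corrected, msg)]
        return msg

class AMSGradServer:
    def __init__(self, dim, lr=0.1, b1=0.9, b2=0.99, eps=1e-8):
        self.lr, self.b1, self.b2, self.eps = lr, b1, b2, eps
        self.m = [0.0] * dim       # first-moment estimate
        self.v = [0.0] * dim       # second-moment estimate
        self.v_hat = [0.0] * dim   # running max of v (AMSGrad)

    def step(self, params, avg_update):
        for i, g in enumerate(avg_update):
            self.m[i] = self.b1 * self.m[i] + (1 - self.b1) * g
            self.v[i] = self.b2 * self.v[i] + (1 - self.b2) * g * g
            self.v_hat[i] = max(self.v_hat[i], self.v[i])  # monotone max
            params[i] -= self.lr * self.m[i] / (math.sqrt(self.v_hat[i]) + self.eps)
        return params
```

In a round, each client computes a local gradient, sends its compressed (top-k) version, and the server averages the sparse messages before the adaptive step; error feedback ensures coordinates dropped by compression are not lost, which is what allows convergence to match the uncompressed rate.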

Yujia Wang, Lu Lin, Jinghui Chen • 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Question Answering | SQuAD | F1 | 78.02 | 127 |
| Multi-Sentence Reading Comprehension | MultiRC | F1 | 81.76 | 16 |
| Recognizing Textual Entailment | RTE | Accuracy | 81.22 | 16 |
| CommitmentBank | CB | Accuracy | 82.38 | 16 |
| Text Classification | BANKING | Total Communication Time (10^3 s) | 1.89 | 9 |
| Recognizing Textual Entailment | RTE | Total Communication Time (10^3 s) | 4.58 | 9 |
| Reading Comprehension | MultiRC | Total Communication Time | 1.46e+4 | 9 |
| Text Classification | AG-News | Total Communication Time (10^3 s) | 3.57 | 9 |
| Text Classification | TREC | Total Communication Time (10^3 s) | 1 | 9 |
| Natural Language Inference | CB | Total Communication Time (10^3 s) | 6.43 | 9 |
Showing 10 of 12 rows
