
Federated Learning Based on Dynamic Regularization

About

We propose a novel federated learning method for distributively training neural network models, where the server orchestrates cooperation between a subset of randomly chosen devices in each round. We view the federated learning problem primarily from a communication perspective and allow more device-level computation to save transmission costs. We point out a fundamental dilemma: the minima of the local, device-level empirical losses are inconsistent with those of the global empirical loss. In contrast to recent prior works that either attempt inexact minimization or use devices to parallelize gradient computation, we propose a dynamic regularizer for each device at each round, so that in the limit the global and device solutions are aligned. We demonstrate, through both empirical results on real and synthetic data and analytical results, that our scheme leads to efficient training in both convex and non-convex settings, while being fully agnostic to device heterogeneity and robust to a large number of devices, partial participation, and unbalanced data.
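The dynamic-regularizer idea can be illustrated on a toy problem. The sketch below uses scalar quadratic device losses whose minima differ across devices, full participation, and exact local solves; these are illustrative assumptions for this example, not the paper's experimental setup, and the variable names (`alpha`, the per-device state `g`) are chosen to mirror the regularizer described above.

```python
import numpy as np

# Toy sketch of a dynamic regularizer on scalar quadratic device losses
# L_k(w) = 0.5 * a_k * (w - b_k)^2. The local minima b_k differ across
# devices (the dilemma in the abstract), while the global empirical-loss
# minimizer is w* = sum(a_k * b_k) / sum(a_k).
a = np.array([1.0, 2.0, 4.0])      # per-device curvatures
b = np.array([0.0, 1.0, 3.0])      # per-device minima (heterogeneous data)
w_star = (a * b).sum() / a.sum()   # global minimizer (= 2.0 here)

alpha = 0.5                        # regularization strength (assumed value)
g = np.zeros_like(a)               # per-device gradient state
w = 0.0                            # server model

for _ in range(500):
    # Each device minimizes L_k(t) - g_k * t + (alpha/2) * (t - w)^2;
    # for a quadratic loss the stationarity condition gives a closed form.
    theta = (a * b + g + alpha * w) / (a + alpha)
    # The first-order condition keeps g_k equal to grad L_k at theta_k.
    g = g - alpha * (theta - w)
    # Server step: average device models and correct by the mean state,
    # so the fixed point is the global minimizer, not the mean of the b_k.
    w = theta.mean() - g.mean() / alpha

# In the limit the device and global solutions align: w -> w_star.
```

Without the linear term `- g_k * t`, averaging these one-shot local solutions would be biased toward the mean of the device minima (about 1.33 here) rather than the global minimizer 2.0; the dynamically updated state cancels that client drift.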

Durmus Alp Emre Acar, Yue Zhao, Ramon Matas Navarro, Matthew Mattina, Paul N. Whatmough, Venkatesh Saligrama • 2021

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Image Classification | CIFAR-10 (test) | Accuracy: 86.03 | 3381 |
| Image Classification | CIFAR-100 | Top-1 Accuracy: 61.09 | 622 |
| Image Classification | CIFAR10 (test) | Accuracy: 84.39 | 585 |
| Image Classification | CIFAR-10 | Accuracy: 63.78 | 471 |
| Image Classification | CIFAR-100 | -- | 302 |
| Image Classification | Tiny ImageNet (test) | Accuracy: 41.77 | 265 |
| Image Classification | PACS (test) | Average Accuracy: 73.19 | 254 |
| Image Classification | DomainNet (test) | Average Accuracy: 70.02 | 209 |
| Classification | fMNIST (test) | Accuracy: 84.73 | 149 |
| Image Classification | Tiny-ImageNet | Top-1 Accuracy: 47.72 | 143 |

Showing 10 of 101 rows
