
Multi-Level Branched Regularization for Federated Learning

About

A critical challenge of federated learning is data heterogeneity and imbalance across clients, which leads to inconsistency between local networks and unstable convergence of global models. To alleviate these limitations, we propose a novel architectural regularization technique that constructs multiple auxiliary branches in each local model by grafting local and global subnetworks at several different levels, and that learns the representations of the main pathway in the local model to be congruent with the auxiliary hybrid pathways via online knowledge distillation. The proposed technique effectively robustifies the global model even in non-iid settings and is conveniently applicable to various federated learning frameworks without incurring extra communication costs. We perform comprehensive empirical studies and demonstrate remarkable performance gains in terms of accuracy and efficiency compared to existing methods. The source code is available at our project page.
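The core idea above can be sketched in PyTorch: a local model is split into sequential blocks; at each intermediate level, a hybrid pathway runs the local blocks up to that level and then the frozen global blocks and head, and the local model is trained with cross-entropy on the main pathway plus hybrid-pathway losses. This is a minimal illustration, not the authors' implementation; the network, the loss weights `lam_ce`/`lam_kd`, and the temperature are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BlockedNet(nn.Module):
    """Toy network split into blocks so hybrid pathways can be
    grafted at several levels (architecture is illustrative)."""
    def __init__(self, in_dim=32, hidden=64, num_classes=10):
        super().__init__()
        self.blocks = nn.ModuleList([
            nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU()),
            nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU()),
            nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU()),
        ])
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x):
        feats = []
        for blk in self.blocks:
            x = blk(x)
            feats.append(x)  # keep every level's features for grafting
        return feats, self.head(x)

def branched_regularization_loss(local, global_frozen, x, y,
                                 temp=1.0, lam_ce=1.0, lam_kd=1.0):
    """Sketch of a multi-level branched objective: main-pathway CE,
    plus CE on each hybrid pathway (local blocks up to level l, then
    frozen global blocks and head), plus a KD term pulling the main
    pathway's predictions toward the hybrid pathways'."""
    feats, logits_main = local(x)
    loss = F.cross_entropy(logits_main, y)           # main pathway
    for l in range(1, len(local.blocks)):
        h = feats[l - 1]                             # local features at level l
        for blk in global_frozen.blocks[l:]:         # frozen global upper blocks
            h = blk(h)
        logits_hybrid = global_frozen.head(h)
        loss = loss + lam_ce * F.cross_entropy(logits_hybrid, y)
        loss = loss + lam_kd * F.kl_div(             # online distillation
            F.log_softmax(logits_main / temp, dim=1),
            F.softmax(logits_hybrid / temp, dim=1),
            reduction="batchmean") * temp ** 2
    return loss
```

Because the global blocks are only frozen (their parameters do not receive gradients) rather than detached, gradients still flow through the hybrid pathways back into the local lower blocks, which is what regularizes the local representations; no extra communication is needed since each client already holds the received global model.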

Jinkyu Kim, Geeho Kim, Bohyung Han · 2022

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Image Classification | MNIST, NN (test) | Communication Rounds: 22 | 62 |
| Image Classification | CIFAR-100, VGG-11 (test) | Communication Rounds: 63 | 61 |
| Image Classification | Tiny-ImageNet, ResNet-20 (test) | Communication Rounds: 791 | 48 |
| Image Classification | CIFAR-10, LeNet-5 (test) | Communication Rounds: 78 | 44 |
| Federated Learning | Tiny ImageNet (test) | Accuracy (500R): 28.39 | 13 |
| Federated Learning | CIFAR-100, 500 clients, 1% participation, Dirichlet 0.3 (train/test) | Accuracy (500R): 29.78 | 13 |
| Image Classification | CIFAR-100 i.i.d., 500 clients, 2% participation (test) | Accuracy (500R): 34.56 | 13 |
| Federated Learning | CIFAR-100, 100 clients, Dirichlet 0.3 | Accuracy (500R): 40.09 | 13 |
| Image Classification | CIFAR-100 Dirichlet 0.6, 500 clients, 2% participation (test) | Accuracy (500R): 33.79 | 13 |
| Image Classification | Tiny-ImageNet | Accuracy (500R): 37.2 | 13 |
(Showing 10 of 21 benchmark rows.)
