HeteroFL: Computation and Communication Efficient Federated Learning for Heterogeneous Clients

About

Federated Learning (FL) is a method of training machine learning models on private data distributed over a large number of possibly heterogeneous clients such as mobile phones and IoT devices. In this work, we propose a new federated learning framework named HeteroFL to address heterogeneous clients equipped with very different computation and communication capabilities. Our solution can enable the training of heterogeneous local models with varying computation complexities and still produce a single global inference model. For the first time, our method challenges the underlying assumption of existing work that local models have to share the same architecture as the global model. We demonstrate several strategies to enhance FL training and conduct extensive empirical evaluations, including five computation complexity levels of three model architectures on three datasets. We show that adaptively distributing subnetworks according to clients' capabilities is both computation and communication efficient.
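The key idea above — each client trains a width-scaled subnetwork of the global model, and the server averages each parameter over the clients that cover it — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function names `extract_subnetwork` and `aggregate`, and the convention of taking the upper-left block of each weight matrix at a client's capability `ratio`, are assumptions for illustration.

```python
import numpy as np

def extract_subnetwork(global_weights, ratio):
    """Slice each global parameter to a width-scaled submatrix.

    Hypothetical convention: a client at capability `ratio` receives the
    upper-left (ratio*out, ratio*in) block of every 2-D weight matrix and
    the first ratio*out entries of every 1-D bias vector.
    """
    local = {}
    for name, w in global_weights.items():
        if w.ndim == 2:
            out_dim = max(1, int(w.shape[0] * ratio))
            in_dim = max(1, int(w.shape[1] * ratio))
            local[name] = w[:out_dim, :in_dim].copy()
        else:
            out_dim = max(1, int(w.shape[0] * ratio))
            local[name] = w[:out_dim].copy()
    return local

def aggregate(global_weights, client_updates):
    """Average each global entry over the clients whose submatrix covers it.

    Entries covered by no client keep their previous global value, so one
    global inference model is produced from heterogeneous local models.
    """
    new_global = {}
    for name, w in global_weights.items():
        acc = np.zeros_like(w)
        cnt = np.zeros_like(w)
        for upd in client_updates:
            sub = upd[name]
            region = tuple(slice(0, s) for s in sub.shape)
            acc[region] += sub
            cnt[region] += 1
        covered = cnt > 0
        new_global[name] = np.where(covered, acc / np.maximum(cnt, 1), w)
    return new_global
```

Because smaller clients only touch the upper-left block, the wide clients' parameters outside that block are averaged over fewer updates, which is what makes the scheme both computation and communication efficient per client.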

Enmao Diao, Jie Ding, Vahid Tarokh · 2020

Related benchmarks

| Task                 | Dataset                  | Metric                    | Result | Rank |
|----------------------|--------------------------|---------------------------|--------|------|
| Image Classification | CIFAR-100 (test)         | Accuracy                  | 14.32  | 3518 |
| Image Classification | CIFAR-10 (test)          | --                        | --     | 3381 |
| Image Classification | TinyImageNet (test)      | Accuracy                  | 24.1   | 440  |
| Image Classification | SVHN                     | Accuracy                  | 97.5   | 395  |
| Image Classification | CIFAR100                 | Accuracy                  | 77.8   | 347  |
| Image Classification | PACS                     | Overall Average Accuracy  | 19.27  | 241  |
| Image Classification | CIFAR10 non-iid          | Accuracy                  | 54.79  | 162  |
| Image Classification | CIFAR-100 non-IID (test) | Test Accuracy (Avg Best)  | 12.96  | 113  |
| Image Classification | CIFAR10                  | Top-1 Accuracy            | 94.3   | 112  |
| Image Classification | Food-101 (test)          | --                        | --     | 89   |
Showing 10 of 32 rows
