
FedHe: Heterogeneous Models and Communication-Efficient Federated Learning

About

Federated learning (FL) enables edge devices to cooperatively train a model while keeping the training data local and private. A common assumption in FL is that all edge devices share the same machine learning model during training, for example, an identical neural network architecture. However, different devices may differ in computation and storage capabilities. Moreover, reducing communication overhead improves training efficiency, yet it remains a challenging problem in FL. In this paper, we propose a novel FL method, called FedHe, inspired by knowledge distillation, which can train heterogeneous models and support asynchronous training with significantly reduced communication overhead. Our analysis and experimental results demonstrate that the proposed method outperforms state-of-the-art algorithms in terms of both communication overhead and model accuracy.
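To make the idea concrete, here is a minimal sketch of why knowledge-distillation-style FL can tolerate heterogeneous models and shrink communication: clients upload per-class averaged logits ("soft knowledge") rather than full model weights, and the server simply averages this knowledge across clients. This is an illustrative assumption about the mechanism, not the authors' exact implementation; all function names and numbers below are hypothetical.

```python
# Hedged sketch of logit-based communication in FL, assuming clients share
# per-class average logits instead of model weights. Because only logits
# travel, clients may run different model architectures, and uploads stay
# small (one vector per class) -- the property FedHe exploits.

from collections import defaultdict


def client_knowledge(samples):
    """samples: list of (label, logits) pairs produced by a client's local
    model. Returns per-class averaged logits -- the only thing uploaded."""
    sums, counts = {}, defaultdict(int)
    for label, logits in samples:
        if label not in sums:
            sums[label] = list(logits)
        else:
            sums[label] = [a + b for a, b in zip(sums[label], logits)]
        counts[label] += 1
    return {c: [v / counts[c] for v in s] for c, s in sums.items()}


def server_aggregate(all_knowledge):
    """Average each class's logits across clients; contributions can be
    merged as they arrive, matching an asynchronous training process."""
    merged = defaultdict(list)
    for knowledge in all_knowledge:
        for c, logits in knowledge.items():
            merged[c].append(logits)
    return {c: [sum(col) / len(col) for col in zip(*ls)]
            for c, ls in merged.items()}


# Two hypothetical clients on a 3-class task (logit values are made up).
k1 = client_knowledge([(0, [2.0, 0.5, 0.1]), (1, [0.2, 1.8, 0.4])])
k2 = client_knowledge([(0, [1.6, 0.7, 0.3])])
global_knowledge = server_aggregate([k1, k2])
# Each client would then add a distillation loss pulling its own logits
# toward global_knowledge[label] during local training.
```

The design point the sketch illustrates: the uploaded payload scales with the number of classes, not the number of model parameters, which is where the communication savings come from.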

Chan Yun Hin, Ngai Edith • 2021

Related benchmarks

Task               | Dataset           | Result       | Rank
WSI Classification | CAMELYON17 (test) | --           | 33
WSI Classification | CAMELYON16 (test) | Avg Acc 70.2 | 28
