
FedGEMS: Federated Learning of Larger Server Models via Selective Knowledge Fusion

About

Today data is often scattered among billions of resource-constrained edge devices under security and privacy constraints. Federated Learning (FL) has emerged as a viable solution for learning a global model while keeping data private, but the model complexity achievable in FL is limited by the computation resources of edge nodes. In this work, we investigate a novel paradigm that takes advantage of a powerful server model to break through the model-capacity limit in FL. By selectively learning from multiple teacher clients and from itself, the server model develops in-depth knowledge and transfers that knowledge back to the clients in return, boosting their respective performance. Our proposed framework achieves superior performance on both server and client models and provides several advantages in a unified framework, including flexibility for heterogeneous client architectures, robustness to poisoning attacks, and communication efficiency between clients and server, across various image classification tasks.
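The core mechanism the abstract describes is selective knowledge fusion: the server treats clients as teachers, keeps only the teachers it trusts on each sample, and fuses their soft predictions into a distillation target for its larger model. The sketch below illustrates that selection-and-fusion step in NumPy. It is a minimal illustration under assumptions, not the authors' code: the function name `fuse_teacher_logits`, the confidence-threshold selection rule, and the uniform weighting over selected teachers are all hypothetical simplifications of FedGEMS.

```python
# Illustrative sketch of selective knowledge fusion (hypothetical API, not the
# FedGEMS implementation): for each shared public sample, keep only the teacher
# clients whose predictions are confident, and average their soft predictions
# into a fused target the server model could then distill from.
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def fuse_teacher_logits(client_logits, conf_threshold=0.5):
    """client_logits: (n_clients, n_samples, n_classes) logits reported by
    each client on a shared public sample set.
    Returns fused soft targets of shape (n_samples, n_classes)."""
    probs = softmax(client_logits)           # per-client class probabilities
    conf = probs.max(axis=-1)                # (n_clients, n_samples) confidence
    mask = conf >= conf_threshold            # selective step: drop unconfident teachers
    empty = ~mask.any(axis=0)                # samples where no teacher qualifies...
    mask[:, empty] = True                    # ...fall back to using all teachers
    w = mask.astype(float)
    w = w / w.sum(axis=0, keepdims=True)     # uniform weight over selected teachers
    return (w[..., None] * probs).sum(axis=0)
```

In a full training loop, the fused targets would serve as the teacher distribution in a standard distillation loss (e.g. KL divergence against the server model's softened outputs), and the updated server would distill knowledge back to the clients in the same fashion.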

Sijie Cheng, Jingwen Wu, Yanghua Xiao, Yang Liu, Yang Liu • 2021

Related benchmarks

Task                      Dataset                          Result                          Rank
Text-to-Image Retrieval   Flickr30k (test)                 Recall@1: 21.02                 445
Image Classification      CIFAR-100                        --                              435
Image-to-Text Retrieval   Flickr30k (test)                 R@1: 25.98                      392
Text Classification       AG News (test)                   --                              228
Image Classification      CIFAR-100 (test)                 Acc: 22.84                      110
Text Classification       AG News                          Accuracy: 85.63                 61
Text-to-Image Retrieval   MS COCO 1K                       R@1: 25.64                      51
Cross-modal Retrieval     Flickr30k (test)                 Image-to-text Recall@1: 18.93   25
Image-to-Text Retrieval   MS-COCO 1K image folds (test)    R@1 (i2t): 33.04                8
Cross-modal Retrieval     MS-COCO (test)                   R@1 (I2T): 33.12                8

(10 of 11 rows shown)
