
Federated Learning with Partial Model Personalization

About

We consider two federated learning algorithms for training partially personalized models, where the shared and personal parameters are updated either simultaneously or alternately on the devices. Both algorithms have been proposed in the literature, but their convergence properties are not fully understood, especially for the alternating variant. We provide convergence analyses of both algorithms in the general nonconvex setting with partial participation and delineate the regime where one dominates the other. Our experiments on real-world image, text, and speech datasets demonstrate that (a) partial personalization can obtain most of the benefits of full model personalization with a small fraction of personal parameters, and (b) the alternating update algorithm often outperforms the simultaneous update algorithm by a small but consistent margin.
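The two update schemes described above can be sketched on a toy problem. This is an illustrative sketch, not the authors' implementation: each device holds a quadratic local objective over a shared parameter block `u` and a personal block `v_i`, the simultaneous variant steps both blocks together, the alternating variant updates the personal block first with the shared block frozen and then vice versa, and the server averages only the shared block. All names (`local_round_sim`, `local_round_alt`, `train`) are hypothetical.

```python
import numpy as np

# Toy setup (illustrative, not the paper's code): device i minimizes
# f_i(u, v_i) = 0.5 * ||u + v_i - t_i||^2, where u is shared across devices
# and v_i is personal and never leaves the device.

def grad(u, v, t):
    # gradient of 0.5*||u + v - t||^2 w.r.t. both u and v is (u + v - t)
    return u + v - t

def local_round_sim(u, v, t, lr=0.1, steps=5):
    """Simultaneous variant: shared and personal blocks step together."""
    for _ in range(steps):
        g = grad(u, v, t)
        u, v = u - lr * g, v - lr * g
    return u, v

def local_round_alt(u, v, t, lr=0.1, steps=5):
    """Alternating variant: personal steps first, then shared steps."""
    for _ in range(steps):              # personal update, shared block frozen
        v = v - lr * grad(u, v, t)
    for _ in range(steps):              # shared update, personal block frozen
        u = u - lr * grad(u, v, t)
    return u, v

def train(local_round, targets, rounds=50, lr=0.1):
    u = np.zeros(2)                          # shared parameters
    vs = [np.zeros(2) for _ in targets]      # one personal block per device
    for _ in range(rounds):
        new_us = []
        for i, t in enumerate(targets):
            ui, vs[i] = local_round(u.copy(), vs[i], t, lr)
            new_us.append(ui)                # personal block stays on-device
        u = np.mean(new_us, axis=0)          # server averages shared block only
    return u, vs

targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
u_sim, vs_sim = train(local_round_sim, targets)
u_alt, vs_alt = train(local_round_alt, targets)
res = max(np.linalg.norm(u_alt + v - t) for v, t in zip(vs_alt, targets))
print(f"max local residual (alternating): {res:.2e}")
```

Because the personal blocks can fully absorb device-specific structure in this toy problem, both variants drive every device's local residual toward zero even though only the shared block is averaged.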

Krishna Pillutla, Kshitiz Malik, Abdelrahman Mohamed, Michael Rabbat, Maziar Sanjabi, Lin Xiao • 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Personalized Federated Learning | DRAKE dynamic (Self) | A_last | 66.65 | 40 |
| Personalized Federated Learning | DRAKE dynamic (Others) | A_last | 47.47 | 40 |
| Personalized Federated Learning | DRAKE (Self) | A_last | 66.75 | 30 |
| Image Classification | CIFAR-10 Global (test) | Accuracy | 41.98 | 26 |
| Natural Language Understanding | GLUE (Others) | A_last Score | 26.39 | 20 |
| Natural Language Understanding | GLUE (Self) | A_last | 61.88 | 20 |
| Personalized Federated Learning | DRAKE static (Others) | A_last | 47.08 | 20 |
| Multi-Label Classification | CheXpert Local (test) | Dir (1) | 0.7252 | 16 |
| Multi-Label Classification | CheXpert Global (test) | Dir (t=1) | 74.79 | 16 |
| Personalized Federated Learning | DRAKE Dirichlet non-i.i.d. splits alpha=0.5 | A_last | 67.44 | 10 |
(Showing 10 of 21 rows.)
