
Federated Multi-Task Learning

About

Federated learning poses new statistical and systems challenges in training machine learning models over distributed networks of devices. In this work, we show that multi-task learning is naturally suited to handle the statistical challenges of this setting, and propose a novel systems-aware optimization method, MOCHA, that is robust to practical systems issues. Our method and theory for the first time consider issues of high communication cost, stragglers, and fault tolerance for distributed multi-task learning. The resulting method achieves significant speedups compared to alternatives in the federated setting, as we demonstrate through simulations on real-world federated datasets.
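Concretely, the multi-task formulation in this line of work fits one model per device while coupling the models through a task-relationship matrix. Below is a minimal NumPy sketch of that kind of objective, sum of per-device losses plus a trace regularizer tying the tasks together; the function name, the hinge-loss choice, and the toy data are illustrative assumptions, not MOCHA's actual solver:

```python
import numpy as np

def mtl_objective(W, Omega, data, lam=0.1):
    """Illustrative multi-task objective:
        sum_t sum_i loss(w_t . x_i, y_i) + lam * tr(W Omega W^T)

    W:     (d, m) matrix, one column of weights per device/task.
    Omega: (m, m) matrix modeling relationships among the m tasks.
    data:  list of (X_t, y_t) pairs, one per device, labels in {-1, +1}.
    """
    loss = 0.0
    for t, (X, y) in enumerate(data):
        margins = y * (X @ W[:, t])              # per-example margins for task t
        loss += np.maximum(0.0, 1.0 - margins).sum()  # hinge loss
    reg = lam * np.trace(W @ Omega @ W.T)        # couples the per-task models
    return loss + reg

# Toy usage: 3 devices, 5 features each, identity task relationships.
rng = np.random.default_rng(0)
data = [(rng.normal(size=(8, 5)), rng.choice([-1.0, 1.0], size=8))
        for _ in range(3)]
W = np.zeros((5, 3))
Omega = np.eye(3)
print(mtl_objective(W, Omega, data))  # 24.0: hinge loss is 1 per point at W = 0
```

In the federated setting each device would update only its own column of `W`, while `Omega` captures how the devices' data distributions relate; MOCHA's contribution is solving this kind of problem while tolerating stragglers and high communication cost.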

Virginia Smith, Chao-Kai Chiang, Maziar Sanjabi, Ameet Talwalkar • 2017

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|------|---------|--------|--------|------|
| Image Classification | CIFAR-10 Global (test) | Accuracy | 29.46 | 26 |
| Medical Image Classification | Kvasir | Accuracy | 92.46 | 24 |
| Classification | Synthetic (test) | Accuracy | 73.4 | 22 |
| Image Classification | CIFAR-100 Pathological | Mean Accuracy | 65.33 | 18 |
| Image Classification | CIFAR-100 Practical | Mean Accuracy | 46.28 | 18 |
| Image Classification | CIFAR-10 Practical | Mean Accuracy | 85.92 | 18 |
| Multi-Label Classification | CheXpert Local (test) | Dir (1) | 0.6518 | 16 |
| Multi-Label Classification | CheXpert Global (test) | Dir (t=1) | 65.67 | 16 |
| Medical Image Classification | FedISIC | Average Accuracy | 69.2 | 10 |
| Image Classification | FEMNIST Practical | Mean Accuracy | 100 | 9 |

Showing 10 of 12 rows.
