
FedPara: Low-Rank Hadamard Product for Communication-Efficient Federated Learning

About

In this work, we propose a communication-efficient parameterization, FedPara, for federated learning (FL) to overcome the burden of frequent model uploads and downloads. Our method re-parameterizes the weight parameters of layers using low-rank weights followed by the Hadamard product. Unlike conventional low-rank parameterization, FedPara is not restricted by low-rank constraints and therefore has a far larger capacity. This property enables it to achieve comparable performance while requiring 3 to 10 times lower communication costs than the model with the original layers, which is not achievable with traditional low-rank methods. The efficiency of our method can be further improved by combining it with other efficient FL optimizers. In addition, we extend our method to a personalized FL application, pFedPara, which separates parameters into global and local ones. We show that pFedPara outperforms competing personalized FL methods with more than three times fewer parameters.

Nam Hyeon-Woo, Moon Ye-Bin, Tae-Hyun Oh • 2021
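
As a rough sketch of the idea (our own illustration, not the authors' reference code), the snippet below re-parameterizes a linear layer's weight as the Hadamard product of two rank-r factorizations, W = (X1 Y1ᵀ) ⊙ (X2 Y2ᵀ). The class name FedParaLinear, the rank argument r, and the initialization scale are hypothetical choices made for this example.

    import torch
    import torch.nn as nn

    class FedParaLinear(nn.Module):
        """Illustrative FedPara-style linear layer (details assumed).

        The (out_features x in_features) weight is built as
        W = (x1 @ y1.T) * (x2 @ y2.T): a Hadamard product of two
        rank-r factorizations. It stores 2*r*(in+out) parameters but
        can realize weights of rank up to r**2, unlike a plain rank-r
        factorization with the same parameter budget.
        """

        def __init__(self, in_features, out_features, r):
            super().__init__()
            def factor(rows):
                # Small random init; the paper's actual scheme may differ.
                return nn.Parameter(0.02 * torch.randn(rows, r))
            self.x1, self.y1 = factor(out_features), factor(in_features)
            self.x2, self.y2 = factor(out_features), factor(in_features)
            self.bias = nn.Parameter(torch.zeros(out_features))

        def weight(self):
            # Two low-rank products combined element-wise (Hadamard product).
            return (self.x1 @ self.y1.T) * (self.x2 @ self.y2.T)

        def forward(self, x):
            return nn.functional.linear(x, self.weight(), self.bias)

For a 1024 x 1024 weight, r = 16 stores 2 * 16 * (1024 + 1024) = 65,536 parameters, 16 times fewer than the dense layer's 1,048,576, while the achievable rank is up to 16^2 = 256. In FL, clients would transmit only these factors each round, which is the source of the communication saving described above.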

Related benchmarks

Task | Dataset | Metric | Result | Rank
Mathematical Reasoning | GSM8K (test) | Accuracy | 19.79 | 751
Question Answering | SQuAD 2.0 | F1 | 85.01 | 190
Summarization | XSum | ROUGE-2 | 18.08 | 108
Question Answering | SQuAD v1.1 | F1 | 88.02 | 79
Summarization | CNN / Daily Mail | ROUGE-1 | 39.98 | 67
Natural Language Understanding | GLUE (test/val) | MRPC Accuracy | 88.85 | 59
Mathematical Reasoning | MathQA (test) | Accuracy | 19.46 | 33
Question Answering | ARC (25-shot), MMLU (5-shot), HellaSwag (10-shot), TruthfulQA (0-shot), and WinoGrande (0-shot) (test) | ARC Accuracy | 49.06 | 32
Mathematical Reasoning | MetaMathQA (test) | Accuracy | 30.17 | 26
MRI Image Generation | ADNI (evaluation) | FID | 12.502 | 12

Showing 10 of 14 rows.
