
FedDIP: Federated Learning with Extreme Dynamic Pruning and Incremental Regularization

About

Federated Learning (FL) has been successfully adopted for distributed training and inference of large-scale Deep Neural Networks (DNNs). However, DNNs are characterized by an extremely large number of parameters, which yields significant challenges in exchanging these parameters among distributed nodes and managing memory. Although recent DNN compression methods (e.g., sparsification, pruning) tackle such challenges, they do not holistically consider an adaptively controlled reduction of parameter exchange while maintaining high accuracy levels. We therefore contribute a novel FL framework (coined FedDIP), which combines (i) dynamic model pruning with error feedback to eliminate redundant information exchange, contributing to significant performance improvement, with (ii) incremental regularization that can achieve extreme sparsity of models. We provide a convergence analysis of FedDIP and report on a comprehensive performance and comparative assessment against state-of-the-art methods using benchmark data sets and DNN models. Our results showcase that FedDIP not only controls model sparsity but also efficiently achieves similar or better performance compared to other model pruning methods adopting incremental regularization during distributed model training. The code is available at: https://github.com/EricLoong/feddip.
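The two ingredients the abstract names, dynamic magnitude pruning and an incrementally growing regularization penalty, can be illustrated with a minimal, self-contained sketch. This is not the FedDIP implementation: the pruning rule, the SGD update with random stand-in gradients, and the linear penalty schedule (`lam`, `lam_step`) are all simplifying assumptions for illustration only.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of the weights."""
    k = int(sparsity * weights.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return weights * (np.abs(weights) > threshold)

rng = np.random.default_rng(0)
w = rng.normal(size=1000)          # toy "model": a flat weight vector
lam, lam_step = 0.0, 0.01          # hypothetical incremental L2 schedule

for rnd in range(10):
    grad = rng.normal(size=w.shape)       # stand-in for a real gradient
    w -= 0.1 * (grad + lam * w)           # SGD step with L2 penalty
    w = magnitude_prune(w, sparsity=0.9)  # keep only the top-10% magnitudes
    lam += lam_step                       # incrementally raise the penalty

print(np.mean(w == 0))  # fraction of zeroed weights, roughly 0.9
```

The intuition being sketched: the growing penalty gradually shrinks weights toward zero between rounds, so the hard magnitude threshold removes progressively less important parameters, which is what allows extreme sparsity without an abrupt accuracy drop.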

Qianyu Long, Christos Anagnostopoulos, Shameem Puthiya Parambath, Daning Bi • 2023

Related benchmarks

Task                       Dataset      Result                   Rank
Person Re-Identification   VIPeR        Rank-1: 60.76            192
Person Re-Identification   iLIDS-VID    CMC-1: 80.61             84
Person Re-Identification   3DPeS        Rank-1: 85.77            10
Person Re-Identification   PRID 2011    Rank-1 Accuracy: 77      10
