
Better Generative Replay for Continual Federated Learning

About

Federated learning enables a centralized server to learn from distributed clients via communication, without accessing the clients' local data. However, existing federated learning work mainly focuses on a single-task scenario with static data. In this paper, we introduce the problem of continual federated learning, where clients incrementally learn new tasks and history data cannot be stored, for reasons such as limited storage or data retention policies. Generative-replay-based methods are effective for continual learning without storing history data, but adapting them to this setting is challenging. By analyzing the behavior of clients during training, we find that the unstable training process caused by distributed training on non-IID data leads to notable performance degradation. To address this problem, we propose our FedCIL model with two simple but effective components: model consolidation and consistency enforcement. Our experimental results on multiple benchmark datasets demonstrate that our method significantly outperforms baselines.
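To make the generative-replay idea concrete, here is a minimal sketch of how a replay batch is typically assembled in continual learning: synthetic samples of previous tasks, drawn from a frozen generator, are mixed with real samples of the current task so the classifier does not forget old classes. This is a generic illustration, not the paper's FedCIL algorithm; `build_replay_batch`, `toy_generator`, and the `replay_ratio` parameter are hypothetical names introduced here, and the Gaussian-blob generator stands in for a trained generative model.

```python
import numpy as np

def build_replay_batch(new_x, new_y, generator, old_classes,
                       replay_ratio=0.5, rng=None):
    """Mix real current-task samples with synthetic samples replayed
    from a (frozen) generator trained on previous tasks.

    `generator(n, classes, rng)` is a hypothetical callable that
    returns (x, y) arrays of n synthetic samples for the old classes.
    """
    rng = rng or np.random.default_rng(0)
    n_replay = int(len(new_x) * replay_ratio)
    gen_x, gen_y = generator(n_replay, old_classes, rng)
    x = np.concatenate([new_x, gen_x])
    y = np.concatenate([new_y, gen_y])
    perm = rng.permutation(len(x))        # shuffle real and replayed samples
    return x[perm], y[perm]

def toy_generator(n, classes, rng):
    """Stand-in for a trained generator: Gaussian blobs per old class."""
    ys = rng.choice(classes, size=n)
    xs = rng.normal(loc=ys[:, None].astype(float), scale=0.1, size=(n, 2))
    return xs, ys

# current task contributes class 5; classes 0 and 1 come from replay
new_x = np.zeros((8, 2))
new_y = np.full(8, 5)
x, y = build_replay_batch(new_x, new_y, toy_generator,
                          old_classes=[0, 1], replay_ratio=0.5)
```

The classifier is then trained on the mixed batch `(x, y)`; because no real history data appears in it, the scheme is compatible with the no-storage constraint described above.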

Daiqing Qi, Handong Zhao, Sheng Li · 2023

Related benchmarks

| Task | Dataset | ACC (%) | Rank |
| --- | --- | --- | --- |
| Federated Domain-Incremental Learning | Digit-10 | 50.95 | 7 |
| Federated Domain-Incremental Learning | VLCS | 48.84 | 7 |
| Federated Domain-Incremental Learning | PACS | 35.11 | 7 |
| Federated Domain-Incremental Learning | DN4IL | 14.29 | 7 |
