
TARGET: Federated Class-Continual Learning via Exemplar-Free Distillation

About

This paper focuses on an under-explored yet important problem: Federated Class-Continual Learning (FCCL), where new classes are dynamically added in federated learning. Existing FCCL works suffer from various limitations, such as requiring additional datasets or storing private data from previous tasks. In response, we first demonstrate that non-IID data exacerbates the catastrophic forgetting issue in FL. Then we propose a novel method called TARGET (federaTed clAss-continual leaRninG via Exemplar-free disTillation), which alleviates catastrophic forgetting in FCCL while preserving client data privacy. Our proposed method leverages the previously trained global model to transfer knowledge of old tasks to the current task at the model level. Moreover, a generator is trained to produce synthetic data that simulates the global data distribution on each client at the data level. Compared to previous FCCL methods, TARGET does not require any additional datasets or stored real data from previous tasks, which makes it ideal for data-sensitive scenarios.
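The abstract describes two components: distilling old-task knowledge from the frozen previous global model, and a generator that synthesizes data approximating the global distribution. A minimal PyTorch sketch of one client update combining these ideas is below; the function and parameter names (`train_step`, `lam`, `z_dim`) and the exact loss weighting are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between temperature-softened student and teacher outputs."""
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    return F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature ** 2

def train_step(current_model, prev_global_model, generator, optimizer,
               real_x, real_y, z_dim=100, lam=1.0):
    """One local update: CE on the client's real current-task data plus
    distillation from the frozen previous global model on synthetic data.
    `lam` (assumed name) balances the two terms."""
    # Cross-entropy on real data for the current task.
    ce = F.cross_entropy(current_model(real_x), real_y)

    # Synthetic samples meant to simulate the global distribution of old tasks.
    z = torch.randn(real_x.size(0), z_dim)
    with torch.no_grad():
        synth_x = generator(z)
        teacher_logits = prev_global_model(synth_x)

    # Transfer old-task knowledge at the model level via distillation.
    kd = distillation_loss(current_model(synth_x), teacher_logits)

    loss = ce + lam * kd
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch the previous global model and the generator are frozen during the client update; only the current model's parameters receive gradients, which matches the exemplar-free setting since no real data from earlier tasks is stored.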

Jie Zhang, Chen Chen, Weiming Zhuang, Lingjuan Lyu • 2023

Related benchmarks

Task | Dataset | Result | Rank
Federated Class-Incremental Learning | Tiny-ImageNet, 10 tasks (20 classes per task, test) | FAA 72.6 | 54
Federated Class-Incremental Learning | CIFAR-100, quantity-based label imbalance | FAA 60.9 | 42
Federated Class-Incremental Learning | CIFAR-100, distribution-based label imbalance | FAA 66.1 | 39
Federated Class-Incremental Learning | ImageNet-R | FAA (β=0.5) 54.65 | 13
Federated Class-Incremental Learning | CIFAR-100 | FAA (β=0.5) 74.72 | 13
Federated Class-Incremental Learning | CIFAR-100, α=6 | FAA 60.9 | 5
