
$f$-FUM: Federated Unlearning via min-max and $f$-divergence

About

Federated Learning (FL) has emerged as a powerful paradigm for collaborative machine learning across decentralized data sources, preserving privacy by keeping data local. However, growing legal and ethical demands, such as the "right to be forgotten", and the need to mitigate data poisoning attacks have underscored the urgent necessity for principled data unlearning in FL. Unlike centralized settings, the distributed nature of FL complicates the removal of individual data contributions. In this paper, we propose a novel federated unlearning framework formulated as a min-max optimization problem, where the objective is to maximize an $f$-divergence between the model trained with all data and the model retrained without specific data points, while minimizing the degradation on retained data. Our framework acts as a plug-in that can be added to almost any federated setup, unlike state-of-the-art methods such as \cite{10269017}, which requires model degradation on the server, or \cite{khalil2025notfederatedunlearningweight}, which requires access to the model architecture and weights. This formulation allows for efficient approximation of data removal effects in a federated setting. We provide empirical evaluations showing that our method achieves significant speedups over naive retraining, with minimal impact on utility.
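One plausible reading of the min-max formulation described above, with illustrative notation (the symbols $\theta$, $\lambda$, $T$, $\mathcal{D}_{\mathrm{retain}}$ are ours, not taken from the paper): the unlearned model minimizes loss on retained data while being pushed away, in $f$-divergence, from the all-data model; the inner maximization arises from the variational (Fenchel dual) form commonly used to estimate $f$-divergences.

```latex
% Sketch: unlearned model theta trades off utility on retained data
% against divergence from the all-data model theta_full.
\min_{\theta}\;
  \mathcal{L}\big(\theta;\,\mathcal{D}_{\mathrm{retain}}\big)
  \;-\; \lambda\, D_f\!\big(P_{\theta} \,\big\|\, P_{\theta_{\mathrm{full}}}\big),
\qquad
% with the f-divergence expressed via its variational representation,
% which supplies the inner maximization over a critic T:
D_f(P \,\|\, Q) \;=\; \sup_{T}\;
  \mathbb{E}_{x\sim P}\big[T(x)\big]
  \;-\; \mathbb{E}_{x\sim Q}\big[f^{*}\!\big(T(x)\big)\big].
```

Here $f^{*}$ denotes the convex conjugate of $f$; this is a sketch of the general min-max structure, not the paper's exact objective.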

Radmehr Karimian, Amirhossein Bagheri, Meghdad Kurmanji, Nicholas D. Lane, Gholamali Aminian • 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | FashionMNIST (test) | - | - | 218 |
| Federated Unlearning | CIFAR-10 (test) | Pretrain Accuracy | 80 | 20 |
| Federated Unlearning | Fashion MNIST (test) | Pretrain Accuracy | 99.2 | 20 |
| Federated Unlearning | MNIST (test) | Pretrain Accuracy | 99.13 | 20 |
| Federated Unlearning | CIFAR-10 | Pretrain Accuracy | 68.7 | 20 |
| Federated Unlearning | Fashion MNIST | Pretrain Accuracy | 65.93 | 20 |
| Image Classification | CIFAR-10 (test) | Pretrain Accuracy | 60.44 | 20 |
| Image Classification | MNIST (test) | Pretrain Accuracy | 77.48 | 15 |
| Federated Unlearning | CIFAR-10 (val) | Pretrain Accuracy | 63.64 | 12 |
| Federated Unlearning | MNIST (val) | Pretrain Accuracy | 99.14 | 12 |

Showing 10 of 23 rows
