Loss Gap Parity for Fairness in Heterogeneous Federated Learning

About

While clients may join federated learning to improve performance on data they rarely observe locally, they often remain self-interested, expecting the global model to perform well on their own data. This motivates an objective that ensures all clients achieve a similar loss gap (the difference in performance between the global model and the best model they could train using only their local data). To this end, we propose EAGLE, a novel federated learning algorithm that explicitly regularizes the global model to minimize disparities in loss gaps across clients. Our approach is particularly effective in heterogeneous settings, where the optimal local models of the clients may be misaligned. Unlike existing methods that encourage loss parity, potentially degrading performance for many clients, EAGLE targets fairness in relative improvements. We provide theoretical convergence guarantees for EAGLE under non-convex loss functions, and characterize how its iterates perform relative to the standard federated learning objective using a novel heterogeneity measure. Empirically, we demonstrate that EAGLE reduces the disparity in loss gaps among clients by prioritizing those furthest from their local optimal loss, while maintaining competitive utility in both convex and non-convex cases compared to strong baselines.
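The loss-gap notion from the abstract can be made concrete with a small sketch. Note this is purely illustrative and not the paper's EAGLE algorithm: the function names (`loss_gaps`, `gap_disparity`) and the max-minus-min disparity measure are assumptions chosen for clarity; the paper's actual regularizer and metric (e.g. r_k(θ) in the benchmarks below) may differ.

```python
import numpy as np

def loss_gaps(global_losses, local_opt_losses):
    """Per-client loss gaps: L_k(theta_global) - L_k(theta_k*),
    i.e. how far the global model is from the best purely local model
    on each client's own data."""
    return np.asarray(global_losses, dtype=float) - np.asarray(local_opt_losses, dtype=float)

def gap_disparity(gaps):
    """One possible disparity measure (an assumption, not the paper's):
    the spread between the largest and smallest loss gap. A parity-seeking
    objective would drive this quantity toward zero."""
    return float(np.max(gaps) - np.min(gaps))

# Toy numbers: three clients with different global-model and local-optimal losses.
gaps = loss_gaps([0.9, 0.5, 0.7], [0.4, 0.45, 0.3])
# gaps ≈ [0.5, 0.05, 0.4]; disparity ≈ 0.45
```

A loss-parity objective would instead equalize the global losses themselves (the first argument), which can penalize clients whose tasks are intrinsically harder; equalizing gaps compares each client only against its own attainable baseline.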

Brahim Erraji, Michaël Perrot, Aurélien Bellet • 2026

Related benchmarks

Task | Dataset | Result | Rank
Image Classification | EMNIST Dir(0.1) (test) | Test Accuracy: 69.1 | 28
Federated Learning Image Classification | DirtyMNIST | Max r_k(θ): 0.434 | 12
Federated Learning Classification | EMNIST Dir(alpha=0.1) (test) | Max r_k(θ): 0.063 | 10
Image Classification | EMNIST alpha = 0.1 | Max r_k(θ): 0.203 | 10
