
Fed-Focal Loss for imbalanced data classification in Federated Learning

About

The Federated Learning setting has a central server coordinating the training of a model on a network of devices. One of the challenges is variable training performance when the dataset has a class imbalance. In this paper, we address this by introducing a new loss function called Fed-Focal Loss. We propose to address the class imbalance by reshaping cross-entropy loss such that it down-weights the loss assigned to well-classified examples, along the lines of focal loss. Additionally, by leveraging a tunable sampling framework, we take into account selective client model contributions on the central server to further focus the model during training and hence improve its robustness. Using a detailed experimental analysis with the VIRTUAL (Variational Federated Multi-Task Learning) approach, we demonstrate consistently superior performance in both the balanced and unbalanced scenarios on the MNIST, FEMNIST, VSN and HAR benchmarks. We obtain a more than 9% (absolute percentage) improvement on the unbalanced MNIST benchmark. We further show that our technique can be adopted across multiple Federated Learning algorithms to obtain improvements.
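The reshaped cross-entropy described above follows the standard focal loss formulation, FL(p_t) = -alpha_t (1 - p_t)^gamma log(p_t). The sketch below is a minimal, illustrative binary version (not the paper's code); the `gamma` and `alpha` parameter names follow the original focal loss convention:

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss for a single prediction.

    p:     predicted probability of the positive class
    y:     ground-truth label, 0 or 1
    gamma: focusing parameter; gamma = 0 recovers cross-entropy
    alpha: class-balance weight for the positive class
    """
    # p_t is the probability the model assigned to the true class
    p_t = p if y == 1 else 1.0 - p
    a_t = alpha if y == 1 else 1.0 - alpha
    # (1 - p_t)^gamma down-weights well-classified examples,
    # so training focuses on the hard, misclassified ones
    return -a_t * (1.0 - p_t) ** gamma * math.log(p_t)
```

For example, a well-classified positive (p = 0.9) contributes far less loss under gamma = 2 than a harder one (p = 0.6), which is the down-weighting effect the abstract refers to.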

Dipankar Sarkar, Ankur Narang, Sumit Rai • 2020

Related benchmarks

| Task                         | Dataset                         | Result                 | Rank |
|------------------------------|---------------------------------|------------------------|------|
| Image Classification         | ISIC 2019 (test)                | -                      | 43   |
| Medical Image Classification | SC (Skin Cancer) (test)         | Accuracy 54.23         | 33   |
| Medical Image Classification | BT (Brain Tumor) (test)         | Accuracy 67.26         | 31   |
| Medical Image Classification | Real (last five rounds average) | BACC 44                | 11   |
| Medical Image Classification | ICH (test)                      | Balanced Accuracy 63.04 | 11  |
| Medical Image Classification | real (test)                     | Accuracy 59.97         | 7    |
| Medical Image Classification | BT2 (whole dataset)             | Global Accuracy 0.7999 | 7    |
