
MH-pFLID: Model Heterogeneous personalized Federated Learning via Injection and Distillation for Medical Data Analysis

About

Federated learning is widely used in medical applications to train global models without requiring access to local data. However, varying computational capabilities and network architectures across clients (system heterogeneity) pose significant challenges to effectively aggregating information from non-independently and identically distributed (non-IID) data. Current federated learning methods that use knowledge distillation require public datasets, raising privacy and data-collection issues. These datasets also demand additional local compute and storage, a burden for medical institutions with limited hardware. In this paper, we introduce a novel federated learning paradigm named Model Heterogeneous personalized Federated Learning via Injection and Distillation (MH-pFLID). Our framework leverages a lightweight messenger model that carries concentrated information collected from each client. We also develop a set of receiver and transmitter modules to exchange information with the messenger model, so that information can be injected and distilled efficiently.
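To make the messenger idea concrete, here is a minimal sketch, not the paper's exact algorithm: each client shares only a small equal-size "messenger" parameter vector, the server averages the messengers, and each client then pulls its own heterogeneous local model toward the aggregated messenger. The function names (`aggregate_messengers`, `distill_step`) and the mixing weight `alpha` are illustrative assumptions, standing in for the paper's receiver/transmitter modules.

```python
def aggregate_messengers(messengers):
    """Server step: average messenger parameters across clients.

    Each client uploads an equal-size messenger vector; only these
    lightweight vectors are shared, never the heterogeneous local models.
    """
    n = len(messengers)
    return [sum(vals) / n for vals in zip(*messengers)]

def distill_step(local_params, messenger, alpha=0.5):
    """Client step: pull local parameters toward the aggregated messenger.

    A simple convex combination stands in for the paper's
    transmitter/receiver injection-and-distillation modules.
    """
    return [(1 - alpha) * p + alpha * m
            for p, m in zip(local_params, messenger)]

# Three clients with heterogeneous local models share equal-size messengers.
messengers = [[0.0, 1.0], [2.0, 1.0], [4.0, 4.0]]
global_messenger = aggregate_messengers(messengers)            # [2.0, 2.0]
updated = distill_step([10.0, 0.0], global_messenger, alpha=0.5)  # [6.0, 1.0]
```

The key design point this illustrates is that communication cost scales with the messenger's size, not with each client's (potentially large and architecturally different) local model.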

Luyuan Xie, Manqing Lin, Tianyu Luan, Cong Li, Yuejian Fang, Qingni Shen, Zhonghai Wu · 2024

Related benchmarks

Task                         | Dataset                             | Result                | Rank
Optic Disc/Cup Segmentation  | Optic Disc/Cup, 7 datasets (test)   | Client 1 Score 0.8995 | 22
Prostate Segmentation        | Prostate MRI, 6 institutions (test) | Client 1 Score 76.43  | 21
Polyp Segmentation           | Polyp Segmentation (test)           | Client 1 Score 68.94  | 19
