
Uncertainty-Aware Multi-Expert Knowledge Distillation for Imbalanced Disease Grading

About

Automatic disease image grading is a significant application of artificial intelligence in healthcare, enabling faster and more accurate patient assessment. However, domain shifts, exacerbated by data imbalance, introduce bias into the model and hinder deployment in clinical applications. To address this problem, we propose a novel Uncertainty-aware Multi-Expert Knowledge Distillation (UMKD) framework that transfers knowledge from multiple expert models to a single student model. Specifically, to extract discriminative features, UMKD decouples task-agnostic and task-specific features via shallow and compact feature alignment in the feature space. In the output space, an uncertainty-aware decoupled distillation (UDD) mechanism dynamically adjusts knowledge-transfer weights according to each expert model's uncertainty, ensuring robust and reliable distillation. UMKD also tackles model-architecture heterogeneity and distribution discrepancies between source and target domains, which previous KD approaches handle inadequately. Extensive experiments on histology prostate grading (SICAPv2) and fundus image grading (APTOS) demonstrate that UMKD achieves a new state of the art in both source-imbalanced and target-imbalanced scenarios, offering a robust and practical solution for real-world disease image grading.
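To make the UDD idea concrete, below is a minimal sketch of uncertainty-weighted multi-expert distillation. It is an illustration under our own assumptions, not the paper's exact formulation: expert uncertainty is approximated by the mean predictive entropy of each expert's softmax outputs, and the per-expert distillation weights are a normalized `exp(-entropy)`, so more confident experts contribute more to the student's KL loss. All function names are hypothetical.

```python
import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def entropy(p, axis=-1):
    # Shannon entropy of a probability distribution.
    return -(p * np.log(p + 1e-12)).sum(axis=axis)

def uncertainty_weights(expert_logits):
    """Per-expert weights inversely related to predictive entropy.

    expert_logits: array of shape (num_experts, batch, num_classes).
    Returns a (num_experts,) weight vector summing to 1.
    """
    probs = softmax(expert_logits)
    u = entropy(probs).mean(axis=1)   # mean entropy per expert (assumed proxy for uncertainty)
    w = np.exp(-u)                    # confident (low-entropy) experts get larger weight
    return w / w.sum()

def distill_loss(student_logits, expert_logits, T=2.0):
    """Uncertainty-weighted sum of KL(expert || student) at temperature T."""
    w = uncertainty_weights(expert_logits)
    s = softmax(student_logits / T)
    loss = 0.0
    for k in range(expert_logits.shape[0]):
        t = softmax(expert_logits[k] / T)
        kl = (t * (np.log(t + 1e-12) - np.log(s + 1e-12))).sum(axis=-1).mean()
        loss += w[k] * kl
    return loss
```

In this sketch a near-uniform (high-entropy) expert is automatically down-weighted, which is the qualitative behavior the UDD mechanism aims for; the actual UMKD loss additionally involves the feature-space alignment terms described above.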

Shuo Tong, Shangde Gao, Ke Liu, Zihang Huang, Hongxia Xu, Haochao Ying, Jian Wu • 2025

Related benchmarks

Task | Dataset | Result | Rank
Prostate cancer grading | SICAPv2 target-imbalanced KD (test) | Overall Accuracy: 91.75 | 9
Fundus image grading | APTOS target-imbalanced | Overall Accuracy: 83.91 | 9
Prostate cancer grading | SICAPv2 source-imbalanced KD (test) | Overall Accuracy: 91.02 | 9
Fundus image grading | APTOS source-imbalanced | Overall Accuracy: 74.61 | 9
