
Distilling foundation models for robust and efficient models in digital pathology

About

In recent years, foundation models (FMs) for digital pathology have relied heavily on scaling pre-training datasets and model size, yielding large and powerful models. While this improved performance on diverse downstream tasks, it also increased computational cost and inference time. In this work, we explore distilling a large foundation model into a smaller one, reducing the number of parameters by several orders of magnitude. Leveraging distillation techniques, our distilled model, H0-mini, achieves nearly comparable performance to large FMs at a significantly reduced inference cost. Evaluated on several public benchmarks, it achieves 3rd place on the HEST benchmark and 5th place on the EVA benchmark. Additionally, a robustness analysis conducted on the PLISM dataset demonstrates that our distilled model is highly robust to variations in staining and scanning conditions, significantly outperforming other state-of-the-art models. This opens new perspectives for designing lightweight and robust models for digital pathology without compromising on performance.
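The core teacher-student idea behind this kind of distillation can be sketched in a few lines. This is a minimal, hypothetical illustration only: tiny MLPs stand in for the large teacher FM and the small student, the dimensions are made up, and the cosine-distance objective is one common choice for feature-level distillation, not necessarily the paper's exact recipe.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical stand-ins: the paper distills a large pathology ViT (teacher)
# into a much smaller one (the student); tiny MLPs keep this sketch
# self-contained. All dimensions here are illustrative.
teacher = nn.Sequential(nn.Linear(64, 128), nn.GELU(), nn.Linear(128, 128))
student = nn.Sequential(nn.Linear(64, 32), nn.GELU(), nn.Linear(32, 32))
proj = nn.Linear(32, 128)  # maps student embeddings into the teacher's space

teacher.eval()
for p in teacher.parameters():
    p.requires_grad_(False)  # the teacher stays frozen during distillation

opt = torch.optim.AdamW(
    list(student.parameters()) + list(proj.parameters()), lr=1e-2
)

def distill_step(x: torch.Tensor) -> float:
    """One optimization step matching student embeddings to the teacher's."""
    with torch.no_grad():
        t = teacher(x)
    s = proj(student(x))
    # Cosine-distance objective between teacher and student features.
    loss = 1.0 - F.cosine_similarity(s, t, dim=-1).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

x = torch.randn(8, 64)  # a batch standing in for patch embeddings
losses = [distill_step(x) for _ in range(50)]
```

Because only the student (and a small projection head) receives gradients, the distilled model can be trained cheaply and then deployed alone, which is where the inference-cost savings come from.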

Alexandre Filiot, Nicolas Dop, Oussama Tchita, Auriane Riou, Rémy Dubois, Thomas Peeters, Daria Valter, Marin Scalbert, Charlie Saillard, Geneviève Robin, Antoine Olivier · 2025

Related benchmarks

| Task                               | Dataset     | Metric            | Result | Rank |
|------------------------------------|-------------|-------------------|--------|------|
| Slide-level classification         | Camelyon16  | –                 | –      | 52   |
| Slide-level classification         | PANDA       | Balanced Accuracy | 66.7   | 11   |
| Patch-level classification         | CRC         | Balanced Accuracy | 96.1   | 11   |
| Spatial transcriptomics prediction | HEST (test) | IDC               | 0.5909 | 11   |
| Patch-level classification         | PCAM        | Balanced Accuracy | 94.2   | 11   |
| Multi-task performance evaluation  | EVA         | Mean All          | 78.2   | 11   |
| Patch-level classification         | BACH        | Balanced Accuracy | 77.4   | 11   |
| Segmentation                       | MoNuSAC     | MONAI Dice Score  | 64.3   | 11   |
| Segmentation                       | CoNSeP      | Dice Score        | 62.9   | 11   |
| Patch-level classification         | MHIST       | Balanced Accuracy | 79     | 11   |

Showing 10 of 11 rows.

Other info

Code
