
Knowledge Distillation for Image Restoration: Simultaneous Learning from Degraded and Clean Images

About

Model compression through knowledge distillation has seen extensive application in classification and segmentation tasks. However, its potential in image-to-image translation, particularly in image restoration, remains underexplored. To address this gap, we propose a Simultaneous Learning Knowledge Distillation (SLKD) framework tailored for model compression in image restoration tasks. SLKD employs a dual-teacher, single-student architecture with two learning strategies applied simultaneously: Degradation Removal Learning (DRL) and Image Reconstruction Learning (IRL). In DRL, the student encoder learns from Teacher A to focus on removing degradation factors, guided by a novel BRISQUE extractor. In IRL, the student decoder learns from Teacher B to reconstruct clean images, with the assistance of a proposed PIQE extractor. Together, these strategies enable the student to learn from degraded and clean images simultaneously, yielding high-quality compression of image restoration models. Experimental results across five datasets and three tasks demonstrate that SLKD reduces FLOPs and parameters by more than 80% while maintaining strong image restoration performance.
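To make the dual-teacher, single-student layout concrete, here is a minimal PyTorch sketch of how such a setup could be wired. Everything in it is an illustrative assumption, not the authors' implementation: the toy EncDec backbone, the 1x1 proj alignment layer, the plain L1 losses, and the routing of the degraded image to Teacher A and the clean image to Teacher B are all stand-ins, and the paper's BRISQUE and PIQE extractors are omitted entirely.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def conv_block(c_in, c_out):
    return nn.Sequential(nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True))

class EncDec(nn.Module):
    """Toy encoder-decoder restoration network (hypothetical backbone)."""
    def __init__(self, width):
        super().__init__()
        self.encoder = conv_block(3, width)
        self.decoder = nn.Sequential(conv_block(width, width),
                                     nn.Conv2d(width, 3, 3, padding=1))
    def forward(self, x):
        feat = self.encoder(x)           # encoder features
        return self.decoder(feat), feat  # (restored image, features)

teacher_a = EncDec(width=64).eval()  # Teacher A: guides degradation removal (DRL)
teacher_b = EncDec(width=64).eval()  # Teacher B: guides image reconstruction (IRL)
student   = EncDec(width=16)         # narrower student: far fewer params/FLOPs

degraded = torch.rand(2, 3, 64, 64)  # dummy degraded input
clean    = torch.rand(2, 3, 64, 64)  # dummy clean ground truth

with torch.no_grad():
    _, feat_a = teacher_a(degraded)  # Teacher A sees the degraded image
    out_b, _  = teacher_b(clean)     # Teacher B sees the clean image (assumption)

out_s, feat_s = student(degraded)

# 1x1 projection aligns the student's narrower features with the teacher's.
proj = nn.Conv2d(16, 64, 1)

loss_drl = F.l1_loss(proj(feat_s), feat_a)  # DRL: match Teacher A's encoder features
loss_irl = F.l1_loss(out_s, out_b)          # IRL: match Teacher B's reconstruction
loss_rec = F.l1_loss(out_s, clean)          # supervised restoration loss
loss = loss_rec + loss_drl + loss_irl
loss.backward()
```

In the full framework, the DRL and IRL terms would compare representations passed through the proposed BRISQUE and PIQE extractors rather than raw activations; plain L1 on raw features is used here only to keep the sketch self-contained and runnable.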

Yongheng Zhang, Danfeng Yan • 2025

Related benchmarks

Task              Dataset    Result        Rank
Image Dehazing    SOTS       PSNR 28.09    141
Image Deraining   Rain100H   PSNR 28.41    15
Image Denoising   SIDD       PSNR 31.72    8
