
Cross-Architecture Distillation Made Simple with Redundancy Suppression

About

We describe a simple method for cross-architecture knowledge distillation, in which knowledge transfer is cast as a redundant-information suppression problem. Existing methods introduce sophisticated modules, architecture-tailored designs, and excessive parameters, which impair their efficiency and applicability. We propose to extract the architecture-agnostic knowledge in heterogeneous representations by reducing the redundant architecture-exclusive information. To this end, we present a simple redundancy suppression distillation (RSD) loss, which comprises cross-architecture invariance maximisation and feature decorrelation objectives. To prevent the student from entirely losing its architecture-specific capabilities, we further design a lightweight module that decouples the RSD objective from the student's internal representations. Our method dispenses with the architecture-specific designs and complex operations of the pioneering OFA method. It outperforms OFA on the CIFAR-100 and ImageNet-1K benchmarks with only a fraction of its parameter overhead, which highlights its potential as a simple and strong baseline for cross-architecture distillation.
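The abstract names two objectives, cross-architecture invariance maximisation and feature decorrelation, plus a lightweight module that decouples the RSD objective from the student's internal representations. The sketch below is one plausible reading of that description, assuming a PyTorch setting and a Barlow Twins-style cross-correlation formulation; the class name RSDLoss, the projector layers, and the off_diag_weight hyperparameter are illustrative assumptions, not the paper's released code.

import torch
import torch.nn as nn


class RSDLoss(nn.Module):
    """Sketch of a redundancy-suppression distillation loss:
    invariance maximisation on the diagonal, decorrelation off the diagonal."""

    def __init__(self, student_dim: int, teacher_dim: int,
                 proj_dim: int = 256, off_diag_weight: float = 5e-3):
        super().__init__()
        # Lightweight projectors keep the RSD objective decoupled from the
        # student's internal representation (hypothetical design choice).
        self.student_proj = nn.Linear(student_dim, proj_dim)
        self.teacher_proj = nn.Linear(teacher_dim, proj_dim)
        self.off_diag_weight = off_diag_weight

    def forward(self, student_feat: torch.Tensor, teacher_feat: torch.Tensor) -> torch.Tensor:
        # Project both representations into a shared space and standardise per dimension.
        zs = self.student_proj(student_feat)
        zt = self.teacher_proj(teacher_feat.detach())
        zs = (zs - zs.mean(0)) / (zs.std(0) + 1e-6)
        zt = (zt - zt.mean(0)) / (zt.std(0) + 1e-6)

        n = zs.size(0)
        c = (zs.T @ zt) / n  # cross-correlation between student and teacher dimensions

        # Invariance: diagonal entries pushed to 1 (architecture-agnostic agreement).
        invariance = (torch.diagonal(c) - 1.0).pow(2).sum()
        # Redundancy suppression: off-diagonal entries pushed to 0 (decorrelation).
        off_diag = c.pow(2).sum() - torch.diagonal(c).pow(2).sum()
        return invariance + self.off_diag_weight * off_diag

In a training loop, such a term would typically be added to the student's standard task loss, with student and teacher features drawn from heterogeneous backbones (e.g. a CNN student and a ViT teacher); the exact feature locations and weighting follow the paper, not this sketch.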

Weijia Zhang, Yuehao Liu, Wu Ran, Chao Ma · 2025

Related benchmarks

Task                 | Dataset          | Metric        | Result | Rank
Image Classification | ImageNet-1K      | Top-1 Acc (%) | 72.36  | 836
Image Classification | CIFAR-100 (test) | Accuracy (%)  | 83.92  | 206
Image Classification | ImageNet-1K      | Top-1 Acc (%) | 73.08  | 75
