
Beyond Student: An Asymmetric Network for Neural Network Inheritance

About

Knowledge Distillation (KD) has emerged as a powerful technique for model compression, enabling lightweight student networks to benefit from the performance of larger, over-parameterized teacher networks. However, the inherent capacity gap often limits the performance of student networks. Inspired by the expressiveness of pretrained teacher networks, a compelling research question arises: is there a type of network that can not only inherit the teacher's structure but also maximize the inheritance of its knowledge? Furthermore, how does the performance of such an inheriting network compare to that of student networks that benefit from the same teacher network? To explore these questions, we propose InherNet, a neural network inheritance method that performs asymmetric low-rank decomposition on the teacher's weights and reconstructs a lightweight yet expressive network without significant architectural disruption. By leveraging Singular Value Decomposition (SVD) for initialization to ensure the inheritance of principal knowledge, InherNet effectively balances depth, width, and compression efficiency. Experimental results across unimodal and multimodal tasks demonstrate that InherNet outperforms student networks of similar parameter size. Our findings reveal a promising direction for future research in efficient model compression beyond traditional distillation.
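The core idea in the abstract, replacing a teacher weight matrix with a low-rank factorization initialized from its SVD so that the principal components of the teacher's knowledge are inherited, can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, the square symmetric split of the singular values, and the layer sizes are illustrative assumptions.

```python
import numpy as np

def inherit_low_rank(teacher_weight: np.ndarray, rank: int):
    """Factor a teacher weight matrix W (out x in) into two thin
    matrices A (out x rank) and B (rank x in) via truncated SVD,
    so that A @ B retains the top-`rank` principal components of W.

    Illustrative sketch of SVD-based inheritance, not the paper's code.
    """
    U, S, Vt = np.linalg.svd(teacher_weight, full_matrices=False)
    sqrt_S = np.sqrt(S[:rank])
    A = U[:, :rank] * sqrt_S           # (out, rank): left factor
    B = sqrt_S[:, None] * Vt[:rank]    # (rank, in): right factor
    return A, B

# Example: compress a hypothetical 512x512 teacher layer to rank 64.
rng = np.random.default_rng(0)
W = rng.normal(size=(512, 512))
A, B = inherit_low_rank(W, rank=64)
params_teacher = W.size            # 512 * 512 = 262144
params_inherited = A.size + B.size # 2 * 512 * 64 = 65536, a 4x reduction
```

Because truncated SVD yields the best rank-`r` approximation in Frobenius norm, initializing the two thin layers this way starts the compressed network as close to the teacher's linear map as any rank-`r` factorization can be; an asymmetric split would distribute the singular values unevenly between the two factors instead of the square-root split used here.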

Yiyun Zhou, Jingwei Shi, Mingjing Xu, Zhonghua Jiang, Jingyuan Chen• 2026

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Natural Language Understanding | GLUE (val) | SST-2 | 89.07 | 170 |
| Math Reasoning | GSM8K (test) | Accuracy | 77.19 | 155 |
| Text-to-Image Retrieval | CC3M (val) | R@1 | 33.01 | 5 |
| Image-to-Text Retrieval | CC3M (val) | R@1 | 32.17 | 5 |
| General Instruction Following | MMLU-style (test) | Accuracy | 49.82 | 3 |
| Visual Question Answering | VQA v2.0 | Overall Accuracy | 57.03 | 2 |
