
BERT-of-Theseus: Compressing BERT by Progressive Module Replacing

About

In this paper, we propose a novel model compression approach that effectively compresses BERT by progressive module replacing. Our approach first divides the original BERT into several modules and builds a compact substitute for each. Then, we randomly replace the original modules with their substitutes, training the compact modules to mimic the behavior of the originals. We progressively increase the probability of replacement throughout training. In this way, our approach enables a deeper level of interaction between the original and compact models. Compared to previous knowledge distillation approaches for BERT compression, our approach does not introduce any additional loss function. Our approach outperforms existing knowledge distillation approaches on the GLUE benchmark, offering a new perspective on model compression.
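The replacement scheme described above can be sketched as a training loop in which each original module is independently swapped for its compact substitute with a probability that grows over time. The sketch below is a minimal illustration, not the authors' implementation: modules are stood in for by plain callables, the function name and the linear replacement-rate schedule are assumptions, and real training would backpropagate only through the compact modules.

```python
import random

def train_with_module_replacing(orig_modules, compact_modules,
                                steps, p_start=0.5, p_end=1.0):
    """Hypothetical sketch of progressive module replacing.

    At each step, every original module is independently replaced by its
    compact substitute with probability p; p rises linearly from p_start
    to p_end, so training gradually shifts toward the all-compact model.
    Returns the (p, active_modules) pair chosen at each step.
    """
    assert len(orig_modules) == len(compact_modules)
    schedule = []
    for step in range(steps):
        # Linear curriculum for the replacement rate (one simple choice;
        # the paper's exact schedule may differ).
        p = p_start + (p_end - p_start) * step / max(steps - 1, 1)
        active = [
            compact if random.random() < p else orig
            for orig, compact in zip(orig_modules, compact_modules)
        ]
        schedule.append((p, active))
        # In a real loop: run a forward pass through the mixed model with
        # the task loss only, updating just the compact modules.
    return schedule
```

Because the mixed model is trained with the ordinary task loss, no extra distillation loss or temperature hyperparameter is needed; the compact modules learn to be drop-in replacements simply by participating in the forward pass alongside the frozen originals.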

Canwen Xu, Wangchunshu Zhou, Tao Ge, Furu Wei, Ming Zhou • 2020

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Natural Language Understanding | GLUE (dev) | SST-2 (Acc) | 91.5 | 504 |
| Natural Language Understanding | GLUE | SST-2 | 91.5 | 452 |
| Natural Language Understanding | GLUE (test) | SST-2 Accuracy | 92.2 | 416 |
| Natural Language Understanding | SuperGLUE (dev) | Average Score | 66.1 | 91 |
| Natural Language Understanding | GLUE v1 (dev) | MRPC Score | 89 | 30 |
| Natural Language Understanding | GLUE | CoLA Score | 44.7 | 15 |
| Natural Language Processing | GLUE | Red Score | 53 | 8 |
