
Patient Knowledge Distillation for BERT Model Compression

About

Pre-trained language models such as BERT have proven to be highly effective for natural language processing (NLP) tasks. However, the high demand for computing resources in training such models hinders their application in practice. In order to alleviate this resource hunger in large-scale model training, we propose a Patient Knowledge Distillation approach to compress an original large model (teacher) into an equally-effective lightweight shallow network (student). Different from previous knowledge distillation methods, which only use the output from the last layer of the teacher network for distillation, our student model patiently learns from multiple intermediate layers of the teacher model for incremental knowledge extraction, following two strategies: ($i$) PKD-Last: learning from the last $k$ layers; and ($ii$) PKD-Skip: learning from every $k$ layers. These two patient distillation schemes enable the exploitation of rich information in the teacher's hidden layers, and encourage the student model to patiently learn from and imitate the teacher through a multi-layer distillation process. Empirically, this translates into improved results on multiple NLP tasks with significant gain in training efficiency, without sacrificing model accuracy.
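The two strategies can be made concrete with a small sketch. For a 12-layer teacher distilled into a 6-layer student, PKD-Skip pairs the student's intermediate layers with teacher layers {2, 4, 6, 8, 10}, while PKD-Last uses {7, 8, 9, 10, 11}; the student's top layer is matched to the teacher's top layer through the usual soft-label distillation loss. The patient-teacher (PT) term is a mean-squared error between L2-normalized hidden states over these layer pairs. Function names below are illustrative, not from the paper's released code:

```python
import numpy as np

def pkd_layer_map(n_teacher: int, n_student: int, strategy: str) -> list:
    """Teacher layer indices (1-based) each student hidden layer imitates."""
    # The student's last layer is paired with the teacher's last layer via
    # the standard distillation loss, so only n_student - 1 layers are mapped.
    m = n_student - 1
    if strategy == "skip":            # PKD-Skip: every k-th teacher layer
        k = n_teacher // n_student
        return [k * (i + 1) for i in range(m)]
    if strategy == "last":            # PKD-Last: the last m layers below the top
        return list(range(n_teacher - m, n_teacher))
    raise ValueError("strategy must be 'skip' or 'last'")

def patient_loss(h_student, h_teacher):
    """PT loss: MSE between L2-normalized hidden states of mapped layer pairs."""
    loss = 0.0
    for hs, ht in zip(h_student, h_teacher):
        hs = hs / np.linalg.norm(hs)
        ht = ht / np.linalg.norm(ht)
        loss += np.sum((hs - ht) ** 2)
    return loss / len(h_student)

print(pkd_layer_map(12, 6, "skip"))   # [2, 4, 6, 8, 10]
print(pkd_layer_map(12, 6, "last"))   # [7, 8, 9, 10, 11]
```

In practice the hidden states passed to `patient_loss` would be the `[CLS]`-token representations from the selected layers, and the PT term is added to the cross-entropy and soft-label distillation losses with weighting hyperparameters.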

Siqi Sun, Yu Cheng, Zhe Gan, Jingjing Liu · 2019

Related benchmarks

| Task | Dataset | Result | Rank |
|---|---|---|---|
| Image Classification | CIFAR-100 (test) | - | 3518 |
| Image Classification | ImageNet (val) | Top-1 Acc: 76.16 | 1206 |
| Natural Language Understanding | GLUE (dev) | SST-2 (Acc): 91.3 | 504 |
| Natural Language Understanding | GLUE (test) | SST-2 Accuracy: 92 | 416 |
| Question Answering | SQuAD v1.1 (dev) | F1 Score: 85.3 | 375 |
| Question Answering | SQuAD v2.0 (dev) | F1: 69.8 | 158 |
| Intent Classification | Banking77 (test) | Accuracy: 89.87 | 151 |
| Question Classification | TREC (test) | Accuracy: 96.4 | 124 |
| Topic Classification | AG News (test) | Accuracy: 94.54 | 98 |
| Natural Language Understanding | SuperGLUE (dev) | Average Score: 66.2 | 91 |

Showing 10 of 18 rows.
