
Channel Pruning for Accelerating Very Deep Neural Networks

About

In this paper, we introduce a new channel pruning method to accelerate very deep convolutional neural networks. Given a trained CNN model, we propose an iterative two-step algorithm to effectively prune each layer, by LASSO-regression-based channel selection and least-squares reconstruction. We further generalize this algorithm to multi-layer and multi-branch cases. Our method reduces the accumulated error and enhances compatibility with various architectures. Our pruned VGG-16 achieves state-of-the-art results: a 5x speed-up with only a 0.3% increase in error. More importantly, our method can accelerate modern networks such as ResNet and Xception, suffering only 1.4% and 1.0% accuracy loss, respectively, under a 2x speed-up, which is significant. Code has been made publicly available.
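The two-step loop above can be illustrated on a single linearized layer. The sketch below is a minimal, assumption-laden toy, not the authors' released code: it treats each input channel's contribution to the layer output as a feature, selects channels with a hand-rolled LASSO (coordinate descent with soft-thresholding, standing in for the paper's channel-selection step), and then refits the weights of the kept channels by least squares. Shapes, the `prune_layer` helper, and the regularization strength `lam` are all illustrative assumptions.

```python
import numpy as np

def lasso_coordinate_descent(Z, y, lam, n_iter=200):
    """Solve min_b 0.5*||y - Z @ b||^2 + lam*||b||_1 by coordinate descent.

    Z: (N, C) per-channel response matrix, y: (N,) target response.
    Returns the channel-selection coefficients b (exact zeros drop channels).
    """
    C = Z.shape[1]
    b = np.zeros(C)
    col_sq = (Z ** 2).sum(axis=0)          # per-column squared norms
    r = y - Z @ b                          # current residual
    for _ in range(n_iter):
        for c in range(C):
            if col_sq[c] == 0:
                continue
            r += Z[:, c] * b[c]            # remove channel c from the fit
            rho = Z[:, c] @ r
            # soft-threshold update: large lam pushes b[c] to exactly 0
            b[c] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[c]
            r -= Z[:, c] * b[c]            # add channel c back
    return b

def prune_layer(X, W, lam):
    """Toy channel pruning for one linearized layer (illustrative only).

    X: (N, C, k) per-sample, per-input-channel patch vectors.
    W: (C, k, n) per-channel weight slices; output Y = sum_c X[:, c] @ W[c].
    Returns (kept channel indices, refitted weights for those channels).
    """
    N, C, k = X.shape
    n = W.shape[2]
    Y = np.einsum('nck,ckm->nm', X, W)     # original layer output
    # Step 1: LASSO channel selection on per-channel contributions.
    Z = np.stack([(X[:, c] @ W[c]).ravel() for c in range(C)], axis=1)
    beta = lasso_coordinate_descent(Z, Y.ravel(), lam)
    keep = np.nonzero(beta)[0]
    # Step 2: least-squares reconstruction of Y from the kept channels.
    A = X[:, keep].reshape(N, -1)          # (N, |keep| * k)
    W_flat, *_ = np.linalg.lstsq(A, Y, rcond=None)
    return keep, W_flat.reshape(len(keep), k, n)
```

With `lam = 0` no channel is thresholded away and the least-squares refit reproduces the original output exactly; raising `lam` zeroes out weak channels, and the refit then minimizes the reconstruction error of the pruned layer, which is the intuition behind the paper's two-step iteration.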

Yihui He, Xiangyu Zhang, Jian Sun · 2017

Related benchmarks

Task                 | Dataset           | Result                | Rank
Image Classification | CIFAR-10 (test)   | Accuracy: 93.67       | 3381
Image Classification | ImageNet-1k (val) | Top-1 Accuracy: 72.3  | 1453
Image Classification | ImageNet (val)    | Top-1 Acc: 73.3       | 1206
Image Classification | CIFAR-10 (test)   | Accuracy: 92.5        | 906
Image Classification | ImageNet-1k (val) | Top-1 Accuracy: 73.3  | 840
Image Classification | ImageNet          | Top-1 Accuracy: -3.68 | 429
Image Classification | ImageNet-1k (val) | CR-F: 41.86           | 57
Image Classification | ImageNet (val)    | T1: 75.06             | 45
Image Classification | ImageNet-1k (val) | --                    | 35
Image Classification | CIFAR-10          | Delta-Top1: -0.47     | 28

Showing 10 of 17 rows
