Channel Pruning for Accelerating Very Deep Neural Networks
About
In this paper, we introduce a new channel pruning method to accelerate very deep convolutional neural networks. Given a trained CNN model, we propose an iterative two-step algorithm that effectively prunes each layer, using LASSO-regression-based channel selection and least-squares reconstruction. We further generalize this algorithm to multi-layer and multi-branch cases. Our method reduces the accumulated error and enhances compatibility with various architectures. Our pruned VGG-16 achieves state-of-the-art results, a 5× speed-up with only a 0.3% increase in error. More importantly, our method can accelerate modern networks like ResNet and Xception, suffering only 1.4% and 1.0% accuracy loss respectively under a 2× speed-up. Code has been made publicly available.
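The two-step alternation described above can be sketched in numpy. This is a hedged illustration, not the authors' released code: channel selection is done with a small coordinate-descent LASSO over per-channel output contributions, and the retained channels' weights are then refit by least squares. The array shapes and the `prune_layer` helper are assumptions made for the example.

```python
import numpy as np

def soft_threshold(rho, alpha):
    """Soft-thresholding operator used by coordinate-descent LASSO."""
    return np.sign(rho) * max(abs(rho) - alpha, 0.0)

def lasso_cd(F, y, alpha, iters=300):
    """Minimize 0.5*||y - F @ beta||^2 + alpha*||beta||_1 by coordinate descent."""
    beta = np.zeros(F.shape[1])
    col_sq = (F ** 2).sum(axis=0)
    r = y - F @ beta
    for _ in range(iters):
        for j in range(F.shape[1]):
            if col_sq[j] == 0.0:
                continue  # an all-zero feature keeps a zero coefficient
            r += F[:, j] * beta[j]  # remove channel j's current contribution
            beta[j] = soft_threshold(F[:, j] @ r, alpha) / col_sq[j]
            r -= F[:, j] * beta[j]  # add it back with the updated weight
    return beta

def prune_layer(X, W, Y, alpha=1.0):
    """One pruning pass for a single layer (illustrative shapes, hypothetical API).

    X : (N, c, k)  per-channel input patches (k values per channel and position)
    W : (c, k, n)  per-channel slices of the layer's weights
    Y : (N, n)     the layer's original output responses
    Returns (kept channel indices, least-squares reconstructed weights).
    """
    N, c, k = X.shape
    # Step 1 (channel selection): per-channel contributions
    # Z[:, i, :] = X[:, i, :] @ W[i]; LASSO over the c channel scalings beta.
    Z = np.einsum('sik,ikn->sin', X, W)
    F = Z.transpose(0, 2, 1).reshape(-1, c)  # stack output units as samples
    beta = lasso_cd(F, Y.reshape(-1), alpha)
    keep = np.flatnonzero(np.abs(beta) > 1e-8)
    # Step 2 (reconstruction): refit the kept channels' weights by least squares.
    A = X[:, keep, :].reshape(N, -1)
    W_new, *_ = np.linalg.lstsq(A, Y, rcond=None)
    return keep, W_new.reshape(len(keep), k, -1)
```

In the paper the two steps alternate, with the sparsity penalty gradually tightened until the target channel count is reached; the sketch above shows a single pass.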
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Image Classification | CIFAR-10 (test) | Accuracy: 93.67 | 3381 |
| Image Classification | ImageNet-1k (val) | Top-1 Accuracy: 72.3 | 1453 |
| Image Classification | ImageNet (val) | Top-1 Acc: 73.3 | 1206 |
| Image Classification | CIFAR-10 (test) | Accuracy: 92.5 | 906 |
| Image Classification | ImageNet-1k (val) | Top-1 Accuracy: 73.3 | 840 |
| Image Classification | ImageNet | Top-1 Accuracy: -3.68 | 429 |
| Image Classification | ImageNet-1k (val) | CR-F41.86 | 57 |
| Image Classification | ImageNet (val) | T1: 75.06 | 45 |
| Image Classification | ImageNet-1k (val) | -- | 35 |
| Image Classification | CIFAR-10 | Delta-Top1: -0.47 | 28 |