
Convolutional Neural Network Pruning with Structural Redundancy Reduction

About

Convolutional neural network (CNN) pruning has become one of the most successful network compression approaches in recent years. Existing works on network pruning usually focus on removing the least important filters in the network to achieve compact architectures. In this study, we claim that identifying structural redundancy plays a more essential role than finding unimportant filters, both theoretically and empirically. We first statistically model the network pruning problem from a redundancy reduction perspective and find that pruning in the layer(s) with the most structural redundancy outperforms pruning the least important filters across all layers. Based on this finding, we then propose a network pruning approach that identifies the structural redundancy of a CNN and prunes filters in the selected layer(s) with the most redundancy. Experiments on various benchmark network architectures and datasets show that our proposed approach significantly outperforms the previous state-of-the-art.
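The two-step idea in the abstract — first score each layer's structural redundancy, then prune filters only in the most redundant layer — can be sketched as follows. This is a minimal illustration, not the paper's method: `layer_redundancy` here is a hypothetical proxy (mean pairwise cosine similarity among flattened filters), whereas the paper derives its redundancy measure from graph-based covering numbers, and the within-layer filter ranking below is a plain l2-norm heuristic.

```python
import numpy as np

def layer_redundancy(filters):
    """Illustrative redundancy proxy: mean absolute pairwise cosine
    similarity among a layer's flattened filters (1.0 = all filters
    identical). Stand-in for the paper's covering-number measure.
    filters: array of shape (num_filters, in_channels, k, k)."""
    flat = filters.reshape(filters.shape[0], -1)
    unit = flat / np.maximum(np.linalg.norm(flat, axis=1, keepdims=True), 1e-12)
    sim = unit @ unit.T
    n = flat.shape[0]
    # average off-diagonal similarity (exclude each filter vs. itself)
    return (np.abs(sim).sum() - n) / (n * (n - 1))

def prune_most_redundant_layer(layers, n_prune):
    """Select the layer with the highest redundancy score and drop its
    n_prune smallest-l2-norm filters; other layers are left untouched."""
    scores = [layer_redundancy(f) for f in layers]
    target = int(np.argmax(scores))
    norms = np.linalg.norm(layers[target].reshape(layers[target].shape[0], -1), axis=1)
    keep = np.sort(np.argsort(norms)[n_prune:])  # indices of filters to retain
    pruned = list(layers)
    pruned[target] = layers[target][keep]
    return target, pruned
```

For example, given one layer of eight identical filters and one layer of eight random filters, the duplicate-heavy layer scores redundancy 1.0 and is the one pruned, while the diverse layer keeps all of its filters.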

Zi Wang, Chengcheng Li, Xiangyang Wang · 2021

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Image Classification | ImageNet-1k (val) | – | 1453 |
| Few-shot Image Generation | Sunglasses 10-shot | FID 55.97 | 36 |
| Few-shot Image Generation | Babies 10-shot | FID 101.6 | 35 |
| Few-shot Image Generation | AFHQ-Cat 10-shot | FID 64.68 | 34 |
| Few-shot Image Generation | AFHQ-Dog 10-shot | FID 151.5 | 34 |
| Few-shot Image Generation | AFHQ-Wild 10-shot | FID 81.3 | 34 |
| Few-shot Image Generation | MetFaces 10-shot | FID 76.81 | 34 |
| Few-shot Image Generation | Sketches 10-shot | FID 53.42 | 18 |
| Few-shot Image Generation | Sketches | intra-LPIPS 0.386 | 11 |
| Few-shot Image Generation | Babies | intra-LPIPS 0.517 | 11 |

Showing 10 of 11 rows.
