Stochastic Pooling for Regularization of Deep Convolutional Neural Networks
About
We introduce a simple and effective method for regularizing large convolutional neural networks. We replace the conventional deterministic pooling operations with a stochastic procedure, randomly picking the activation within each pooling region according to a multinomial distribution, given by the activities within the pooling region. The approach is hyper-parameter free and can be combined with other regularization approaches, such as dropout and data augmentation. We achieve state-of-the-art performance on four image datasets, relative to other approaches that do not utilize data augmentation.
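The training-time procedure described above can be sketched in a few lines: within each pooling region, the non-negative activations are normalized into a multinomial distribution, an index is sampled from it, and that activation becomes the pooled output. This is a minimal NumPy illustration, not the authors' implementation; the function name and all-zero-region handling are assumptions.

```python
import numpy as np

def stochastic_pool(region, rng=None):
    """Training-time stochastic pooling over one pooling region.

    `region` holds the activations of a single pooling region,
    assumed non-negative (e.g. after a ReLU). Each activation a_i
    is picked with probability p_i = a_i / sum_j a_j.
    """
    rng = rng or np.random.default_rng()
    a = np.asarray(region, dtype=float).ravel()
    total = a.sum()
    if total == 0.0:
        return 0.0                    # assumed convention: all-zero region pools to 0
    p = a / total                     # multinomial probabilities over the region
    idx = rng.choice(len(a), p=p)     # sample one location according to p
    return a[idx]
```

Because the selection is random during training, larger activations are favored but smaller ones can still be chosen, which is the source of the regularization effect.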
Matthew D. Zeiler, Rob Fergus • 2013
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Image Classification | CIFAR-100 (test) | -- | 3518 |
| Image Classification | CIFAR-10 (test) | -- | 3381 |
| Image Classification | CIFAR-10 (test) | Accuracy: 84.87% | 906 |
| Image Classification | MNIST (test) | -- | 882 |
| Image Classification | CIFAR-10 | Accuracy: 84.87% | 471 |
| Image Classification | SVHN (test) | -- | 362 |
| Classification | SVHN (test) | Error Rate: 2.8% | 182 |
| Image Classification | MNIST standard (test) | Error Rate: 0.47% | 40 |
| Image Classification | MNIST (train) | Train Error Rate: 33 | 37 |
| Image Classification | CIFAR-10 (train) | Error Rate: 3.4% | 35 |