
SGDR: Stochastic Gradient Descent with Warm Restarts

About

Restart techniques are common in gradient-free optimization to deal with multimodal functions. Partial warm restarts are also gaining popularity in gradient-based optimization to improve the rate of convergence in accelerated gradient schemes to deal with ill-conditioned functions. In this paper, we propose a simple warm restart technique for stochastic gradient descent to improve its anytime performance when training deep neural networks. We empirically study its performance on the CIFAR-10 and CIFAR-100 datasets, where we demonstrate new state-of-the-art results at 3.14% and 16.21%, respectively. We also demonstrate its advantages on a dataset of EEG recordings and on a downsampled version of the ImageNet dataset. Our source code is available at https://github.com/loshchil/SGDR
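The warm-restart schedule proposed in the paper anneals the learning rate with a cosine within each cycle and resets it to its maximum at the start of the next, with cycle lengths growing by a multiplier. A minimal sketch of that schedule (the helper name `sgdr_lr` and its keyword defaults are illustrative, not from the paper; `t0` and `t_mult` correspond to the paper's T_0 and T_mult):

```python
import math

def sgdr_lr(step, eta_max=0.1, eta_min=0.0, t0=10, t_mult=2):
    """Cosine-annealed learning rate with warm restarts (SGDR sketch).

    step   : global step (or epoch) counter
    t0     : length of the first cycle
    t_mult : factor by which each cycle grows after a restart
    """
    t_i, t_cur = t0, step
    # Subtract completed cycles to find the position within the current one.
    while t_cur >= t_i:
        t_cur -= t_i
        t_i *= t_mult
    # Cosine annealing from eta_max down to eta_min over the current cycle.
    return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * t_cur / t_i))
```

At `step = 0` this returns `eta_max`; at `step = t0` the first restart fires and the rate jumps back to `eta_max` while the next cycle runs for `t0 * t_mult` steps.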

Ilya Loshchilov, Frank Hutter • 2016

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Image Classification | CIFAR-10 (test) | Accuracy | 95.23 | 3381 |
| Machine Translation | WMT En-De 2014 (test) | BLEU | 27.35 | 379 |
| Question Answering | SQuAD v1.1 (test) | F1 Score | 88.61 | 260 |
| Image Classification | ImageNet (test) | -- | -- | 235 |
| Machine Translation | IWSLT De-En 2014 (test) | BLEU | 35.21 | 146 |
| Machine Translation | WMT En-De '14 | BLEU | 27.35 | 89 |
| Machine Translation | IWSLT14 DE-EN | BLEU Score | 35.21 | 22 |
| Machine Translation | IWSLT DE-EN '14 (train) | Training Perplexity | 3.08 | 12 |
| Machine Translation | IWSLT'14 DE-EN (val) | Validation PPL | 4.88 | 12 |
| Machine Translation | WMT En-De 2014 (test) | BLEU Score | 26.95 | 10 |

Showing 10 of 13 rows.
