Adam: A Method for Stochastic Optimization

About

We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments. The method is straightforward to implement, is computationally efficient, has low memory requirements, is invariant to diagonal rescaling of the gradients, and is well suited for problems that are large in terms of data and/or parameters. The method is also appropriate for non-stationary objectives and problems with very noisy and/or sparse gradients. The hyper-parameters have intuitive interpretations and typically require little tuning. Some connections to related algorithms, by which Adam was inspired, are discussed. We also analyze the theoretical convergence properties of the algorithm and provide a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework. Empirical results demonstrate that Adam works well in practice and compares favorably to other stochastic optimization methods. Finally, we discuss AdaMax, a variant of Adam based on the infinity norm.

Diederik P. Kingma, Jimmy Ba • 2014
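For reference, below is a minimal NumPy sketch of the Adam update described in the abstract, together with the infinity-norm variant AdaMax. The update rules and the default hyper-parameters (alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8 for Adam; alpha=0.002 for AdaMax) follow the paper, but the function and variable names here are illustrative rather than taken from its pseudocode.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update step; t is the 1-indexed timestep."""
    m = beta1 * m + (1.0 - beta1) * grad        # biased first moment estimate
    v = beta2 * v + (1.0 - beta2) * grad ** 2   # biased second raw moment estimate
    m_hat = m / (1.0 - beta1 ** t)              # bias-corrected first moment
    v_hat = v / (1.0 - beta2 ** t)              # bias-corrected second moment
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

def adamax_step(theta, grad, m, u, t, alpha=0.002, beta1=0.9, beta2=0.999):
    """One AdaMax step: the second moment is replaced by an infinity-norm estimate."""
    m = beta1 * m + (1.0 - beta1) * grad
    u = np.maximum(beta2 * u, np.abs(grad))     # exponentially weighted infinity norm
    theta = theta - (alpha / (1.0 - beta1 ** t)) * m / u
    return theta, m, u

# Toy usage: Adam steps on f(theta) = theta.dot(theta), whose gradient is 2 * theta.
theta = np.array([1.0, -2.0])
m, v = np.zeros_like(theta), np.zeros_like(theta)
for t in range(1, 501):
    theta, m, v = adam_step(theta, 2.0 * theta, m, v, t)
```

Both updates keep only first-order statistics of the gradients, which is why the method has low memory requirements and a per-step cost linear in the number of parameters.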

Related benchmarks

Task | Dataset | Result | Rank
Image Classification | CIFAR-100 (test) | Accuracy 78.26 | 3518
Image Classification | CIFAR-10 (test) | Accuracy 94.55 | 3381
Language Modeling | WikiText-2 (test) | PPL 12.633 | 1541
Image Classification | ImageNet (val) | -- | 1206
Automatic Speech Recognition | LibriSpeech (test-other) | WER 2.59 | 966
Node Classification | Cora | Accuracy 78.42 | 885
Node Classification | Citeseer (test) | Accuracy 0.717 | 729
Node Classification | Cora (test) | Mean Accuracy 84.72 | 687
Image Classification | CIFAR-100 (val) | Accuracy 76.88 | 661
Image Classification | CIFAR10 (test) | Accuracy 94.12 | 585
Showing 10 of 173 rows
...
