
FAdam: Adam is a natural gradient optimizer using diagonal empirical Fisher information

About

This paper establishes a mathematical foundation for the Adam optimizer, elucidating its connection to natural gradient descent through Riemannian and information geometry. We provide an accessible, detailed analysis of the diagonal empirical Fisher information matrix (FIM) in Adam, clarifying every approximation involved, and advocate for loss functions based on the log probability of discrete distributions, due to the limitations of the empirical FIM. Our analysis uncovers flaws in the original Adam algorithm, leading to proposed corrections: enhanced momentum calculations, adjusted bias corrections, adaptive epsilon, and gradient clipping. We also refine the weight decay term based on our theoretical framework. Our modified algorithm, Fisher Adam (FAdam), demonstrates superior performance across diverse domains including LLM, ASR, and VQ-VAE, achieving state-of-the-art results in ASR.
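The abstract lists the concrete changes FAdam makes to Adam: momentum taken over the FIM-preconditioned (natural) gradient, bias correction on the second moment, an epsilon that acts as an adaptive floor, clipping of the natural gradient, and weight decay preconditioned the same way. A minimal NumPy sketch of one such step follows; this is an illustrative simplification based only on the abstract, not the paper's exact algorithm, and all names and hyperparameter values here are assumptions.

```python
import numpy as np

def fadam_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999,
               p=0.5, eps=1e-8, clip=1.0, wd=0.01):
    """One illustrative FAdam-style step (simplified; see the paper for details).

    Per the abstract: momentum is an EMA of the *natural* gradient
    (the raw gradient preconditioned by a diagonal empirical FIM
    estimate), the natural gradient is clipped, and weight decay is
    preconditioned by the same FIM factor.
    """
    v = b2 * v + (1 - b2) * g * g          # diagonal empirical FIM: EMA of g^2
    v_hat = v / (1 - b2 ** t)              # bias-corrected second moment
    precond = v_hat ** p + eps             # FIM^p plus an epsilon floor
    g_bar = g / precond                    # natural gradient
    rms = np.sqrt(np.mean(g_bar ** 2))
    g_bar = g_bar / max(1.0, rms / clip)   # clip the natural gradient by RMS
    m = b1 * m + (1 - b1) * g_bar          # momentum of the natural gradient
    w = w - lr * (m + wd * w / precond)    # FIM-preconditioned weight decay
    return w, m, v
```

On a toy quadratic loss 0.5‖w‖², whose gradient is w itself, repeated steps drive the parameters toward zero, which is a quick sanity check that the preconditioning and clipping leave a usable descent direction.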

Dongseong Hwang • 2024

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Automatic Speech Recognition | LibriSpeech (test-other) | WER | 2.49 | 966 |
| Automatic Speech Recognition | LibriSpeech clean (test) | WER | 1.34 | 833 |
| Automatic Speech Recognition | LibriSpeech (dev-other) | WER | 2.49 | 411 |
| Speech Recognition | LibriSpeech (test) | WER | 0.0134 | 59 |
| Speech Recognition | LibriSpeech (dev) | WER | 1.29 | 21 |

Other info

Code