
Gradient Starvation: A Learning Proclivity in Neural Networks

About

We identify and formalize a fundamental gradient descent phenomenon resulting in a learning proclivity in over-parameterized neural networks. Gradient Starvation arises when cross-entropy loss is minimized by capturing only a subset of features relevant for the task, despite the presence of other predictive features that fail to be discovered. This work provides a theoretical explanation for the emergence of such feature imbalance in neural networks. Using tools from Dynamical Systems theory, we identify simple properties of learning dynamics during gradient descent that lead to this imbalance, and prove that such a situation can be expected given certain statistical structure in training data. Based on our proposed formalism, we develop guarantees for a novel regularization method aimed at decoupling feature learning dynamics, improving accuracy and robustness in cases hindered by gradient starvation. We illustrate our findings with simple and real-world out-of-distribution (OOD) generalization experiments.
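The regularization idea described above, decoupling feature learning dynamics by penalizing the network's outputs directly, can be sketched with a small NumPy example. This is a minimal illustration, not the paper's exact method: the function name `sd_loss` and the penalty strength `lam` are assumptions chosen for the sketch, which adds an L2 penalty on the logits to the usual cross-entropy.

```python
import numpy as np

def sd_loss(logits, targets, lam=0.1):
    """Cross-entropy plus an L2 penalty on the logits.

    A logit penalty of this form is one way to decouple feature
    learning dynamics: it discourages any single feature direction
    from dominating the logits. `lam` is an illustrative strength,
    not a value from the paper.
    """
    # numerically stable log-softmax
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    ce = -log_probs[np.arange(len(targets)), targets].mean()
    penalty = 0.5 * lam * (logits ** 2).mean()
    return ce + penalty

# usage: the penalty adds a non-negative term to the plain loss
rng = np.random.default_rng(0)
logits = rng.normal(size=(8, 3))
targets = np.array([0, 1, 2, 0, 1, 2, 0, 1])
plain = sd_loss(logits, targets, lam=0.0)
regularized = sd_loss(logits, targets, lam=0.1)
```

In practice such a penalty is applied to the network's pre-softmax outputs during training, so gradient descent trades off fitting the dominant feature against keeping logit magnitudes small, leaving room for weaker predictive features to be learned.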

Mohammad Pezeshki, Sékou-Oumar Kaba, Yoshua Bengio, Aaron Courville, Doina Precup, Guillaume Lajoie • 2020

Related benchmarks

Task                   Dataset                             Result                      Rank
Semantic segmentation  Cityscapes                          mIoU 34.77                  578
Domain Generalization  PACS (test)                         Average Accuracy 64.3       225
Semantic segmentation  BDD100K                             mIoU 28                     78
Semantic segmentation  Mapillary                           mIoU 31.41                  75
Image Classification   CMNIST (test)                       Test Accuracy 70.3          55
Image Classification   OfficeHome DomainBed suite (test)   Accuracy 62.9               45
Domain Generalization  DomainNet DomainBed (test)          Clipart Accuracy 51.3       37
Image Classification   DomainBed                           PACS Accuracy 84.4          33
Domain Generalization  PACS DomainBed (test)               --                          29
Domain Generalization  VLCS DomainBed (test)               Average OOD Accuracy 75.5   27

(Showing 10 of 18 rows)
