
DrNAS: Dirichlet Neural Architecture Search

About

This paper proposes a novel differentiable architecture search method by formulating it as a distribution learning problem. We treat the continuously relaxed architecture mixing weights as random variables modeled by a Dirichlet distribution. With recently developed pathwise derivatives, the Dirichlet parameters can be optimized with gradient-based optimizers in an end-to-end manner. This formulation improves generalization and induces stochasticity that naturally encourages exploration of the search space. Furthermore, to alleviate the large memory consumption of differentiable NAS, we propose a simple yet effective progressive learning scheme that enables searching directly on large-scale tasks, eliminating the gap between the search and evaluation phases. Extensive experiments demonstrate the effectiveness of our method: we obtain a test error of 2.46% on CIFAR-10 and a top-1 error of 23.7% on ImageNet under the mobile setting. On NAS-Bench-201, we also achieve state-of-the-art results on all three datasets and provide insights for the effective design of neural architecture search algorithms.
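The core idea — sampling architecture mixing weights from a Dirichlet distribution and backpropagating into its concentration parameters via pathwise derivatives — can be sketched in a few lines of PyTorch, whose `Dirichlet` distribution supports reparameterized sampling through `rsample()`. This is a minimal illustration, not the paper's implementation: the variable names (`log_alpha`, `op_outputs`) and the toy surrogate loss are our assumptions.

```python
import torch
from torch.distributions import Dirichlet

# Hypothetical setup: 4 candidate operations on one edge of a cell.
num_ops = 4

# Dirichlet concentration parameters, learned by gradient descent
# (parameterized in log space so concentrations stay positive).
log_alpha = torch.zeros(num_ops, requires_grad=True)

# Sample simplex-valued mixing weights with a pathwise (reparameterized)
# gradient, so the gradient of the loss flows back into log_alpha.
dist = Dirichlet(log_alpha.exp())
mix = dist.rsample()  # non-negative weights summing to 1

# Toy surrogate loss standing in for the supernet training loss:
# each op emits a fixed scalar, and the mixed output should match a target.
op_outputs = torch.tensor([1.0, 2.0, 3.0, 4.0])
loss = ((mix * op_outputs).sum() - 2.5) ** 2
loss.backward()  # log_alpha.grad is now populated via the pathwise derivative
```

Because sampling is stochastic, different architectures are explored across training steps, while the gradient through `rsample()` still lets the concentration parameters be optimized end-to-end.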

Xiangning Chen, Ruochen Wang, Minhao Cheng, Xiaocheng Tang, Cho-Jui Hsieh • 2020

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Image Classification | CIFAR-100 (test) | -- | 3518 |
| Image Classification | CIFAR-10 (test) | -- | 3381 |
| Image Classification | CIFAR-10 (test) | Accuracy (Clean): 94.18 | 273 |
| Image Classification | ImageNet (test) | Top-1 Acc: 75.8 | 235 |
| Image Classification | ImageNet (val) | -- | 188 |
| Image Classification | CIFAR-10 NAS-Bench-201 (test) | Accuracy: 94.36 | 173 |
| Image Classification | CIFAR-100 NAS-Bench-201 (test) | Accuracy: 73.51 | 169 |
| Image Classification | ImageNet Mobile Setting (test) | Top-1 Error: 23.7 | 165 |
| Image Classification | CIFAR-10 (test) | Test Error Rate: 5.64 | 151 |
| Image Classification | ImageNet-16-120 NAS-Bench-201 (test) | Accuracy: 46.34 | 139 |
Showing 10 of 30 rows
