
Searching for a Robust Neural Architecture in Four GPU Hours

About

Conventional neural architecture search (NAS) approaches are based on reinforcement learning or evolutionary strategies, which take more than 3000 GPU hours to find a good model on CIFAR-10. We propose an efficient NAS approach that learns to search by gradient descent. Our approach represents the search space as a directed acyclic graph (DAG). This DAG contains billions of sub-graphs, each of which indicates a kind of neural architecture. To avoid traversing all possible sub-graphs, we develop a differentiable sampler over the DAG. This sampler is learnable and is optimized by the validation loss after training the sampled architecture. In this way, our approach can be trained in an end-to-end fashion by gradient descent, and we name it Gradient-based search using Differentiable Architecture Sampler (GDAS). In experiments, one search procedure finishes in four GPU hours on CIFAR-10, and the discovered model obtains a test error of 2.82% with only 2.5M parameters, which is on par with the state of the art. Code is publicly available on GitHub: https://github.com/D-X-Y/NAS-Projects.
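The key idea in the abstract is a differentiable sampler that picks one sub-graph (architecture) from the DAG while remaining trainable by gradient descent. A common way to realize such a sampler is a Gumbel-softmax relaxation over the candidate operations on each DAG edge. The sketch below is a minimal, hypothetical illustration of that mechanism in NumPy (not the authors' implementation; the operation names and the `tau` temperature are illustrative):

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Return a relaxed (soft) one-hot vector over candidate operations.

    Adding Gumbel(0, 1) noise to the logits and applying a softmax gives a
    differentiable approximation of sampling one discrete choice, which is
    how a learnable architecture sampler can be optimized by gradient
    descent on the validation loss.
    """
    rng = rng or np.random.default_rng()
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1)
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau
    y = np.exp(y - y.max())  # numerically stable softmax
    return y / y.sum()

# Hypothetical DAG edge with 4 candidate operations
# (e.g. conv3x3, conv5x5, max-pool, identity);
# the logits are the sampler's learnable parameters.
logits = np.array([2.0, 0.5, 0.1, -1.0])
probs = gumbel_softmax(logits, tau=0.5, rng=np.random.default_rng(0))
chosen = int(np.argmax(probs))  # hard (discrete) choice in the forward pass
```

In a full search, the soft `probs` vector would weight the operations during backpropagation while the `argmax` selects a single operation to execute, keeping the cost of evaluating one sampled architecture low.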

Xuanyi Dong, Yi Yang • 2019

Related benchmarks

Task                     | Dataset            | Result               | Rank
-------------------------|--------------------|----------------------|-----
Image Classification     | CIFAR-100 (test)   | Accuracy 81.62       | 3518
Image Classification     | CIFAR-10 (test)    | Accuracy 97.07       | 3381
Language Modeling        | WikiText-2 (test)  | PPL 69.4             | 1541
Image Classification     | ImageNet-1k (val)  | Top-1 Accuracy 74    | 1453
Person Re-Identification | Market1501 (test)  | Rank-1 Accuracy 89.1 | 1264
Image Classification     | CIFAR-10 (test)    | Accuracy 97.07       | 906
Image Classification     | ImageNet-1k (test) | --                   | 798
Image Classification     | CIFAR-100 (val)    | Accuracy 71.34       | 661
Image Classification     | CIFAR-100          | --                   | 622
Image Classification     | CIFAR-10           | Accuracy 97.11       | 471
Showing 10 of 55 rows

Other info

Code: https://github.com/D-X-Y/NAS-Projects