When NAS Meets Trees: An Efficient Algorithm for Neural Architecture Search
About
The key challenge in neural architecture search (NAS) is how to explore the huge search space wisely. We propose a new NAS method called TNAS (NAS with trees), which improves search efficiency by exploring only a small number of architectures while also achieving higher search accuracy. TNAS introduces an architecture tree and a binary operation tree to factorize the search space and substantially reduce the exploration size. It then performs a modified bi-level breadth-first search over the proposed trees to discover a high-performance architecture. Impressively, TNAS finds the globally optimal architecture on CIFAR-10 in NAS-Bench-201, with a test accuracy of 94.37%, in four GPU hours. Its average test accuracy is 94.35%, which outperforms the state of the art. Code is available at: https://github.com/guochengqian/TNAS.
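The bi-level breadth-first search can be illustrated with a small, hypothetical sketch (not the authors' implementation): a binary operation tree recursively halves each edge's candidate operation set, and a breadth-first pass keeps, for each edge, whichever half scores better under a placeholder `evaluate` function. The architecture-tree factorization and the real weight-sharing evaluation from the paper are simplified away here; the operation set and edge count follow NAS-Bench-201 conventions as an assumption.

```python
# Hypothetical sketch of a bi-level BFS over a binary operation tree.
# `evaluate` is a stand-in for the coarse-architecture training/validation
# used by TNAS; it only makes the sketch runnable end to end.

OPS = ["none", "skip_connect", "nor_conv_1x1", "nor_conv_3x3", "avg_pool_3x3"]
NUM_EDGES = 6  # edges in a NAS-Bench-201 cell


def split(ops):
    """One level of a binary operation tree: split a candidate set in half."""
    mid = (len(ops) + 1) // 2
    return ops[:mid], ops[mid:]


def evaluate(candidate_sets):
    """Placeholder score for a coarse architecture (one op subset per edge).
    In the real method this would come from a trained weight-sharing model."""
    return -sum(len(s) for s in candidate_sets)


def bilevel_bfs():
    # Root of every edge's operation tree: the full operation set.
    edge_sets = [list(OPS) for _ in range(NUM_EDGES)]
    # Breadth-first: refine every edge by one tree level before going deeper.
    while any(len(s) > 1 for s in edge_sets):
        for e, ops in enumerate(edge_sets):
            if len(ops) == 1:
                continue
            left, right = split(ops)
            # Keep the half that scores better when plugged into the cell.
            edge_sets[e] = max(
                (left, right),
                key=lambda half: evaluate(edge_sets[:e] + [half] + edge_sets[e + 1:]),
            )
    return [s[0] for s in edge_sets]


if __name__ == "__main__":
    print(bilevel_bfs())
```

Because each BFS level halves every edge's candidate set, the number of architectures actually evaluated grows with the tree depth (logarithmic in the operation-set size) rather than with the full combinatorial search space, which is the source of the efficiency gain described above.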
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Image Classification | CIFAR-10 (NAS-Bench-201, test) | Accuracy: 94.37% | 173 |
| Image Classification | CIFAR-100 (NAS-Bench-201, test) | Accuracy: 73.09% | 169 |
| Image Classification | ImageNet-16-120 (NAS-Bench-201, test) | Accuracy: 46.33% | 139 |