
Generic Neural Architecture Search via Regression

About

Most existing neural architecture search (NAS) algorithms are dedicated to, and evaluated by, specific downstream tasks, e.g., image classification in computer vision. However, extensive experiments have shown that prominent neural architectures, such as ResNet in computer vision and LSTM in natural language processing, are generally good at extracting patterns from the input data and perform well across different downstream tasks. In this paper, we attempt to answer two fundamental questions related to NAS. (1) Is it necessary to use the performance of specific downstream tasks to evaluate and search for good neural architectures? (2) Can we perform NAS effectively and efficiently while remaining agnostic to the downstream tasks? To answer these questions, we propose a novel and generic NAS framework, termed Generic NAS (GenNAS). GenNAS does not use task-specific labels but instead adopts regression on a set of manually designed synthetic signal bases for architecture evaluation. Such a self-supervised regression task can effectively evaluate the intrinsic power of an architecture to capture and transform the input signal patterns, and allows fuller usage of the training samples. Extensive experiments across 13 CNN search spaces and one NLP space demonstrate the remarkable efficiency of GenNAS using regression, in terms of both evaluating neural architectures (quantified by the ranking correlation, Spearman's rho, between the approximated performances and the downstream task performances) and the convergence speed for training (within a few seconds).
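The evaluation metric the abstract mentions, Spearman's rho, measures how well a proxy (here, the self-supervised regression loss) ranks candidate architectures against their true downstream accuracies. A minimal sketch of that computation, using hypothetical proxy losses and accuracies for five candidate architectures (the numbers are illustrative, not from the paper):

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman's rank correlation: the Pearson correlation of the ranks."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Hypothetical proxy scores from the regression task (lower loss = better
# architecture) and ground-truth downstream accuracies for 5 candidates.
proxy_losses = np.array([0.9, 0.4, 0.7, 0.2, 0.5])
accuracies   = np.array([70.1, 88.3, 80.5, 93.2, 85.0])

# Negate losses so that a higher proxy score corresponds to higher accuracy.
rho = spearman_rho(-proxy_losses, accuracies)
print(rho)  # 1.0 here: the proxy ranks these candidates perfectly
```

A rho close to 1 means the cheap regression proxy can stand in for full downstream training when ranking architectures, which is the core claim GenNAS is evaluated on.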

Yuhong Li, Cong Hao, Pan Li, Jinjun Xiong, Deming Chen• 2021

Related benchmarks

Task | Dataset | Result | Rank
Image Classification | ImageNet Mobile Setting (test) | Top-1 Error: 24.3 | 165
Neural Architecture Search | NAS-Bench-201 ImageNet-16-120 (test) | -- | 86
Neural Architecture Search | NAS-Bench-201 CIFAR-10 | Retrieval Rate @ Top 10%: 71 | 85
Neural Architecture Search | NAS-Bench-201 CIFAR-10 (test) | -- | 85
Neural Architecture Search | NAS-Bench-201 CIFAR-100 (test) | -- | 78
Neural Architecture Search | Neural Design Spaces ImageNet | Retrieval Rate @ Top 10%: 75 | 56
Neural Architecture Search | NAS-Bench-101 CIFAR-10 (test) | -- | 18
Neural Architecture Search | NAS-Bench-201 CIFAR-100 | Retrieval Rate @ Top 10%: 58 | 8
Neural Architecture Search | NAS-Bench-201 ImageNet-16-120 | Retrieval Rate @ Top 10%: 51 | 8
Neural Architecture Search (Ranking Correlation) | NAS-Bench-201 CIFAR-10 | Kendall's Tau: 0.71 | 8

Showing 10 of 23 rows

Other info

Code
