
EZNAS: Evolving Zero Cost Proxies For Neural Architecture Scoring

About

Neural Architecture Search (NAS) has significantly improved productivity in the design and deployment of neural networks (NN). As NAS typically evaluates multiple models by training them partially or completely, this improved productivity comes at the cost of a significant carbon footprint. To alleviate this expensive training routine, zero-shot/zero-cost proxies analyze an NN at initialization to generate a score, which correlates highly with its true accuracy. Zero-cost proxies are currently designed by experts conducting multiple cycles of empirical testing on candidate algorithms, datasets, and neural architecture design spaces. This experimentation lowers productivity and is an unsustainable approach to zero-cost proxy design as deep learning use-cases diversify. Additionally, existing zero-cost proxies fail to generalize across neural architecture design spaces. In this paper, we propose a genetic programming framework to automate the discovery of zero-cost proxies for neural architecture scoring. Our methodology efficiently discovers an interpretable and generalizable zero-cost proxy that gives state-of-the-art score-accuracy correlation on all datasets and search spaces of NAS-Bench-201 and Network Design Spaces (NDS). We believe this research indicates a promising direction toward automatically discovering zero-cost proxies that can work across network architecture design spaces, datasets, and tasks.
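To make the evaluation protocol concrete, the sketch below scores a set of toy "architectures" at initialization with an illustrative proxy and measures the score-accuracy rank agreement via Spearman correlation (the metric in the benchmark table below). Everything here is an assumption for illustration: `toy_proxy` (product of per-layer mean |weight|) is a hypothetical stand-in, not the proxy EZNAS actually evolves, and the networks and accuracies are made up.

```python
import random

def toy_proxy(weights):
    """Hypothetical zero-cost proxy: product of per-layer mean |w| at init.
    Stand-in for illustration only; not the EZNAS-discovered proxy."""
    score = 1.0
    for layer in weights:
        score *= sum(abs(w) for w in layer) / len(layer)
    return score

def spearman(a, b):
    """Spearman rank correlation between two score lists (no tie handling)."""
    def ranks(xs):
        order = sorted(range(len(xs)), key=lambda i: xs[i])
        r = [0.0] * len(xs)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    ra, rb = ranks(a), ranks(b)
    n = len(a)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    va = sum((x - ma) ** 2 for x in ra) ** 0.5
    vb = sum((y - mb) ** 2 for y in rb) ** 0.5
    return cov / (va * vb)

rng = random.Random(0)
# Five toy two-layer networks with random (untrained) weights.
archs = [
    [[rng.gauss(0, 1) for _ in range(16 * h)],
     [rng.gauss(0, 1) for _ in range(h * 10)]]
    for h in (8, 16, 32, 64, 128)
]
scores = [toy_proxy(w) for w in archs]
accuracies = [0.60, 0.65, 0.71, 0.74, 0.80]  # made-up "true" accuracies
print(round(spearman(scores, accuracies), 2))
```

A real evaluation would replace `toy_proxy` with the candidate proxy program and the toy networks with architectures from a benchmark such as NAS-Bench-201, correlating proxy scores against the benchmark's trained accuracies.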

Yash Akhauri, J. Pablo Munoz, Nilesh Jain, Ravi Iyer • 2022

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Image Classification | CIFAR-10, NAS-Bench-201 (test) | Accuracy | 93.63 | 173 |
| Image Classification | CIFAR-100, NAS-Bench-201 (test) | Accuracy | 69.82 | 169 |
| Image Classification | ImageNet-16-120, NAS-Bench-201 (test) | Accuracy | 43.47 | 139 |
| Architecture Performance Prediction | NATS-Bench SSS CIFAR-10 (stratified subset) | Mean Abs. Kendall Corr. Coeff. (mAKCC) | 82.8 | 24 |
| Architecture Performance Prediction | NATS-Bench SSS ImageNet16-120 (stratified subset) | Mean Abs. Kendall Corr. Coeff. (mAKCC) | 70.7 | 24 |
| Architecture Performance Prediction | NATS-Bench SSS CIFAR-100 (stratified subset) | Mean Abs. Kendall Corr. Coeff. (mAKCC) | 64.4 | 24 |
| Neural Architecture Search | NAS-Bench-201 CIFAR-10 | Spearman Correlation | 0.83 | 13 |
| Neural Architecture Search | NAS-Bench-201 CIFAR-100 | Spearman Correlation | 0.82 | 13 |
| Neural Architecture Search | NDS ENAS | Spearman Correlation | 0.63 | 13 |
| Neural Architecture Search | NAS-Bench-201 ImageNet-16-120 | Spearman Correlation | 0.78 | 13 |

Showing 10 of 41 rows.
