
Fine-Tuning DARTS for Image Classification

About

Neural Architecture Search (NAS) has gained attention due to its superior classification performance. Differentiable Architecture Search (DARTS) is a computationally light NAS method. To limit computational resources, DARTS makes numerous approximations, which degrade its performance. We propose to fine-tune DARTS using fixed operations, as these are independent of the approximations. Our method offers a good trade-off between the number of parameters and classification accuracy. It improves top-1 accuracy on the Fashion-MNIST, CompCars, and MIO-TCD datasets by 0.56%, 0.50%, and 0.39%, respectively, compared to state-of-the-art approaches. It also outperforms DARTS itself, improving accuracy by 0.28%, 1.64%, 0.34%, 4.5%, and 3.27% on the CIFAR-10, CIFAR-100, Fashion-MNIST, CompCars, and MIO-TCD datasets, respectively.
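To make the idea of "fixed operations" concrete, here is a minimal, illustrative sketch (not the authors' code) of the two DARTS mechanisms the abstract refers to: the continuous relaxation that mixes candidate operations on an edge via a softmax over architecture parameters, and the discretization step that fixes each edge to its strongest operation. The operation set, `mixed_op`, and `discretize` names are assumptions for illustration; real candidate operations are convolutions, pooling, and skip connections rather than the toy functions used here.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Toy stand-ins for DARTS candidate operations on one edge
# (in practice: separable convs, dilated convs, pooling, identity, zero).
OPS = {
    "skip":   lambda x: x,          # identity connection
    "double": lambda x: 2.0 * x,    # placeholder for a learned op
    "zero":   lambda x: 0.0 * x,    # the "none" operation
}

def mixed_op(x, alphas, ops=OPS):
    """Continuous relaxation: softmax-weighted sum of all candidate ops.

    During the search phase, the architecture parameters `alphas` are
    optimized jointly with the network weights (this is where DARTS's
    approximations enter).
    """
    w = softmax(alphas)
    return sum(wi * op(x) for wi, op in zip(w, ops.values()))

def discretize(alphas, ops=OPS):
    """After the search, fix the edge to its strongest operation (argmax).

    Fine-tuning with these fixed operations no longer depends on the
    search-phase approximations.
    """
    names = list(ops.keys())
    return names[int(np.argmax(alphas))]
```

For example, with `alphas = np.array([0.1, 2.0, -1.0])`, `discretize` fixes the edge to `"double"`, and subsequent fine-tuning would train only the network weights with that choice frozen.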

Muhammad Suhaib Tanveer, Muhammad Umar Karim Khan, Chong-Min Kyung • 2020

Related benchmarks

Task                             | Dataset                      | Result          | Rank
Image Classification             | CIFAR-100                    | –               | 622
Image Classification             | CIFAR-10                     | Accuracy 97.52  | 471
Image Classification             | Fashion MNIST                | Accuracy 96.91  | 225
Image Classification             | FashionMNIST                 | Accuracy 96.91  | 147
Image Classification             | CompCars Web (test)          | Top-1 Acc 95.9  | 33
Image Classification             | CompCars Surveillance (test) | Top-1 Acc 99.2  | 14
Fine-Grained Vehicle Recognition | CompCars                     | Accuracy 95.9   | 11
Image Classification             | Fashion MNIST 66 (test)      | Accuracy 96.91  | 8
Image Classification             | MIO-TCD (test)               | Top-1 Acc 98.34 | 8
