
Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization

About

Performance of machine learning algorithms depends critically on identifying a good set of hyperparameters. While recent approaches use Bayesian optimization to adaptively select configurations, we focus on speeding up random search through adaptive resource allocation and early-stopping. We formulate hyperparameter optimization as a pure-exploration non-stochastic infinite-armed bandit problem where a predefined resource like iterations, data samples, or features is allocated to randomly sampled configurations. We introduce a novel algorithm, Hyperband, for this framework and analyze its theoretical properties, providing several desirable guarantees. Furthermore, we compare Hyperband with popular Bayesian optimization methods on a suite of hyperparameter optimization problems. We observe that Hyperband can provide over an order-of-magnitude speedup over our competitor set on a variety of deep-learning and kernel-based learning problems.
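The adaptive resource allocation described above can be sketched as follows. This is a minimal illustration of the Hyperband scheme (brackets of successive halving with varying early-stopping aggressiveness), not the authors' reference implementation; `get_config` (a random configuration sampler) and `run_config` (a routine returning the loss after training with a given resource budget) are hypothetical placeholders the caller must supply.

```python
import math
import random

def hyperband(get_config, run_config, max_resource=81, eta=3):
    """Sketch of Hyperband.

    get_config()            -> a randomly sampled hyperparameter configuration
    run_config(config, r)   -> validation loss after training with r resource units
    max_resource            -> maximum resource allocated to any one configuration
    eta                     -> downsampling rate of successive halving
    """
    s_max = int(math.log(max_resource) / math.log(eta))
    budget = (s_max + 1) * max_resource  # per-bracket budget
    best = (float("inf"), None)          # (loss, config)

    # Outer loop: one successive-halving bracket per early-stopping rate s.
    for s in reversed(range(s_max + 1)):
        n = int(math.ceil(budget / max_resource * eta**s / (s + 1)))
        r = max_resource * eta ** (-s)   # initial resource per configuration
        configs = [get_config() for _ in range(n)]

        # Inner loop: successive halving, keeping the top 1/eta each round.
        for i in range(s + 1):
            n_i = int(n * eta ** (-i))
            r_i = r * eta**i
            losses = [run_config(c, r_i) for c in configs]
            ranked = sorted(zip(losses, configs), key=lambda t: t[0])
            best = min(best, ranked[0], key=lambda t: t[0])
            configs = [c for _, c in ranked[: max(1, int(n_i / eta))]]
    return best
```

Aggressive brackets (large `s`) try many configurations on small budgets; conservative brackets (small `s`) give a few configurations the full budget, hedging against problems where early performance is uninformative.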

Lisha Li, Kevin Jamieson, Giulia DeSalvo, Afshin Rostamizadeh, Ameet Talwalkar • 2016

Related benchmarks

Task | Dataset | Result | Rank
Image Classification | CIFAR-10 | – | 471
Hyperparameter Optimization | JAHS-C10 Bench (val) | Validation Error 10.077 | 52
Hyperparameter Optimization | JAHS-Bench CH (val) | Validation Error 5.704 | 31
Hyperparameter Optimization | JAHS-Bench FM (val) | Validation Error 5.306 | 28
Federated Image Classification | FEMNIST (i.i.d.) | Test Error Rate 14.79 | 8
Image Classification | FEMNIST (i.i.d.) | Test Error 14.79 | 8
Federated Image Classification | FEMNIST (non-i.i.d.) | Test Error 14.78 | 8
Federated Image Classification | CIFAR-10 (i.i.d.) | Test Error 21.62 | 8
Image Classification | FEMNIST (non-i.i.d.) | Test Error Rate 14.78 | 8
Federated Character Prediction | Shakespeare (i.i.d.) | Test Error 46.77 | 8

Showing 10 of 13 rows.
