
HEBO: Pushing The Limits of Sample-Efficient Hyperparameter Optimisation

About

In this work we rigorously analyse assumptions inherent to black-box optimisation hyper-parameter tuning tasks. Our results on the Bayesmark benchmark indicate that heteroscedasticity and non-stationarity pose significant challenges for black-box optimisers. Based on these findings, we propose a Heteroscedastic and Evolutionary Bayesian Optimisation solver (HEBO). HEBO performs non-linear input and output warping, admits exact marginal log-likelihood optimisation and is robust to the values of learned parameters. We demonstrate HEBO's empirical efficacy on the NeurIPS 2020 Black-Box Optimisation challenge, where HEBO placed first. Upon further analysis, we observe that HEBO significantly outperforms existing black-box optimisers on 108 machine learning hyperparameter tuning tasks comprising the Bayesmark benchmark. Our findings indicate that the majority of hyper-parameter tuning tasks exhibit heteroscedasticity and non-stationarity, multi-objective acquisition ensembles with Pareto front solutions improve queried configurations, and robust acquisition maximisers afford empirical advantages relative to their non-robust counterparts. We hope these findings may serve as guiding principles for practitioners of Bayesian optimisation. All code is made available at https://github.com/huawei-noah/HEBO.
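The heteroscedasticity finding can be illustrated with a minimal numpy sketch (hypothetical data, not HEBO's actual code): a log output warp, a limiting case of the power-transform family HEBO learns, stabilises noise whose scale grows with the objective value.

```python
import numpy as np

# Hypothetical illustration of output warping for heteroscedastic objectives.
# With multiplicative noise, the raw noise scale grows with the mean value,
# violating the constant-noise assumption of a standard GP surrogate; a log
# warp (the Box-Cox transform as lambda -> 0) makes it constant again.
rng = np.random.default_rng(0)
eps = rng.normal(0.0, 0.3, size=10_000)  # shared relative-noise draws

y_small = 1.0   * np.exp(eps)  # objective observations near a small mean
y_large = 100.0 * np.exp(eps)  # the same relative noise at a large mean

raw_ratio  = np.std(y_large) / np.std(y_small)                  # ~100: heteroscedastic
warp_ratio = np.std(np.log(y_large)) / np.std(np.log(y_small))  # ~1: stabilised

print(f"raw noise ratio:    {raw_ratio:.1f}")
print(f"warped noise ratio: {warp_ratio:.1f}")
```

After the warp, residual noise has the same scale in both regimes, so a stationary, homoscedastic surrogate fits the warped observations far better than the raw ones.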

Alexander I. Cowen-Rivers, Wenlong Lyu, Rasul Tutunov, Zhi Wang, Antoine Grosnit, Ryan Rhys Griffiths, Alexandre Max Maraval, Hao Jianye, Jun Wang, Jan Peters, Haitham Bou Ammar • 2020

Related benchmarks

Task                                 Dataset                                     Metric                  Result   Rank
Model and Hyperparameter Selection   Kaggle Allstate Private (test)              p-rank                  64.6     12
Hyperparameter Optimization          108 black-box functions (test)              Mean                    100.1    10
Stochastic Lipschitz Optimization    Branin                                      Simple Regret           0.016    10
Stochastic Lipschitz Optimization    Rosenbrock                                  Simple Regret           0.037    10
Stochastic Lipschitz Optimization    SVM                                         Simple Regret           0.031    10
Hyperparameter Optimization          108 hyperparameter tuning tasks (summary)   Number of Best Tasks    71       9
Stochastic Lipschitz Optimization    Ackley                                      Simple Regret           0.094    9
Stochastic Lipschitz Optimization    Levy                                        Simple Regret           0.041    9
Stochastic Lipschitz Optimization    Needle                                      Simple Regret           0.019    9
Stochastic Lipschitz Optimization    Hartmann                                    Simple Regret (x10^-2)  0.033    9

(Showing 10 of 57 rows.)

Other info

Code: https://github.com/huawei-noah/HEBO

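The abstract above also credits multi-objective acquisition ensembles whose queries are drawn from a Pareto front. A minimal sketch of the underlying non-dominance computation (hypothetical candidate scores, not HEBO's implementation):

```python
import numpy as np

def pareto_front(scores: np.ndarray) -> np.ndarray:
    """Boolean mask of non-dominated rows.

    scores[i, j] is the value of acquisition function j at candidate i;
    all acquisition functions are to be maximised."""
    n = scores.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        if not mask[i]:
            continue
        # Candidate i is dominated if some other candidate is at least as
        # good on every acquisition and strictly better on at least one.
        dominated = (np.all(scores >= scores[i], axis=1)
                     & np.any(scores > scores[i], axis=1))
        if dominated.any():
            mask[i] = False
    return mask

# Toy example: two acquisition scores (e.g. EI and PI) for five candidates.
scores = np.array([[1.0, 0.2],
                   [0.8, 0.9],
                   [0.5, 0.5],   # dominated by [0.8, 0.9]
                   [0.2, 1.0],
                   [0.1, 0.1]])  # dominated by every other candidate
front = pareto_front(scores)
print(front)  # candidates 0, 1 and 3 are non-dominated
```

Querying configurations from this front, rather than from a single acquisition's argmax, hedges against any one acquisition function being miscalibrated for the task at hand.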