Increasing the Scope as You Learn: Adaptive Bayesian Optimization in Nested Subspaces
About
Recent advances have extended the scope of Bayesian optimization (BO) to expensive-to-evaluate black-box functions with dozens of dimensions, aspiring to unlock impactful applications, for example, in the life sciences, neural architecture search, and robotics. However, a closer examination reveals that the state-of-the-art methods for high-dimensional Bayesian optimization (HDBO) suffer from degrading performance as the number of dimensions increases, or even risk failure if certain unverifiable assumptions are not met. This paper proposes BAxUS, which leverages a novel family of nested random subspaces to adapt the space it optimizes over to the problem. This ensures high performance while removing the risk of failure, which we support with theoretical guarantees. A comprehensive evaluation demonstrates that BAxUS achieves better results than the state-of-the-art methods on a broad set of applications.
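To make the idea of nested random subspaces concrete, the sketch below shows one way such embeddings can be constructed: a sparse sign-embedding maps a low-dimensional search space into the high-dimensional input space, and the subspace is "grown" by splitting each target dimension in two while preserving all existing assignments, so every point in the old subspace remains reachable in the new one. This is an illustrative sketch under our own assumptions (function names, the use of NumPy, and the splitting rule are ours), not the authors' implementation.

```python
import numpy as np

def random_embedding(d_high, d_low, rng):
    """Sparse sign embedding: each high-dim input coordinate is
    assigned to exactly one low-dim target coordinate with a
    random sign. Returns S of shape (d_low, d_high); a low-dim
    point x maps to the high-dim point S.T @ x."""
    target = rng.integers(0, d_low, size=d_high)
    sign = rng.choice([-1.0, 1.0], size=d_high)
    S = np.zeros((d_low, d_high))
    S[target, np.arange(d_high)] = sign
    return S

def grow_embedding(S, rng):
    """Double the subspace dimension by splitting each target
    dimension j into (j, d_low + j). Existing sign assignments are
    kept, so the old subspace is nested in the new one: duplicating
    a low-dim point's coordinates reproduces the same high-dim point."""
    d_low, d_high = S.shape
    S_new = np.zeros((2 * d_low, d_high))
    for j in range(d_low):
        idx = np.flatnonzero(S[j])
        rng.shuffle(idx)
        half = len(idx) // 2
        # Half of the input dims stay with j, the rest move to d_low + j.
        S_new[j, idx[half:]] = S[j, idx[half:]]
        S_new[d_low + j, idx[:half]] = S[j, idx[:half]]
    return S_new
```

The nesting property can be checked directly: for any low-dimensional point `x`, the grown embedding applied to `[x, x]` (coordinates duplicated) yields the same high-dimensional point as the original embedding applied to `x`, so observations gathered in the smaller subspace remain valid after the expansion.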
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Black-box Optimization | Ackley d=20 | Median Best Objective Value | 0.36 | 9 |
| Black-box Optimization | Griewank d=20 | Median Objective Value | 0.75 | 9 |
| Black-box Optimization | Sphere d=20 | Median Objective Value | 1.92 | 9 |
| Simulator Benchmark Optimization | Swimmer | Median Performance | -278.3 | 9 |
| Simulator Benchmark Optimization | Rover Trajectory | Median Performance | -1.83 | 9 |
| Simulator Benchmark Optimization | Lunar Lander | Median Performance | -301.9 | 9 |
| Simulator Benchmark Optimization | Robot Pushing | Median Performance | -3.86 | 9 |
| High-dimensional optimization | MOPTA08 124D | Objective Value | -240.8 | 6 |
| High-dimensional optimization | Rover 60D | Objective Value | 0.67 | 6 |
| High-dimensional optimization | SVM 388D | Objective Value | -0.1 | 6 |