
Increasing the Scope as You Learn: Adaptive Bayesian Optimization in Nested Subspaces

About

Recent advances have extended the scope of Bayesian optimization (BO) to expensive-to-evaluate black-box functions with dozens of dimensions, aspiring to unlock impactful applications, for example, in the life sciences, neural architecture search, and robotics. However, a closer examination reveals that the state-of-the-art methods for high-dimensional Bayesian optimization (HDBO) suffer from degrading performance as the number of dimensions increases, or even risk failure if certain unverifiable assumptions are not met. This paper proposes BAxUS, which leverages a novel family of nested random subspaces to adapt the space it optimizes over to the problem. This ensures high performance while removing the risk of failure, which the authors establish via theoretical guarantees. A comprehensive evaluation demonstrates that BAxUS achieves better results than the state-of-the-art methods across a broad set of applications.
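The key idea behind the nested subspaces can be illustrated with a minimal sketch. The snippet below is a hypothetical simplification, not the authors' implementation: it uses a sparse sign-embedding (each input dimension is hashed to one low-dimensional "bin" with a random sign, in the spirit of HeSBO-style embeddings) and shows the nesting property BAxUS relies on, namely that when the target dimensionality grows by splitting a bin, points already evaluated in the smaller subspace map to the same full-dimensional inputs, so their observations can be reused.

```python
import random

def make_embedding(d_input, d_target, rng):
    # Sparse sign embedding: each input dimension is assigned to one
    # low-dimensional bin with a random sign. (Illustrative
    # simplification of the embedding family used by BAxUS.)
    return [(rng.randrange(d_target), rng.choice((-1, 1)))
            for _ in range(d_input)]

def project_up(z, embedding, d_input):
    # Map a low-dimensional point z to a point x in the full input space.
    x = [0.0] * d_input
    for i, (b, sign) in enumerate(embedding):
        x[i] = sign * z[b]
    return x

def increase_dim(embedding, d_target, rng):
    # Grow the subspace by one dimension: pick a bin and move half of
    # its input dimensions into a fresh bin.
    new_bin = d_target
    old_bin = rng.randrange(d_target)
    members = [i for i, (b, _) in enumerate(embedding) if b == old_bin]
    new_emb = list(embedding)
    for i in members[len(members) // 2:]:
        new_emb[i] = (new_bin, new_emb[i][1])
    return new_emb, old_bin

def lift(z, old_bin):
    # Embed a point from the old subspace into the enlarged one: the
    # new coordinate copies its parent bin, so the point projects to
    # the same full-space input and its observation stays valid.
    return z + [z[old_bin]]

# Demo of the nesting invariant:
rng = random.Random(0)
emb = make_embedding(d_input=10, d_target=2, rng=rng)
z = [0.5, -1.0]                      # an evaluated point in the 2-D subspace
emb2, old_bin = increase_dim(emb, 2, rng)
assert project_up(lift(z, old_bin), emb2, 10) == project_up(z, emb, 10)
```

In the actual algorithm, the low-dimensional subspace is searched with a trust-region BO loop, and the subspace is enlarged according to an adaptive schedule; the invariant demonstrated by the final assertion is what allows all previous evaluations to carry over when the scope increases.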

Leonard Papenmeier, Luigi Nardi, Matthias Poloczek • 2023

Related benchmarks

| Task | Dataset | Metric | Value | Rank |
|---|---|---|---|---|
| Black-box Optimization | Ackley d=20 | Median Best Objective Value | 0.36 | 9 |
| Black-box Optimization | Griewank d=20 | Median Objective Value | 0.75 | 9 |
| Black-box Optimization | Sphere d=20 | Objective Value (Median) | 1.92 | 9 |
| Simulator Benchmark Optimization | Swimmer | Median Performance | -278.3 | 9 |
| Simulator Benchmark Optimization | Rover Trajectory | Median Performance | -1.83 | 9 |
| Simulator Benchmark Optimization | Lunar Lander | Median Performance | -301.9 | 9 |
| Simulator Benchmark Optimization | Robot Pushing | Median Performance | -3.86 | 9 |
| High-dimensional optimization | MOPTA08 124D | Objective Value | -240.8 | 6 |
| High-dimensional optimization | Rover 60D | Objective Value | 0.67 | 6 |
| High-dimensional optimization | SVM 388D | Objective Value | -0.1 | 6 |

Showing 10 of 18 rows.
