
BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization

About

Bayesian optimization provides sample-efficient global optimization for a broad range of applications, including automatic machine learning, engineering, physics, and experimental design. We introduce BoTorch, a modern programming framework for Bayesian optimization that combines Monte-Carlo (MC) acquisition functions, a novel sample average approximation optimization approach, auto-differentiation, and variance reduction techniques. BoTorch's modular design facilitates flexible specification and optimization of probabilistic models written in PyTorch, simplifying implementation of new acquisition functions. Our approach is backed by novel theoretical convergence results and made practical by a distinctive algorithmic foundation that leverages fast predictive distributions, hardware acceleration, and deterministic optimization. We also propose a novel "one-shot" formulation of the Knowledge Gradient, enabled by a combination of our theoretical and software contributions. In experiments, we demonstrate the improved sample efficiency of BoTorch relative to other popular libraries.
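To make the Monte-Carlo acquisition idea concrete, here is a minimal, dependency-free sketch (not BoTorch's actual implementation, which operates on full GP posteriors in PyTorch): a q-Expected-Improvement estimate computed by sample average approximation. The posterior over the q candidate points is assumed independent Gaussian for simplicity, and the RNG seed plays the role of fixed "base samples", making the estimate a deterministic function of the inputs — the property that enables deterministic optimization of the acquisition function.

```python
import random

def qei_mc(mu, sigma, best_f, n_samples=10_000, seed=0):
    """Monte-Carlo estimate of q-Expected-Improvement for a batch of q
    candidates with (assumed independent) posterior means `mu` and
    standard deviations `sigma`.

    A fixed seed fixes the "base samples", so repeated calls with the
    same inputs give the same value -- the sample average approximation
    described in the abstract (illustrative sketch, not BoTorch code).
    """
    rng = random.Random(seed)
    q = len(mu)
    total = 0.0
    for _ in range(n_samples):
        # Reparameterized posterior draw: y_j = mu_j + sigma_j * z_j.
        # qEI takes the best improvement over f* across the batch.
        total += max(
            max(mu[j] + sigma[j] * rng.gauss(0.0, 1.0) - best_f, 0.0)
            for j in range(q)
        )
    return total / n_samples
```

With a single near-noiseless point above the incumbent, the estimate reduces to the plain improvement `mu - best_f`; with `mu = best_f` and unit variance it approaches the analytic value `1/sqrt(2*pi) ≈ 0.399`. In BoTorch itself, the analogous computation is differentiable end-to-end, so the batch can be optimized by gradient methods.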

Maximilian Balandat, Brian Karrer, Daniel R. Jiang, Samuel Daulton, Benjamin Letham, Andrew Gordon Wilson, Eytan Bakshy • 2019

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Bayesian Optimization | 50 optimization problems (COCO, BoTorch, Bayesmark; aggregated) | Mean RP: 1.4 | 26 |
| Surrogate Modeling | Hartmann 6D | Normalized RMSE: 0.63 | 16 |
| Surrogate Modeling | Hartmann 3D | Normalized RMSE: 0.27 | 16 |
| Surrogate Modeling | Park1 4D | Normalized RMSE: 0.98 | 16 |
| Surrogate Modeling | Park2 4D | Normalized RMSE: 0.49 | 16 |
| Surrogate Modeling | Levy 7D | Normalized RMSE: 1.00 | 16 |
| Surrogate Modeling | Branin 2D | Normalized RMSE: 0.70 | 16 |
| Surrogate Modeling | Rosenbrock 10D | Normalized RMSE: 1.00 | 16 |
| Surrogate Modeling | Rastrigin 5D | Normalized RMSE: 1.00 | 16 |
| Surrogate Modeling | Borehole 8D | Normalized RMSE: 1.02 | 16 |

(10 of 11 rows shown)
