BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization
About
Bayesian optimization provides sample-efficient global optimization for a broad range of applications, including automatic machine learning, engineering, physics, and experimental design. We introduce BoTorch, a modern programming framework for Bayesian optimization that combines Monte-Carlo (MC) acquisition functions, a novel sample average approximation optimization approach, auto-differentiation, and variance reduction techniques. BoTorch's modular design facilitates flexible specification and optimization of probabilistic models written in PyTorch, simplifying implementation of new acquisition functions. Our approach is backed by novel theoretical convergence results and made practical by a distinctive algorithmic foundation that leverages fast predictive distributions, hardware acceleration, and deterministic optimization. We also propose a novel "one-shot" formulation of the Knowledge Gradient, enabled by a combination of our theoretical and software contributions. In experiments, we demonstrate the improved sample efficiency of BoTorch relative to other popular libraries.
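The Monte-Carlo acquisition and sample average approximation (SAA) idea described above can be sketched in a few lines of plain PyTorch. This is a hedged illustration of the concept, not BoTorch's actual implementation: fixing a set of base samples z and reparameterizing joint posterior samples as y = mu + Lz turns the MC estimate of q-Expected Improvement into a deterministic, differentiable function of the posterior parameters, so it can be optimized with standard gradient-based methods.

```python
import torch

def qei_saa(mean, chol, best_f, base_samples):
    """Sample-average-approximation estimate of q-Expected Improvement.

    mean: (q,) posterior mean at the q candidate points
    chol: (q, q) Cholesky factor of the posterior covariance
    best_f: best objective value observed so far
    base_samples: (n_mc, q) fixed draws z ~ N(0, I); holding them
        fixed makes the estimator deterministic (the SAA idea).
    """
    samples = mean + base_samples @ chol.T            # y = mu + L z, shape (n_mc, q)
    improvement = (samples - best_f).clamp_min(0.0)   # pointwise (y_j - f*)^+
    return improvement.max(dim=-1).values.mean()      # MC estimate of E[max_j (y_j - f*)^+]

# The estimate is differentiable w.r.t. the posterior parameters, so
# autodiff propagates gradients back to the candidate locations.
mean = torch.tensor([0.2, 0.5], requires_grad=True)
chol = 0.1 * torch.eye(2)
z = torch.randn(128, 2)
val = qei_saa(mean, chol, best_f=0.3, base_samples=z)
val.backward()  # mean.grad now holds d(qEI)/d(mean)
```

In BoTorch itself this pattern is wrapped in reusable acquisition-function classes and optimized via its own optimization utilities; the sketch only shows why fixed base samples make the MC objective amenable to deterministic optimization.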
Related benchmarks
| Task | Dataset | Result | Rank |
|---|---|---|---|
| Bayesian Optimization | 50 optimization problems: COCO, BoTorch, Bayesmark (aggregated) | Mean RP: 1.4 | 26 |
| Surrogate Modeling | Hartmann 6D | Normalized RMSE: 0.63 | 16 |
| Surrogate Modeling | Hartmann 3D | Normalized RMSE: 0.27 | 16 |
| Surrogate Modeling | Park1 4D | Normalized RMSE: 0.98 | 16 |
| Surrogate Modeling | Park2 4D | Normalized RMSE: 0.49 | 16 |
| Surrogate Modeling | Levy 7D | Normalized RMSE: 1 | 16 |
| Surrogate Modeling | Branin 2D | Normalized RMSE: 0.7 | 16 |
| Surrogate Modeling | Rosenbrock 10D | Normalized RMSE: 1 | 16 |
| Surrogate Modeling | Rastrigin 5D | Normalized RMSE: 1 | 16 |
| Surrogate Modeling | Borehole 8D | Normalized RMSE: 1.02 | 16 |