# The reparameterization trick for acquisition functions

## About
Bayesian optimization is a sample-efficient approach to solving global optimization problems. Along with a surrogate model, this approach relies on theoretically motivated value heuristics (acquisition functions) to guide the search process. Fully maximizing acquisition functions yields the best performance; unfortunately, this ideal is difficult to achieve, since acquisition functions are themselves frequently non-trivial to optimize. This difficulty is especially pronounced in the parallel setting, where acquisition functions are routinely non-convex, high-dimensional, and intractable. Here, we demonstrate how many popular acquisition functions can be formulated as Gaussian integrals amenable to the reparameterization trick and, consequently, to gradient-based optimization. Further, we use this reparameterized representation to derive an efficient Monte Carlo estimator for the upper confidence bound acquisition function in the context of parallel selection.
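The core idea can be sketched with a small example. Below is a minimal NumPy illustration (not the paper's implementation) of a reparameterized Monte Carlo estimator for parallel (q-point) Expected Improvement: posterior samples are written as `y = mu + L z` with `z ~ N(0, I)` and `L` the Cholesky factor of the posterior covariance, so the estimate is a deterministic, differentiable function of the posterior parameters once the base samples `z` are fixed. The function name `qei_reparam` and the toy posterior values are illustrative assumptions.

```python
import numpy as np

def qei_reparam(mu, Sigma, best_f, n_samples=10_000, seed=0):
    """Monte Carlo estimate of parallel Expected Improvement via the
    reparameterization trick.

    Draws z ~ N(0, I) and maps them through y = mu + L z, where
    Sigma = L @ L.T. With the base samples z held fixed, the estimator
    is a smooth function of (mu, Sigma), so an autodiff framework could
    differentiate it with respect to the query points.
    """
    rng = np.random.default_rng(seed)
    q = len(mu)
    L = np.linalg.cholesky(Sigma)            # Sigma = L @ L.T
    z = rng.standard_normal((n_samples, q))  # fixed base samples
    y = mu + z @ L.T                         # reparameterized posterior draws
    # Improvement of the best of the q points over the incumbent value
    improvement = np.maximum(y.max(axis=1) - best_f, 0.0)
    return improvement.mean()

# Toy posterior over q = 2 candidate points
mu = np.array([0.2, 0.0])
Sigma = np.array([[1.0, 0.3], [0.3, 0.5]])
est = qei_reparam(mu, Sigma, best_f=0.5)
```

In a real setting `mu` and `Sigma` would come from a Gaussian process posterior at the candidate batch, and the same fixed-base-sample construction applies to other acquisition functions expressible as Gaussian integrals, including the parallel upper confidence bound estimator derived in the paper.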
## Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Neural Architecture Search | NAS | Median Normalized Score | 0.544 | 16 |
| Offline Model-Based Optimization | D'Kitty Morphology (test) | Median Normalized Score | 0.883 | 16 |
| Offline Model-Based Optimization | Ant Morphology (test) | Median Normalized Score | 0.567 | 16 |
| Discrete Optimization | TF Bind 8 | Median Normalized Score | 43.9 | 16 |
| Discrete Optimization | TF Bind 10 | Median Normalized Score | 0.467 | 16 |
| Offline Model-Based Optimization | Hopper Controller (test) | Median Normalized Score | 0.343 | 16 |
| Offline Model-Based Optimization | Superconductor (test) | Median Normalized Score | 0.3 | 16 |
| Offline Model-Based Optimization | D'Kitty Morphology Design-Bench | 100th Percentile Score | 89.6 | 15 |
| Offline Model-Based Optimization | Ant Morphology Design-Bench | 100th Percentile Score | 0.819 | 15 |
| Offline Model-Based Optimization | Hopper Controller Design-Bench | 100th Percentile Score | 0.55 | 15 |