Nonmyopic Global Optimisation via Approximate Dynamic Programming
About
Global optimisation seeks to optimise expensive-to-evaluate black-box functions without gradient information. Bayesian optimisation, one of the best-known techniques, typically employs Gaussian processes as surrogate models, leveraging their probabilistic nature to balance exploration and exploitation. However, these processes become computationally prohibitive in high-dimensional spaces. Recent alternatives based on inverse distance weighting (IDW) and radial basis functions (RBFs) offer competitive, computationally lighter solutions. Despite their efficiency, both traditional global and Bayesian optimisation strategies suffer from the myopic nature of their acquisition functions, which focus on immediate improvement and neglect the future implications of the sequential decision-making process. Nonmyopic acquisition functions devised for the Bayesian setting have shown promise in improving long-term performance, yet their combination with deterministic surrogate models remains unexplored. In this work, we introduce novel nonmyopic acquisition strategies tailored to IDW and RBF surrogates, based on approximate dynamic programming paradigms, including rollout and multi-step scenario-based optimisation schemes, to enable lookahead acquisition. These methods optimise a sequence of query points over a horizon by predicting the evolution of the surrogate model, inherently managing the exploration-exploitation trade-off via optimisation techniques. The proposed approach extends nonmyopic acquisition principles, previously confined to Bayesian optimisation, to deterministic surrogate models. Empirical results on synthetic and hyperparameter-tuning benchmark problems, a constrained problem, and a data-driven predictive control application demonstrate that these nonmyopic methods outperform conventional myopic approaches, leading to faster and more robust convergence.
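The core idea above can be sketched in code: fit a deterministic RBF surrogate, form a myopic acquisition from the surrogate prediction plus an IDW exploration term, and make it nonmyopic via rollout by "fantasising" the surrogate's own prediction as the observation at each candidate, refitting, and adding the best value achievable at the next stage. This is a minimal one-step-lookahead sketch under assumed ingredients (a Gaussian RBF kernel, an arctan-style IDW distance term, a finite candidate grid, and the function names below), not the paper's exact formulation.

```python
import numpy as np

def fit_rbf(X, y, eps=1.0):
    """Fit Gaussian-RBF interpolation weights on the observed points."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    Phi = np.exp(-(eps * d) ** 2)
    # small ridge for numerical stability of the interpolation system
    return np.linalg.solve(Phi + 1e-8 * np.eye(len(X)), y)

def rbf_predict(X, w, Xq, eps=1.0):
    """Evaluate the RBF surrogate at query points Xq."""
    d = np.linalg.norm(Xq[:, None, :] - X[None, :, :], axis=-1)
    return np.exp(-(eps * d) ** 2) @ w

def idw_distance(X, Xq):
    """IDW-style exploration term: large far from sampled points."""
    d2 = np.sum((Xq[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return (2 / np.pi) * np.arctan(1 / np.sum(1 / (d2 + 1e-12), axis=1))

def myopic_acq(X, y, Xq, beta=1.0):
    """Myopic acquisition (to be minimised): prediction minus exploration bonus."""
    w = fit_rbf(X, y)
    return rbf_predict(X, w, Xq) - beta * idw_distance(X, Xq)

def rollout_acq(X, y, Xq, beta=1.0):
    """One-step rollout: for each candidate, fantasise the surrogate's own
    prediction as its observation, refit, and add the best myopic value
    attainable at the next stage."""
    a0 = myopic_acq(X, y, Xq, beta)
    w = fit_rbf(X, y)
    vals = np.empty(len(Xq))
    for i, x in enumerate(Xq):
        y_hat = rbf_predict(X, w, x[None, :])[0]          # fantasised observation
        X1, y1 = np.vstack([X, x]), np.append(y, y_hat)   # simulated dataset
        vals[i] = a0[i] + myopic_acq(X1, y1, Xq, beta).min()
    return vals

# Toy usage: pick the next query point for f(x) = ||x||^2 on [-1, 1]^2.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (5, 2))
y = np.sum(X ** 2, axis=1)
Xq = rng.uniform(-1, 1, (50, 2))
x_next = Xq[np.argmin(rollout_acq(X, y, Xq))]
```

Minimising `rollout_acq` instead of `myopic_acq` accounts for how a query reshapes the surrogate at the following stage, which is the lookahead effect the abstract describes; longer horizons repeat the fantasise-and-refit step recursively.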
Related benchmarks
| Task | Dataset | Metric | Value | Rank |
|---|---|---|---|---|
| Global Optimization | DropWave | Mean Objective Value | 0.63 | 23 |
| Global Optimization | Brochu function 2d | Mean Final Optimality Gap | 0.744 | 13 |
| Global Optimization | Bukin function | Mean Final Optimality Gap | 74.7 | 13 |
| Global Optimization | Dixon-Price function 4d | Mean Optimality Gap | 0.905 | 13 |
| Global Optimization | Ackley function | Mean Final Optimality Gap | 0.787 | 13 |
| Global Optimization | Adjiman function | Mean Final Optimality Gap | 0.885 | 13 |
| Global Optimization | Beale function | Mean Optimality Gap | 0.818 | 13 |
| Global Optimization | Bohachevsky function | Mean Final Optimality Gap | 0.966 | 13 |
| Global Optimization | Brochu (4d) function | Mean Final Optimality Gap | 0.628 | 13 |
| Global Optimization | Camel hump function 3d | Mean Optimality Gap | 86.2 | 13 |