Paper Title

Efficient Nonmyopic Bayesian Optimization via One-Shot Multi-Step Trees

Paper Authors

Shali Jiang, Daniel R. Jiang, Maximilian Balandat, Brian Karrer, Jacob R. Gardner, Roman Garnett

Paper Abstract

Bayesian optimization is a sequential decision making framework for optimizing expensive-to-evaluate black-box functions. Computing a full lookahead policy amounts to solving a highly intractable stochastic dynamic program. Myopic approaches, such as expected improvement, are often adopted in practice, but they ignore the long-term impact of the immediate decision. Existing nonmyopic approaches are mostly heuristic and/or computationally expensive. In this paper, we provide the first efficient implementation of general multi-step lookahead Bayesian optimization, formulated as a sequence of nested optimization problems within a multi-step scenario tree. Instead of solving these problems in a nested way, we equivalently optimize all decision variables in the full tree jointly, in a "one-shot" fashion. Combining this with an efficient method for implementing multi-step Gaussian process "fantasization," we demonstrate that multi-step expected improvement is computationally tractable and exhibits performance superior to existing methods on a wide range of benchmarks.
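
To make the one-shot idea concrete, the sketch below uses BoTorch's qMultiStepLookahead acquisition function, a publicly available implementation of one-shot multi-step lookahead trees. This is a minimal sketch, not the paper's exact experimental setup: the toy objective, tree sizes, and optimizer settings are illustrative, the stage value functions are left at library defaults rather than configured as multi-step expected improvement, and helper names (e.g. fit_gpytorch_mll) assume a recent BoTorch release.

```python
import torch
from botorch.acquisition.multi_step_lookahead import qMultiStepLookahead
from botorch.fit import fit_gpytorch_mll
from botorch.models import SingleTaskGP
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy maximization problem on [0, 1]^2 with a handful of observations.
torch.manual_seed(0)
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = -(train_X - 0.5).pow(2).sum(dim=-1, keepdim=True)  # stand-in objective

model = SingleTaskGP(train_X, train_Y)
fit_gpytorch_mll(ExactMarginalLogLikelihood(model.likelihood, model))

# Two-step lookahead scenario tree: after the point chosen now, each sampled
# ("fantasy") outcome branches into its own one-point follow-up decision.
acqf = qMultiStepLookahead(
    model=model,
    batch_sizes=[1, 1],      # q for each lookahead step
    num_fantasies=[4, 4],    # fantasy samples (branching factor) per step
)

# One-shot trick: pack every decision variable in the tree into one augmented
# q-batch and optimize them jointly, instead of solving nested subproblems.
bounds = torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double)
q_aug = acqf.get_augmented_q_batch_size(q=1)
full_tree, _ = optimize_acqf(
    acq_function=acqf,
    bounds=bounds,
    q=q_aug,
    num_restarts=10,
    raw_samples=256,
)

# Only the root of the optimized tree is actually evaluated next.
next_x = acqf.extract_candidates(full_tree)
```

The joint optimization over the augmented q-batch is what replaces the nested dynamic-programming recursion described in the abstract; the deeper decisions in the tree act as auxiliary variables and are discarded once the root candidate is extracted.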
