Paper Title

Bayesian task embedding for few-shot Bayesian optimization

Paper Authors

Steven Atkinson, Sayan Ghosh, Natarajan Chennimalai-Kumar, Genghis Khan, Liping Wang

Paper Abstract

We describe a method for Bayesian optimization by which one may incorporate data from multiple systems whose quantitative interrelationships are unknown a priori. All general (nonreal-valued) features of the systems are associated with continuous latent variables that enter as inputs into a single metamodel that simultaneously learns the response surfaces of all of the systems. Bayesian inference is used to determine appropriate beliefs regarding the latent variables. We explain how the resulting probabilistic metamodel may be used for Bayesian optimization tasks and demonstrate its implementation on a variety of synthetic and real-world examples, comparing its performance under zero-, one-, and few-shot settings against traditional Bayesian optimization, which usually requires substantially more data from the system of interest.
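
The abstract sketches the core idea: each task is mapped to a continuous latent embedding that is concatenated with the design variables and fed into a single probabilistic metamodel, beliefs about the embeddings are inferred from data, and the resulting model drives Bayesian optimization on the task of interest. The NumPy/SciPy sketch below is only an illustration of that idea under simplifying assumptions, not the paper's implementation: it uses a MAP point estimate of the task embeddings (with a standard-normal prior) rather than full Bayesian inference over them, an RBF kernel on the stacked (x, z) inputs, and expected improvement as the acquisition function. The toy task family, LATENT_DIM, and all helper names (rbf, neg_log_marginal, posterior, task_fn) are hypothetical.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
LATENT_DIM = 2  # assumed embedding dimension for this sketch

def rbf(A, B, ls, var):
    # Squared-exponential kernel on stacked (x, z) inputs.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ls ** 2)

def unpack(params, n_tasks):
    ls, var, noise = np.exp(params[:3])
    Z = params[3:].reshape(n_tasks, LATENT_DIM)
    return ls, var, noise, Z

def neg_log_marginal(params, X, y, task_ids, n_tasks):
    # GP negative log marginal likelihood plus a standard-normal prior on the
    # task embeddings (a MAP stand-in for the paper's Bayesian treatment).
    ls, var, noise, Z = unpack(params, n_tasks)
    A = np.hstack([X, Z[task_ids]])
    K = rbf(A, A, ls, var) + noise * np.eye(len(y))
    L, low = cho_factor(K, lower=True)
    alpha = cho_solve((L, low), y)
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * (Z ** 2).sum()

def posterior(params, X, y, task_ids, n_tasks, X_new, task_new):
    # GP predictive mean/std at new design points for one task's embedding.
    ls, var, noise, Z = unpack(params, n_tasks)
    A = np.hstack([X, Z[task_ids]])
    A_new = np.hstack([X_new, np.repeat(Z[[task_new]], len(X_new), axis=0)])
    K = rbf(A, A, ls, var) + noise * np.eye(len(y))
    Ks = rbf(A_new, A, ls, var)
    L, low = cho_factor(K, lower=True)
    mu = Ks @ cho_solve((L, low), y)
    v = cho_solve((L, low), Ks.T)
    s2 = var - np.einsum("ij,ji->i", Ks, v)
    return mu, np.sqrt(np.clip(s2, 1e-12, None))

# Toy data: three source tasks plus a target task with a single observation.
def task_fn(x, shift):  # hypothetical family of related objectives
    return np.sin(3 * x + shift)

shifts = [0.0, 0.5, 1.0, 0.7]          # last entry is the target task
X_list, y_list, id_list = [], [], []
for t, s in enumerate(shifts):
    n = 15 if t < 3 else 1             # few-shot: one point on the target
    x = rng.uniform(0, 2, (n, 1))
    X_list.append(x)
    y_list.append(task_fn(x[:, 0], s) + 0.05 * rng.standard_normal(n))
    id_list.append(np.full(n, t))
X = np.vstack(X_list)
y = np.concatenate(y_list)
task_ids = np.concatenate(id_list)
n_tasks = len(shifts)

# Jointly fit kernel hyperparameters and task embeddings by penalized
# marginal likelihood; the paper instead maintains posterior beliefs over z.
x0 = np.concatenate([np.log([0.5, 1.0, 1e-2]),
                     0.1 * rng.standard_normal(n_tasks * LATENT_DIM)])
res = minimize(neg_log_marginal, x0, args=(X, y, task_ids, n_tasks),
               method="L-BFGS-B")

# Expected-improvement acquisition on the target task (maximization).
X_cand = np.linspace(0, 2, 200)[:, None]
mu, sd = posterior(res.x, X, y, task_ids, n_tasks, X_cand, task_new=3)
best = y[task_ids == 3].max()
gamma = (mu - best) / sd
ei = sd * (gamma * norm.cdf(gamma) + norm.pdf(gamma))
print("next x to evaluate on the target task:", X_cand[np.argmax(ei), 0])
```

Because the single metamodel shares kernel hyperparameters across all tasks and the target task's embedding is fitted jointly with the source tasks', even one observation on the target can place its embedding near similar source tasks, which is what makes the zero-, one-, and few-shot comparisons in the abstract possible in this simplified setting.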
