Title

Incremental Sampling Without Replacement for Sequence Models

Authors

Kensen Shi, David Bieber, Charles Sutton

Abstract

Sampling is a fundamental technique, and sampling without replacement is often desirable when duplicate samples are not beneficial. Within machine learning, sampling is useful for generating diverse outputs from a trained model. We present an elegant procedure for sampling without replacement from a broad class of randomized programs, including generative neural models that construct outputs sequentially. Our procedure is efficient even for exponentially-large output spaces. Unlike prior work, our approach is incremental, i.e., samples can be drawn one at a time, allowing for increased flexibility. We also present a new estimator for computing expectations from samples drawn without replacement. We show that incremental sampling without replacement is applicable to many domains, e.g., program synthesis and combinatorial optimization.
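To make the procedure concrete, below is a minimal sketch of incremental sampling without replacement from a sequence model. It illustrates the general idea rather than the paper's implementation: a trie over sampled prefixes stores, at each node, the model probability mass of completions that have not yet been drawn, and each finished sample's probability is subtracted along its root-to-leaf path so it can never be produced again. The names (`TrieNode`, `draw_without_replacement`, `cond_prob`) are hypothetical, and the sketch assumes the model exposes exact per-step conditional distributions.

```python
import random

# Hypothetical sketch of trie-based incremental sampling without
# replacement; names and the model interface are illustrative, not
# the paper's actual code.

class TrieNode:
    """A sequence prefix. `full` is the model probability of the prefix;
    `mass` is the probability of its completions not yet drawn."""
    def __init__(self, full):
        self.full = full
        self.mass = full
        self.children = {}  # token -> TrieNode

def draw_without_replacement(root, cond_prob, eos):
    """Draw one sequence; repeated calls never return a duplicate.
    `cond_prob(prefix)` returns a dict token -> P(token | prefix)."""
    node, prefix, path = root, [], [root]
    while True:
        q = cond_prob(tuple(prefix))
        # Weight each token by its unsampled mass: the stored mass if
        # the child was visited before, else its full model probability.
        weights = {t: (node.children[t].mass if t in node.children
                       else node.full * p)
                   for t, p in q.items()}
        total = sum(weights.values())
        assert total > 1e-12, "every sequence has already been sampled"
        r = random.uniform(0.0, total)
        for token, w in weights.items():
            if r < w:
                break
            r -= w
        if token not in node.children:
            node.children[token] = TrieNode(node.full * q[token])
        node = node.children[token]
        prefix.append(token)
        path.append(node)
        if token == eos:
            drawn = node.mass      # probability of this whole sequence
            for n in path:         # erase it from every prefix on the path
                n.mass -= drawn
            return tuple(prefix)

# Usage: a toy model over two tokens followed by an end marker '$'.
def cond_prob(prefix):
    return {'a': 0.7, 'b': 0.3} if len(prefix) < 2 else {'$': 1.0}

root = TrieNode(1.0)
print([draw_without_replacement(root, cond_prob, '$') for _ in range(4)])
# aa$, ab$, ba$, bb$ each appear exactly once, in an order biased
# toward higher-probability sequences.
```

Because each call touches only one root-to-leaf path, samples can be drawn one at a time as the abstract describes, and the cost per draw grows with sequence length and vocabulary size rather than with the (possibly exponential) number of distinct outputs.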
