Paper Title

Applying Regularized Schrödinger-Bridge-Based Stochastic Process in Generative Modeling

Paper Authors

Song, Ki-Ung

Paper Abstract

Compared with the existing function-based models in deep generative modeling, the recently proposed diffusion models achieve outstanding performance through a stochastic-process-based approach. However, this approach requires a long sampling time because the process must be discretized into many timesteps. Schrödinger bridge (SB)-based models attempt to address this problem by training bidirectional stochastic processes between distributions, but their sampling is still slow compared with generative models such as generative adversarial networks, and training the bidirectional stochastic processes takes a relatively long time. This study therefore aims to reduce both the number of required timesteps and the training time, and proposes regularization terms for existing SB models that keep the bidirectional stochastic processes consistent and stable with a reduced number of timesteps. The individual regularization terms are integrated into a single term, making training more efficient in computation time and memory usage. Applying this regularized stochastic process to various generation tasks yields the desired translations between different distributions, confirming the feasibility of stochastic-process-based generative modeling with faster sampling speed. The code is available at https://github.com/KiUngSong/RSB.
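The abstract describes regularization terms that keep the forward and backward processes of an SB-based model consistent under a reduced number of timesteps, combined into a single term for efficiency. The sketch below is only an illustrative, hypothetical example of what such a consistency-style regularizer between two drift networks could look like; the class and function names, the Euler-Maruyama discretization, the sign conventions, and the shared-noise round trip are all assumptions and are not taken from the paper or the linked repository.

import torch
import torch.nn as nn


class Drift(nn.Module):
    # Small MLP drift network taking (x, t) and returning a drift vector.
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([x, t], dim=-1))


def consistency_regularizer(f_fwd: Drift, f_bwd: Drift,
                            x: torch.Tensor, t: torch.Tensor,
                            dt: float, sigma: float) -> torch.Tensor:
    # Illustrative combined regularization term: take one forward
    # Euler-Maruyama step, then one backward step, and penalize the
    # discrepancy with the starting point so the two half-bridges stay
    # consistent even with coarse timesteps. Reusing the same noise sample
    # in both directions is a simplification made here for illustration.
    noise = torch.randn_like(x)
    x_next = x + f_fwd(x, t) * dt + sigma * (dt ** 0.5) * noise
    x_back = x_next + f_bwd(x_next, t + dt) * dt - sigma * (dt ** 0.5) * noise
    return ((x_back - x) ** 2).mean()


if __name__ == "__main__":
    dim, batch = 2, 128
    f_fwd, f_bwd = Drift(dim), Drift(dim)
    x = torch.randn(batch, dim)
    t = torch.zeros(batch, 1)
    reg = consistency_regularizer(f_fwd, f_bwd, x, t, dt=0.05, sigma=0.5)
    print(reg.item())

In an actual training loop, a single term of this kind would be added to the bridge-matching objective for both drift networks, which is how one regularizer can serve both directions at once.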
