Paper Title

A Non-Asymptotic Analysis for Stein Variational Gradient Descent

Paper Authors

Anna Korba, Adil Salim, Michael Arbel, Giulia Luise, Arthur Gretton

Paper Abstract

We study the Stein Variational Gradient Descent (SVGD) algorithm, which optimises a set of particles to approximate a target probability distribution $\pi \propto e^{-V}$ on $\mathbb{R}^d$. In the population limit, SVGD performs gradient descent in the space of probability distributions on the KL divergence with respect to $\pi$, where the gradient is smoothed through a kernel integral operator. In this paper, we provide a novel finite time analysis for the SVGD algorithm. We provide a descent lemma establishing that the algorithm decreases the objective at each iteration, and rates of convergence for the average Stein Fisher divergence (also referred to as Kernel Stein Discrepancy). We also provide a convergence result of the finite particle system corresponding to the practical implementation of SVGD to its population version.
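
For intuition, the finite particle system that the abstract refers to is the SVGD update of Liu and Wang (2016), which moves each particle along a kernel-smoothed score plus a repulsion term:

$$x_i \leftarrow x_i + \epsilon \, \frac{1}{n} \sum_{j=1}^{n} \Big[ k(x_j, x_i) \, \nabla \log \pi(x_j) + \nabla_{x_j} k(x_j, x_i) \Big].$$

The sketch below is a minimal illustration of this update, not the authors' code; all function names are ours. It assumes an RBF kernel with a fixed bandwidth and a fixed step size (practical implementations typically pick the bandwidth via the median heuristic and use an adaptive step size such as AdaGrad), and it takes a standard Gaussian target so that the score is $\nabla \log \pi(x) = -x$.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    """RBF kernel K[i, j] = exp(-||x_i - x_j||^2 / h) and its gradients
    grad_K[i, j] = d K[i, j] / d x_i, for particles X of shape (n, d)."""
    diffs = X[:, None, :] - X[None, :, :]             # (n, n, d)
    K = np.exp(-np.sum(diffs ** 2, axis=-1) / h)      # (n, n)
    grad_K = -2.0 / h * diffs * K[..., None]          # (n, n, d)
    return K, grad_K

def svgd_step(X, score, step_size, h=1.0):
    """One SVGD iteration on the particle array X (shape (n, d))."""
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    # phi[i] = (1/n) sum_j [ k(x_j, x_i) score(x_j) + grad_{x_j} k(x_j, x_i) ]:
    # a kernel-smoothed score (attraction toward high target density) plus a
    # repulsion term that keeps the particles spread out.
    phi = (K @ score(X) + grad_K.sum(axis=0)) / n
    return X + step_size * phi

# Toy target pi proportional to e^{-V} with V(x) = ||x||^2 / 2 (standard
# Gaussian), whose score is grad log pi(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, size=(200, 2))   # particles initialised far from the target
for _ in range(1000):
    X = svgd_step(X, score=lambda x: -x, step_size=0.5)
print(X.mean(axis=0), X.std(axis=0))     # should approach mean 0, std 1
```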
