Paper Title

Stochastic Hamiltonian Gradient Methods for Smooth Games

Paper Authors

Nicolas Loizou, Hugo Berard, Alexia Jolicoeur-Martineau, Pascal Vincent, Simon Lacoste-Julien, Ioannis Mitliagkas

Paper Abstract

The success of adversarial formulations in machine learning has brought renewed motivation for smooth games. In this work, we focus on the class of stochastic Hamiltonian methods and provide the first convergence guarantees for certain classes of stochastic smooth games. We propose a novel unbiased estimator for the stochastic Hamiltonian gradient descent (SHGD) and highlight its benefits. Using tools from the optimization literature we show that SHGD converges linearly to the neighbourhood of a stationary point. To guarantee convergence to the exact solution, we analyze SHGD with a decreasing step-size and we also present the first stochastic variance reduced Hamiltonian method. Our results provide the first global non-asymptotic last-iterate convergence guarantees for the class of stochastic unconstrained bilinear games and for the more general class of stochastic games that satisfy a "sufficiently bilinear" condition, notably including some non-convex non-concave problems. We supplement our analysis with experiments on stochastic bilinear and sufficiently bilinear games, where our theory is shown to be tight, and on simple adversarial machine learning formulations.
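To make the method concrete, below is a minimal NumPy sketch of SHGD on a toy stochastic bilinear game. The problem data (matrices A_i, dimensions, step size, iteration count) are illustrative assumptions, not the paper's experimental setup; the sketch assumes the two-independent-sample form of the unbiased Hamiltonian gradient estimator described in the abstract, applied to the Hamiltonian H(w) = 0.5 * ||F(w)||^2 of the game's vector field F.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stochastic bilinear game (hypothetical data, not the paper's setup):
#   min_x max_y  (1/n) * sum_i  x^T A_i y
n, d = 10, 5
noise = rng.normal(size=(n, d, d))
noise -= noise.mean(axis=0)          # center so that E[A_i] is exact
A = np.eye(d) + 0.5 * noise          # mean matrix is I: well-conditioned game

def xi(w, i):
    """Stochastic game vector field F_i(x, y) = (A_i y, -A_i^T x)."""
    x, y = w[:d], w[d:]
    return np.concatenate([A[i] @ y, -A[i].T @ x])

def jac(i):
    """Jacobian of F_i; constant because the game is bilinear."""
    Z = np.zeros((d, d))
    return np.block([[Z, A[i]], [-A[i].T, Z]])

def shgd_grad(w):
    """Unbiased estimate of grad H(w) = J(w)^T F(w), with H = 0.5*||F||^2.

    Two INDEPENDENT samples i, j give E[J_i^T F_j] = J^T F; reusing a
    single sample would bias the product of the two stochastic factors.
    """
    i, j = rng.integers(n), rng.integers(n)
    return jac(i).T @ xi(w, j)

w = rng.normal(size=2 * d)
gamma = 0.05                          # constant step-size: linear rate
for _ in range(3000):                 # down to a noise-dominated neighbourhood
    w -= gamma * shgd_grad(w)

# Evaluate the Hamiltonian on the full (deterministic) vector field.
A_bar = A.mean(axis=0)
F = np.concatenate([A_bar @ w[d:], -A_bar.T @ w[:d]])
print("Hamiltonian at final iterate:", 0.5 * F @ F)
```

With the constant step-size used here, the iterates contract only to a neighbourhood of the solution (x, y) = (0, 0), as the abstract states; a decreasing step-size or a variance-reduced estimator would be needed to drive the Hamiltonian to zero.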
