Paper Title


Accelerated and nonaccelerated stochastic gradient descent with model conception

Paper Authors

Dvinskikh, Darina, Tyurin, Alexander, Gasnikov, Alexander, Omelchenko, Sergey

Paper Abstract


In this paper, we describe a new way to obtain convergence rates for optimal methods in smooth (strongly) convex optimization problems. Our approach is based on results for problems in which the gradient is corrupted by small non-random noise. Unlike previous results, we obtain the convergence rates via the model conception.
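The setting the abstract refers to can be illustrated with a minimal sketch (this is not the paper's algorithm, just the inexact-oracle idea it builds on): gradient descent on a smooth, strongly convex quadratic where each gradient query is corrupted by a small bounded non-random perturbation. The matrix, step size, and noise level `delta` below are illustrative choices.

```python
import numpy as np

def inexact_gradient(A, x, delta):
    """Exact gradient of f(x) = 0.5 * x^T A x plus a deterministic
    perturbation of norm at most delta (an assumed noise model)."""
    g = A @ x
    d = np.ones_like(x)                 # fixed, non-random direction
    return g + delta * d / np.linalg.norm(d)

def gd_inexact(A, x0, step, delta, iters):
    """Plain gradient descent using the inexact oracle."""
    x = x0.copy()
    for _ in range(iters):
        x = x - step * inexact_gradient(A, x, delta)
    return x

A = np.diag([1.0, 10.0])                # eigenvalues in [1, 10]: strongly convex
x0 = np.array([5.0, 5.0])
x = gd_inexact(A, x0, step=0.09, delta=1e-3, iters=500)
f = 0.5 * x @ A @ x
print(f)                                # small value, limited by the noise level
```

With step size below 2/L (here L = 10) the iterates contract geometrically toward a neighborhood of the minimizer whose radius is proportional to `delta`; this is the typical behavior of first-order methods with small deterministic gradient noise.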
