Paper Title

Optimal and Practical Algorithms for Smooth and Strongly Convex Decentralized Optimization

Authors

Kovalev, Dmitry, Salim, Adil, Richtárik, Peter

Abstract

We consider the task of decentralized minimization of the sum of smooth strongly convex functions stored across the nodes of a network. For this problem, lower bounds on the number of gradient computations and the number of communication rounds required to achieve $\varepsilon$ accuracy have recently been proven. We propose two new algorithms for this decentralized optimization problem and equip them with complexity guarantees. We show that our first method is optimal both in terms of the number of communication rounds and in terms of the number of gradient computations. Unlike existing optimal algorithms, our algorithm does not rely on the expensive evaluation of dual gradients. Our second algorithm is optimal in terms of the number of communication rounds, without a logarithmic factor. Our approach relies on viewing the two proposed algorithms as accelerated variants of the Forward Backward algorithm to solve monotone inclusions associated with the decentralized optimization problem. We also verify the efficacy of our methods against state-of-the-art algorithms through numerical experiments.
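The abstract describes the proposed methods as accelerated variants of the Forward-Backward algorithm applied to a monotone inclusion. As background, here is a minimal sketch of the classical (non-accelerated, non-decentralized) forward-backward splitting iteration on an illustrative one-dimensional toy problem; the objective, step size, and iteration count are assumptions for illustration only, not the paper's decentralized setting or algorithm.

```python
# Forward-backward splitting on a toy composite problem (illustrative only):
#   minimize f(x) + g(x),  f(x) = 0.5*(x - 3)**2  (smooth, strongly convex),
#                          g(x) = |x|             (nonsmooth, prox-friendly).
# The iteration alternates a forward (gradient) step on f with a
# backward (proximal) step on g.

def grad_f(x):
    # Gradient of the smooth part f(x) = 0.5*(x - 3)^2.
    return x - 3.0

def prox_g(x, step):
    # Proximal operator of step*|.|, i.e. soft-thresholding.
    if x > step:
        return x - step
    if x < -step:
        return x + step
    return 0.0

def forward_backward(x0=0.0, step=0.5, iters=100):
    x = x0
    for _ in range(iters):
        # Forward (gradient) step, then backward (proximal) step.
        x = prox_g(x - step * grad_f(x), step)
    return x

print(forward_backward())  # converges to 2.0, the minimizer of 0.5*(x-3)^2 + |x|
```

The fixed points of this iteration are exactly the zeros of the monotone inclusion 0 ∈ ∇f(x) + ∂g(x); here that gives x = 2, since the subgradient of |x| at a positive point is 1 and 2 − 3 + 1 = 0.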
