Paper Title


Quantized Distributed Nonconvex Optimization Algorithms with Linear Convergence under the Polyak--Łojasiewicz Condition

Paper Authors

Lei Xu, Xinlei Yi, Jiayue Sun, Yang Shi, Karl H. Johansson, Tao Yang

Paper Abstract


This paper considers distributed optimization for minimizing the average of local nonconvex cost functions, using local information exchange over undirected communication networks. To reduce the required communication capacity, we introduce an encoder--decoder scheme. By integrating this scheme with distributed gradient tracking and proportional-integral algorithms, respectively, we then propose two quantized distributed nonconvex optimization algorithms. Assuming the global cost function satisfies the Polyak--Łojasiewicz (PL) condition, which does not require the global cost function to be convex and under which the global minimizer is not necessarily unique, we show that our proposed algorithms converge linearly to a global optimal point and that a larger number of quantization levels leads to faster convergence. Moreover, we show that a low data rate is sufficient to guarantee linear convergence when the algorithm parameters are properly chosen. The theoretical results are illustrated by numerical examples.
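For readers unfamiliar with the two ingredients of the abstract: the PL condition requires ½‖∇f(x)‖² ≥ μ·(f(x) − f*) for some μ > 0, which permits nonconvex f and non-unique minimizers, and a finite-level quantizer lets agents exchange a small number of bits per iteration instead of real values. The sketch below is only an illustration of these two ideas, not the paper's algorithms: the function f(x) = x² + 3 sin²(x) (a standard nonconvex PL example), the static uniform quantizer, and the single-agent gradient-descent loop are all illustrative assumptions; the paper's encoder--decoder scheme and its distributed gradient-tracking / proportional-integral updates are more involved.

```python
import math

def quantize(x, levels=16, scale=4.0):
    """Encoder: map a real value in [-scale, scale] to one of
    `levels` integer codewords (what would be sent over the network)."""
    step = 2 * scale / levels
    clipped = max(-scale, min(scale, x))
    return round(clipped / step)

def dequantize(code, levels=16, scale=4.0):
    """Decoder: reconstruct an approximation of the real value."""
    step = 2 * scale / levels
    return code * step

# Nonconvex function satisfying the PL condition, with minimum f(0) = 0.
f = lambda x: x**2 + 3 * math.sin(x)**2
grad = lambda x: 2 * x + 3 * math.sin(2 * x)   # d/dx of f

x = 3.0
for _ in range(200):
    g = dequantize(quantize(grad(x)))  # only quantized gradients are used
    x -= 0.1 * g

print(f(x))  # close to the optimal value 0
```

With this static quantizer the iterate only reaches a neighborhood of the optimum whose size shrinks as the number of levels grows, which is one way to see why a larger number of quantization levels helps; exact linear convergence, as in the paper, requires a more careful encoder--decoder design.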
