Paper Title

A Distributed Cubic-Regularized Newton Method for Smooth Convex Optimization over Networks

Paper Authors

Uribe, César A., Jadbabaie, Ali

Paper Abstract

We propose a distributed, cubic-regularized Newton method for large-scale convex optimization over networks. The proposed method requires only local computations and communications and is suitable for federated learning applications over arbitrary network topologies. We show an $O(k^{-3})$ convergence rate when the cost function is convex with Lipschitz gradient and Hessian, with $k$ being the number of iterations. We further provide network-dependent bounds for the communication required in each step of the algorithm. We provide numerical experiments that validate our theoretical results.
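For orientation, a minimal sketch of the classical centralized cubic-regularized Newton subproblem (in the Nesterov-Polyak style) on which such methods build; the distributed computations and the acceleration that yield the stated $O(k^{-3})$ rate are detailed in the paper itself, and the symbol $M$ (an upper bound on the Hessian Lipschitz constant) is an assumption introduced here only for illustration:

$$
x_{k+1} \in \operatorname*{arg\,min}_{y \in \mathbb{R}^n} \left\{ \langle \nabla f(x_k),\, y - x_k \rangle + \tfrac{1}{2} \langle \nabla^2 f(x_k)(y - x_k),\, y - x_k \rangle + \tfrac{M}{6} \| y - x_k \|^3 \right\}
$$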
