Paper Title


Rate-improved Inexact Augmented Lagrangian Method for Constrained Nonconvex Optimization

Paper Authors

Zichong Li, Pin-Yu Chen, Sijia Liu, Songtao Lu, Yangyang Xu

Paper Abstract


First-order methods have been studied for nonlinear constrained optimization within the framework of the augmented Lagrangian method (ALM) or the penalty method. We propose an improved inexact ALM (iALM) and conduct a unified analysis for nonconvex problems with either affine equality or nonconvex constraints. Under certain regularity conditions (also assumed by existing works), we show an $\tilde{O}(\varepsilon^{-\frac{5}{2}})$ complexity result for problems with a nonconvex objective and affine equality constraints, and an $\tilde{O}(\varepsilon^{-3})$ complexity result for problems with a nonconvex objective and nonconvex constraints, where complexity is measured by the number of first-order oracle calls needed to yield an $\varepsilon$-KKT solution. Both results are the best known. Penalty methods have achieved complexity results of the same order; however, the two results require different analysis techniques, and, more importantly, penalty methods generally perform significantly worse than iALM in practice. Our improved iALM and its analysis close this gap between theory and practice. Numerical experiments on nonconvex problems with affine equality or nonconvex constraints demonstrate the effectiveness of the proposed method.
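To make the inexact-ALM template from the abstract concrete, here is a minimal toy sketch of its outer/inner structure: an inner loop that approximately minimizes the augmented Lagrangian with a first-order method, followed by the classical multiplier update and a capped geometric penalty increase. This is an illustrative sketch, not the paper's algorithm; the problem instance, step-size rule, and penalty schedule are all assumptions chosen for this example.

```python
import numpy as np

# Toy instance (illustrative assumption, not from the paper):
#   minimize f(x) = x1^4 - x1^2 + x2^2   (nonconvex objective)
#   subject to the affine equality constraint x1 + x2 = 1.
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

def f_grad(x):
    return np.array([4.0 * x[0] ** 3 - 2.0 * x[0], 2.0 * x[1]])

def ialm(x, beta=1.0, sigma=2.0, beta_max=100.0, outer=30, inner=2000, tol=1e-8):
    y = np.zeros(1)  # Lagrange multiplier estimate
    for _ in range(outer):
        # Inner loop: approximately minimize the augmented Lagrangian
        #   L(x) = f(x) + y^T (Ax - b) + (beta/2) * ||Ax - b||^2
        # by gradient descent; 1/(10 + 2*beta) is a conservative step size
        # for this instance (local smoothness of f plus beta * ||A||^2).
        lr = 1.0 / (10.0 + 2.0 * beta)
        for _ in range(inner):
            g = f_grad(x) + A.T @ (y + beta * (A @ x - b))
            if np.linalg.norm(g) < tol:
                break
            x = x - lr * g
        y = y + beta * (A @ x - b)          # classical multiplier update
        beta = min(sigma * beta, beta_max)  # geometric penalty increase, capped
    return x, y

x, y = ialm(np.array([0.5, 0.5]))
# Eliminating x2 = 1 - x1 shows the KKT point has x1 = (1/2)^(1/3) ≈ 0.7937.
print(x, (A @ x - b).item())
```

An ε-KKT solution in the abstract's sense is one where both the stationarity residual (the gradient of the Lagrangian) and the feasibility residual `Ax - b` are at most ε; the inner tolerance and the multiplier updates above drive both to zero on this toy instance.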
