Title

An active-set method for sparse approximations. Part I: Separable $\ell_1$ terms

Authors

Pougkakiotis, Spyridon, Gondzio, Jacek, Kalogerias, Dionysios S.

Abstract

In this paper we present an active-set method for the solution of $\ell_1$-regularized convex quadratic optimization problems. It is derived by combining a proximal method of multipliers (PMM) strategy with a standard semismooth Newton method (SSN). The resulting linear systems are solved using a Krylov-subspace method, accelerated by certain general-purpose preconditioners which are shown to be optimal with respect to the proximal parameters. Practical efficiency is further improved by warm-starting the algorithm using a proximal alternating direction method of multipliers. We show that the outer PMM achieves global convergence under mere feasibility assumptions. Under additional standard assumptions, the PMM scheme achieves global linear and local superlinear convergence. The SSN scheme is locally superlinearly convergent, assuming that its associated linear systems are solved accurately enough, and globally convergent under certain additional regularity assumptions. We provide numerical evidence to demonstrate the effectiveness of the approach by comparing it against OSQP and IP-PMM (an ADMM and a regularized IPM solver, respectively) on several elastic-net linear regression and $L^1$-regularized PDE-constrained optimization problems.
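For context on the problem class the paper targets, the following is a minimal illustrative sketch (not the paper's PMM–SSN algorithm) of solving an $\ell_1$-regularized convex quadratic problem $\min_x \tfrac{1}{2}x^\top Q x + c^\top x + \tau\|x\|_1$ with a plain proximal-gradient (ISTA) iteration; the data `Q`, `c`, `tau`, and the step size are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_l1_qp(Q, c, tau, n_iter=500):
    # Minimize 0.5 x'Qx + c'x + tau*||x||_1 via proximal gradient (ISTA),
    # with step size 1/L, where L is the largest eigenvalue of the
    # symmetric positive semidefinite matrix Q.
    L = np.linalg.eigvalsh(Q).max()
    x = np.zeros(Q.shape[0])
    for _ in range(n_iter):
        grad = Q @ x + c            # gradient of the smooth quadratic part
        x = soft_threshold(x - grad / L, tau / L)
    return x

# Small example: a strongly convex quadratic plus an l1 penalty that drives
# one coordinate exactly to zero -- the sparsity an active-set method exploits.
Q = np.array([[2.0, 0.0], [0.0, 2.0]])
c = np.array([-2.0, -0.1])
x = ista_l1_qp(Q, c, tau=0.5)      # solution is (0.75, 0.0)
```

A first-order scheme like this converges only linearly at best; the paper's contribution is a second-order active-set approach (PMM outer loop, semismooth Newton inner loop) that attains the superlinear rates stated above.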
