Paper Title

On Newton Screening

Authors

Jian Huang, Yuling Jiao, Lican Kang, Jin Liu, Yanyan Liu, Xiliang Lu, Yuanyuan Yang

Abstract

Screening and working set techniques are important approaches to reducing the size of an optimization problem. They have been widely used in accelerating first-order methods for solving large-scale sparse learning problems. In this paper, we develop a new screening method called Newton screening (NS) which is a generalized Newton method with a built-in screening mechanism. We derive an equivalent KKT system for the Lasso and utilize a generalized Newton method to solve the KKT equations. Based on this KKT system, a built-in working set with a relatively small size is first determined using the sum of primal and dual variables generated from the previous iteration, then the primal variable is updated by solving a least-squares problem on the working set and the dual variable updated based on a closed-form expression. Moreover, we consider a sequential version of Newton screening (SNS) with a warm-start strategy. We show that NS possesses an optimal convergence property in the sense that it achieves one-step local convergence. Under certain regularity conditions on the feature matrix, we show that SNS hits a solution with the same signs as the underlying true target and achieves a sharp estimation error bound with high probability. Simulation studies and real data analysis support our theoretical results and demonstrate that SNS is faster and more accurate than several state-of-the-art methods in our comparative studies.
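The abstract describes one NS iteration for the Lasso in three steps: form a working set from the sum of the previous primal and dual variables, update the primal variable by a least-squares solve on the working set, and update the dual variable in closed form. The following is a minimal sketch of that iteration, assuming the Lasso objective (1/(2n))‖y − Xβ‖² + λ‖β‖₁; the scaling, stopping rule, and the small ridge term in the solve are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def newton_screening_lasso(X, y, lam, max_iter=50, tol=1e-10):
    """Sketch of a Newton-screening-style primal-dual iteration for the
    Lasso (1/(2n))||y - X b||^2 + lam * ||b||_1 (illustrative only)."""
    n, p = X.shape
    beta = np.zeros(p)
    d = X.T @ y / n  # dual variable: negative gradient at beta = 0
    for _ in range(max_iter):
        z = beta + d
        # built-in working set from the sum of primal and dual variables
        A = np.flatnonzero(np.abs(z) > lam)
        Ac = np.setdiff1d(np.arange(p), A)
        beta_new = np.zeros(p)
        d_new = np.empty(p)
        if A.size:
            s = np.sign(z[A])
            XA = X[:, A]
            # primal update: least-squares problem on the working set
            # (tiny ridge added only for numerical safety in this sketch)
            beta_new[A] = np.linalg.solve(
                XA.T @ XA / n + 1e-12 * np.eye(A.size),
                XA.T @ y / n - lam * s)
            d_new[A] = lam * s
        # dual update off the working set: closed-form expression
        r = y - X @ beta_new
        d_new[Ac] = X[:, Ac].T @ r / n
        if np.linalg.norm(beta_new - beta) < tol:
            beta = beta_new
            break
        beta, d = beta_new, d_new
    return beta
```

At a fixed point, coordinates on the working set satisfy the KKT equality |Xᵢᵀ(y − Xβ)/n| = λ while the rest satisfy the inequality |Xᵢᵀ(y − Xβ)/n| ≤ λ, which is the KKT system the paper solves by a generalized Newton method.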
