Paper Title

Maximin Optimization for Binary Regression

Authors

Nisan Chiprut, Amir Globerson, Ami Wiesel

Abstract

We consider regression problems with binary weights. Such optimization problems are ubiquitous in quantized learning models and digital communication systems. A natural approach is to optimize the corresponding Lagrangian using variants of the gradient ascent-descent method. Such maximin techniques are still poorly understood even in the concave-convex case. The non-convex binary constraints may lead to spurious local minima. Interestingly, we prove that this approach is optimal in linear regression under low-noise conditions, as well as in robust regression with a small number of outliers. Practically, the method also performs well in regression with cross-entropy loss, as well as in non-convex multi-layer neural networks. Taken together, our approach highlights the potential of saddle-point optimization for learning constrained models.
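The maximin idea in the abstract can be sketched for the linear case. The following is a minimal illustration, not the authors' exact algorithm: we write binary-constrained least squares as min_w ||Xw - y||^2 subject to w_i^2 = 1, form the Lagrangian L(w, λ) = ||Xw - y||^2 + Σ_i λ_i (w_i^2 - 1), and run simultaneous gradient descent on w and gradient ascent on λ. The dimensions, noise level, step sizes, and iteration count are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary linear regression: recover w* in {-1, +1}^d from y = X w* + noise.
d, n = 8, 200
X = rng.standard_normal((n, d))
w_true = rng.choice([-1.0, 1.0], size=d)
y = X @ w_true + 0.01 * rng.standard_normal(n)

# Lagrangian of  min_w ||Xw - y||^2  s.t.  w_i^2 = 1:
#   L(w, lam) = ||Xw - y||^2 + sum_i lam_i * (w_i^2 - 1)
# Maximin: simultaneous gradient descent on w and ascent on lam.
w = np.zeros(d)
lam = np.zeros(d)
eta_w, eta_lam = 1e-3, 1e-1  # step sizes (hypothetical choices)
for _ in range(5000):
    grad_w = 2 * X.T @ (X @ w - y) + 2 * lam * w  # descent direction in w
    w -= eta_w * grad_w
    lam += eta_lam * (w ** 2 - 1)  # ascent on the multipliers enforces w_i^2 = 1

w_hat = np.sign(w)  # round the relaxed iterate to the binary alphabet
print("recovered:", np.array_equal(w_hat, w_true))
```

With low noise and enough samples, the unconstrained least-squares solution already sits near the binary vertex, so the multipliers only need small corrections to pin each coordinate to ±1; this is the regime in which the abstract claims optimality.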
