Paper Title

Neural networks with late-phase weights

Paper Authors

Johannes von Oswald, Seijin Kobayashi, Alexander Meulemans, Christian Henning, Benjamin F. Grewe, João Sacramento

Paper Abstract

The largely successful method of training neural networks is to learn their weights using some variant of stochastic gradient descent (SGD). Here, we show that the solutions found by SGD can be further improved by ensembling a subset of the weights in late stages of learning. At the end of learning, we obtain back a single model by taking a spatial average in weight space. To avoid incurring increased computational costs, we investigate a family of low-dimensional late-phase weight models which interact multiplicatively with the remaining parameters. Our results show that augmenting standard models with late-phase weights improves generalization in established benchmarks such as CIFAR-10/100, ImageNet and enwik8. These findings are complemented with a theoretical analysis of a noisy quadratic problem which provides a simplified picture of the late phases of neural network learning.
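The abstract describes low-dimensional late-phase weights that interact multiplicatively with the remaining parameters and are collapsed into a single model by averaging in weight space at the end of learning. Below is a minimal sketch of that idea, assuming a BatchEnsemble-style rank-1 multiplicative parameterization; the class `LatePhaseLinear`, the `n_members` argument, and the `average_members` method are hypothetical names for illustration, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class LatePhaseLinear(nn.Module):
    """Illustrative linear layer with K low-dimensional late-phase components.

    Each ensemble member k owns a rank-1 pair (r_k, s_k) that multiplies the
    shared base weight matrix W elementwise, giving an effective weight
    W * (r_k s_k^T) per member. Only (r_k, s_k) differ across members, so the
    overhead is O(K * (in + out)) rather than K full copies of W.
    """

    def __init__(self, in_features, out_features, n_members=4):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)  # shared base weights
        self.r = nn.Parameter(torch.ones(n_members, out_features))
        self.s = nn.Parameter(torch.ones(n_members, in_features))

    def forward(self, x, member=0):
        # Scaling inputs by s_k and outputs by r_k is equivalent to using the
        # weight matrix W * (r_k s_k^T): a multiplicative interaction.
        return self.base(x * self.s[member]) * self.r[member]

    @torch.no_grad()
    def average_members(self):
        # At the end of learning, collapse the ensemble into a single model
        # by averaging the late-phase weights in weight space.
        self.r.data = self.r.data.mean(dim=0, keepdim=True)
        self.s.data = self.s.data.mean(dim=0, keepdim=True)

layer = LatePhaseLinear(32, 16, n_members=4)
x = torch.randn(8, 32)
outs = [layer(x, k) for k in range(4)]  # late phase: train each member with SGD
layer.average_members()                 # then average in weight space
y = layer(x, 0)                         # predict with the single averaged model
```

Because the members share the base matrix and differ only in small rank-1 factors, the final weight-space average is a cheap operation on short vectors, which is consistent with the abstract's claim of avoiding increased computational costs.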
