Paper Title

Continual Deep Learning by Functional Regularisation of Memorable Past

Authors

Pingbo Pan, Siddharth Swaroop, Alexander Immer, Runa Eschenhagen, Richard E. Turner, Mohammad Emtiyaz Khan

Abstract

Continually learning new skills is important for intelligent systems, yet standard deep learning methods suffer from catastrophic forgetting of the past. Recent works address this with weight regularisation. Functional regularisation, although computationally expensive, is expected to perform better, but rarely does so in practice. In this paper, we fix this issue by using a new functional-regularisation approach that utilises a few memorable past examples crucial to avoid forgetting. By using a Gaussian Process formulation of deep networks, our approach enables training in weight-space while identifying both the memorable past and a functional prior. Our method achieves state-of-the-art performance on standard benchmarks and opens a new direction for life-long learning where regularisation and memory-based methods are naturally combined.
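To make the core idea concrete, here is a minimal sketch of functional regularisation with a small memory of past examples. This is a hypothetical toy setup (a linear model standing in for a deep network, plain gradient descent, an arbitrary penalty weight `lam`), not the paper's actual GP-based FROMP algorithm: after the first task we store a few "memorable" inputs together with the model's outputs on them, and while training on the second task we penalise deviation of the current outputs from those stored outputs.

```python
import numpy as np

def predict(w, X):
    """Linear model outputs (stands in for a deep network)."""
    return X @ w

def train(X, y, w_init, mem=None, lam=2.0, lr=0.05, steps=500):
    """Gradient descent on squared error, plus an optional functional
    regulariser anchoring outputs on memorable past inputs."""
    w = w_init.copy()
    for _ in range(steps):
        grad = X.T @ (predict(w, X) - y) / len(X)
        if mem is not None:
            X_m, f_m = mem  # memorable inputs and stored past outputs
            grad += lam * X_m.T @ (predict(w, X_m) - f_m) / len(X_m)
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
# Two tasks with different target functions
X1, X2 = rng.normal(size=(50, 3)), rng.normal(size=(50, 3))
y1 = X1 @ np.array([1.0, 0.0, 0.0])
y2 = X2 @ np.array([0.0, 1.0, 0.0])

w = train(X1, y1, np.zeros(3))
memory = (X1[:5], predict(w, X1[:5]))  # a few memorable past examples

w_plain = train(X2, y2, w)              # naive fine-tuning: forgets task 1
w_func = train(X2, y2, w, mem=memory)   # functional regularisation

def task1_error(w_):
    return np.mean((predict(w_, X1) - y1) ** 2)
```

On this toy problem, the functionally regularised model retains much lower error on the first task than naive fine-tuning, illustrating how anchoring the function's values on a few remembered inputs mitigates catastrophic forgetting. The paper's contribution is choosing those memorable examples and the functional prior in a principled way via a Gaussian-process view of the network, which this sketch does not attempt.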
