Paper Title

Mnemonics Training: Multi-Class Incremental Learning without Forgetting

Authors

Yaoyao Liu, Yuting Su, An-An Liu, Bernt Schiele, Qianru Sun

Abstract

Multi-Class Incremental Learning (MCIL) aims to learn new concepts by incrementally updating a model trained on previous concepts. However, there is an inherent trade-off to effectively learning new concepts without catastrophic forgetting of previous ones. To alleviate this issue, it has been proposed to keep around a few examples of the previous concepts but the effectiveness of this approach heavily depends on the representativeness of these examples. This paper proposes a novel and automatic framework we call mnemonics, where we parameterize exemplars and make them optimizable in an end-to-end manner. We train the framework through bilevel optimizations, i.e., model-level and exemplar-level. We conduct extensive experiments on three MCIL benchmarks, CIFAR-100, ImageNet-Subset and ImageNet, and show that using mnemonics exemplars can surpass the state-of-the-art by a large margin. Interestingly and quite intriguingly, the mnemonics exemplars tend to be on the boundaries between different classes.
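To make the bilevel structure mentioned in the abstract more concrete, below is a minimal sketch of the exemplar-level optimization idea: exemplars are treated as free parameters, a classifier takes a differentiable gradient step on them (model-level), and the exemplars are then updated so that the adapted classifier fits the full data of the current phase (exemplar-level). Everything here is an illustrative assumption, not the paper's actual training schedule: a toy linear classifier, random stand-in data, a single inner step, and hypothetical variable names.

```python
# Minimal sketch of the two optimization levels (illustrative assumptions:
# toy linear classifier, random stand-in data, one inner gradient step).
import torch
import torch.nn.functional as F

torch.manual_seed(0)
num_classes, dim, n_data, n_exemplars = 5, 16, 200, 10

# Stand-in training data for the current incremental phase.
X = torch.randn(n_data, dim)
y = torch.randint(0, num_classes, (n_data,))

# Mnemonics-style exemplars: initialized from real samples, then made optimizable.
exemplars = X[:n_exemplars].clone().requires_grad_(True)
ex_labels = y[:n_exemplars].clone()

# Linear classifier weights, kept as a plain tensor so gradients of the inner
# update can flow back to the exemplars.
W = torch.zeros(dim, num_classes, requires_grad=True)

exemplar_opt = torch.optim.Adam([exemplars], lr=0.01)
inner_lr = 0.1

for step in range(100):
    # Model-level (inner): one differentiable gradient step on the exemplars.
    inner_loss = F.cross_entropy(exemplars @ W, ex_labels)
    grad_W = torch.autograd.grad(inner_loss, W, create_graph=True)[0]
    W_adapted = W - inner_lr * grad_W

    # Exemplar-level (outer): the adapted classifier should fit the full data;
    # this loss backpropagates through W_adapted into the exemplars themselves.
    outer_loss = F.cross_entropy(X @ W_adapted, y)
    exemplar_opt.zero_grad()
    outer_loss.backward()
    exemplar_opt.step()

    # Keep the base classifier in sync with its adapted weights (no grad tracking).
    with torch.no_grad():
        W.copy_(W_adapted)
        W.grad = None  # W's accumulated gradients are unused; clear them.
```

In the paper the exemplar-level objective is tied to how well the adapted model retains old and new concepts across phases; here the current-phase data simply stands in for that validation signal.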
