Paper Title

Continual Few-Shot Learning with Adversarial Class Storage

Paper Authors

Kun Wu, Chengxiang Yin, Jian Tang, Zhiyuan Xu, Yanzhi Wang, Dejun Yang

Abstract

Humans have a remarkable ability to quickly and effectively learn new concepts in a continuous manner without forgetting old knowledge. Though deep learning has made tremendous successes on various computer vision tasks, it faces challenges for achieving such human-level intelligence. In this paper, we define a new problem called continual few-shot learning, in which tasks arrive sequentially and each task is associated with a few training samples. We propose Continual Meta-Learner (CML) to solve this problem. CML integrates metric-based classification and a memory-based mechanism along with adversarial learning into a meta-learning framework, which leads to the desirable properties: 1) it can quickly and effectively learn to handle a new task; 2) it overcomes catastrophic forgetting; 3) it is model-agnostic. We conduct extensive experiments on two image datasets, MiniImageNet and CIFAR100. Experimental results show that CML delivers state-of-the-art performance in terms of classification accuracy on few-shot learning tasks without catastrophic forgetting.
