Paper Title

Class-incremental learning: survey and performance evaluation on image classification

Authors

Marc Masana, Xialei Liu, Bartlomiej Twardowski, Mikel Menta, Andrew D. Bagdanov, Joost van de Weijer

Abstract

For future learning systems, incremental learning is desirable because it allows for: efficient resource usage by eliminating the need to retrain from scratch at the arrival of new data; reduced memory usage by preventing or limiting the amount of data required to be stored -- also important when privacy limitations are imposed; and learning that more closely resembles human learning. The main challenge for incremental learning is catastrophic forgetting, which refers to the precipitous drop in performance on previously learned tasks after learning a new one. Incremental learning of deep neural networks has seen explosive growth in recent years. Initial work focused on task-incremental learning, where a task-ID is provided at inference time. Recently, we have seen a shift towards class-incremental learning where the learner must discriminate at inference time between all classes seen in previous tasks without recourse to a task-ID. In this paper, we provide a complete survey of existing class-incremental learning methods for image classification, and in particular, we perform an extensive experimental evaluation on thirteen class-incremental methods. We consider several new experimental scenarios, including a comparison of class-incremental methods on multiple large-scale image classification datasets, an investigation into small and large domain shifts, and a comparison of various network architectures.
