Paper Title

Multi-Task Incremental Learning for Object Detection

Paper Authors

Xialei Liu, Hao Yang, Avinash Ravichandran, Rahul Bhotika, Stefano Soatto

Paper Abstract

Multi-task learning learns multiple tasks while sharing knowledge and computation among them. However, it suffers from catastrophic forgetting of previous knowledge when learning incrementally without access to the old data. Most existing object detectors are domain-specific and static, while some are learned incrementally, but only within a single domain. Training an object detector incrementally across various domains has rarely been explored. In this work, we propose three incremental learning scenarios across various domains and categories for object detection. To mitigate catastrophic forgetting, we propose attentive feature distillation, which leverages both bottom-up and top-down attention to extract important information for distillation. We then systematically analyze the proposed distillation method in different scenarios. We find that, contrary to common understanding, domain gaps have a smaller negative impact on incremental detection, whereas category differences are problematic. For the difficult cases, where the domain gaps and especially the category differences are large, we explore three different exemplar sampling methods and show that the proposed adaptive sampling method effectively selects diverse and informative samples from the entire datasets to further prevent forgetting. Experimental results show that we achieve significant improvements in three different scenarios across seven object detection benchmark datasets.
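The abstract names attentive feature distillation but does not spell out its formulation. As a rough, non-authoritative sketch of the general idea, the PyTorch snippet below weights a feature-level distillation loss with a bottom-up spatial attention and a top-down channel attention derived from the teacher (the frozen old detector). The function name and the exact attention definitions are assumptions made for illustration, not the paper's stated method.

```python
import torch
import torch.nn.functional as F

def attentive_feature_distillation(student_feat, teacher_feat):
    """Illustrative attention-weighted feature distillation.

    student_feat, teacher_feat: (N, C, H, W) feature maps from the
    current detector and the frozen old detector. The attention
    definitions below (spatial attention as the mean of squared
    activations, channel attention via global average pooling) are
    common choices assumed for this sketch; the paper's exact
    formulation may differ.
    """
    # Bottom-up spatial attention from the teacher: (N, 1, H, W).
    spatial = teacher_feat.pow(2).mean(dim=1, keepdim=True)
    spatial = F.softmax(spatial.flatten(2), dim=-1).view_as(spatial)

    # Top-down channel attention from the teacher: (N, C, 1, 1).
    channel = F.softmax(teacher_feat.mean(dim=(2, 3)), dim=1)[..., None, None]

    # L2 distillation loss, re-weighted so that the regions and channels
    # the teacher attends to dominate the penalty.
    diff = (student_feat - teacher_feat).pow(2)
    return (diff * spatial * channel).sum(dim=(1, 2, 3)).mean()
```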

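The adaptive exemplar sampling method is likewise only described at a high level ("diverse and informative"). A minimal hypothetical sketch follows, assuming diversity comes from clustering image embeddings and informativeness from per-image loss under the current model; both are assumptions for illustration, not the paper's stated criterion.

```python
import numpy as np
from sklearn.cluster import KMeans

def adaptive_exemplar_sampling(features, losses, budget, n_clusters=10):
    """Illustrative diversity-plus-informativeness exemplar selection.

    features: (M, D) image-level embeddings of the old-task data.
    losses:   (M,) per-image detection losses under the current model.
    budget:   number of exemplar images to keep for rehearsal.
    """
    # Cluster the embeddings so that exemplars cover diverse regions
    # of the old data distribution.
    n_clusters = min(n_clusters, budget)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)

    per_cluster = max(1, budget // n_clusters)
    selected = []
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        # Most informative first: sort cluster members by descending loss.
        members = members[np.argsort(-losses[members])]
        selected.extend(members[:per_cluster].tolist())
    return selected[:budget]
```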