Paper Title

Modeling the Background for Incremental and Weakly-Supervised Semantic Segmentation

Paper Authors

Fabio Cermelli, Massimiliano Mancini, Samuel Rota Bulò, Elisa Ricci, Barbara Caputo

Abstract

Deep neural networks have enabled major progress in semantic segmentation. However, even the most advanced neural architectures suffer from important limitations. First, they are vulnerable to catastrophic forgetting, i.e., they perform poorly when required to incrementally update their model as new classes become available. Second, they rely on large amounts of pixel-level annotations to produce accurate segmentation maps. To tackle these issues, we introduce a novel incremental class learning approach for semantic segmentation that takes into account a peculiar aspect of this task: since each training step provides annotations only for a subset of all possible classes, pixels of the background class exhibit a semantic shift. Therefore, we revisit the traditional distillation paradigm by designing novel loss terms which explicitly account for the background shift. Additionally, we introduce a novel strategy to initialize the classifier's parameters at each step in order to prevent biased predictions toward the background class. Finally, we demonstrate that our approach can be extended to point- and scribble-based weakly supervised segmentation, modeling the partial annotations to create priors for unlabeled pixels. We demonstrate the effectiveness of our approach with an extensive evaluation on the Pascal-VOC, ADE20K, and Cityscapes datasets, significantly outperforming state-of-the-art methods.
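
To make the background-shift idea concrete, below is a minimal, PyTorch-style sketch of how the two loss terms described in the abstract could look. This is not the authors' released code: the function names (unbiased_cross_entropy, unbiased_distillation), the tensor shapes, and the convention that channel 0 is the background are assumptions made for illustration only. The core idea is that pixels labeled as background in the current step are scored as "background or any previously learned class", and the old model's background probability is distilled against the new model's background-plus-new-class mass.

```python
import torch
import torch.nn.functional as F

def unbiased_cross_entropy(logits, labels, n_old):
    """Sketch: cross-entropy that treats current-step 'background' pixels
    as 'background OR any class from previous steps'.

    logits: (B, C_tot, H, W) scores over old + new classes (channel 0 = background)
    labels: (B, H, W) current-step ground truth (0 = background, no ignore label assumed)
    n_old:  number of classes from previous steps, background included
    """
    log_probs = F.log_softmax(logits, dim=1)                       # (B, C_tot, H, W)
    # log p(background OR any old class) = log of the summed probabilities
    log_bkg = torch.logsumexp(log_probs[:, :n_old], dim=1)         # (B, H, W)
    # per-pixel log-probability of the annotated class
    log_tgt = log_probs.gather(1, labels.unsqueeze(1)).squeeze(1)  # (B, H, W)
    # background pixels use the aggregated log-probability instead
    log_tgt = torch.where(labels == 0, log_bkg, log_tgt)
    return -log_tgt.mean()

def unbiased_distillation(new_logits, old_logits, n_old):
    """Sketch: distillation where the old model's background mass is matched
    against the new model's background + new-class mass, so pixels that now
    belong to new classes are not penalized."""
    old_log_probs = F.log_softmax(old_logits, dim=1)               # (B, n_old, H, W)
    new_log_probs = F.log_softmax(new_logits, dim=1)               # (B, C_tot, H, W)
    # the student's background channel absorbs all new classes
    bkg_and_new = torch.cat([new_log_probs[:, :1], new_log_probs[:, n_old:]], dim=1)
    log_bkg = torch.logsumexp(bkg_and_new, dim=1, keepdim=True)    # (B, 1, H, W)
    student = torch.cat([log_bkg, new_log_probs[:, 1:n_old]], dim=1)
    # cross-entropy between the old model's soft targets and the remapped student
    return -(old_log_probs.exp() * student).sum(dim=1).mean()
```

In the full method these terms are combined with the supervised loss via a weighting hyper-parameter, and, as the abstract notes, the classifiers of the new classes are initialized at each step so that predictions are not biased toward the background; that initialization step is not shown in the sketch above.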
