Paper Title

PANDA: Adapting Pretrained Features for Anomaly Detection and Segmentation

Paper Authors

Tal Reiss, Niv Cohen, Liron Bergman, Yedid Hoshen

Paper Abstract

Anomaly detection methods require high-quality features. In recent years, the anomaly detection community has attempted to obtain better features using advances in deep self-supervised feature learning. Surprisingly, a very promising direction, using pretrained deep features, has been mostly overlooked. In this paper, we first empirically establish the perhaps expected, but unreported, result that combining pretrained features with simple anomaly detection and segmentation methods convincingly outperforms much more complex, state-of-the-art methods. To obtain further performance gains in anomaly detection, we adapt the pretrained features to the target distribution. Although transfer learning methods are well established for multi-class classification problems, the one-class classification (OCC) setting is not as well explored. It turns out that naive adaptation methods, which typically work well in supervised learning, often result in catastrophic collapse (feature deterioration) and reduce performance in OCC settings. A popular OCC method, DeepSVDD, advocates using specialized architectures, but this limits the adaptation performance gain. We propose two methods for combating collapse: i) a variant of early stopping that dynamically learns the stopping iteration; ii) elastic regularization inspired by continual learning. Our method, PANDA, outperforms the state-of-the-art in the OCC, outlier exposure, and anomaly segmentation settings by large margins.
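
The abstract's first claim, that simple anomaly detection on frozen pretrained features is already a strong baseline, amounts to kNN scoring in feature space. Below is a minimal sketch of that idea, assuming an ImageNet-pretrained torchvision ResNet backbone; the backbone choice, the feature normalization, the helper names (`extract_features`, `knn_score`), and k=2 are illustrative assumptions, not necessarily the paper's exact configuration.

```python
# Minimal sketch: kNN anomaly scoring on frozen ImageNet-pretrained features.
# The backbone choice, feature normalization, and k=2 are illustrative
# assumptions, not necessarily the paper's exact configuration.
import torch
import torch.nn.functional as F
import torchvision.models as models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Frozen pretrained backbone; replace the classifier head with identity
# so the forward pass returns penultimate-layer features.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval().to(device)

@torch.no_grad()
def extract_features(images):
    """images: (N, 3, H, W) tensor, already ImageNet-normalized."""
    return F.normalize(backbone(images.to(device)), dim=1)

def knn_score(train_feats, test_feats, k=2):
    """Anomaly score = mean Euclidean distance to the k nearest features
    of the (normal-only) training set; higher = more anomalous."""
    dists = torch.cdist(test_feats, train_feats)
    return dists.topk(k, dim=1, largest=False).values.mean(dim=1)
```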

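Of the two anti-collapse methods named in the abstract, the elastic regularization is the easier one to sketch: an EWC-style quadratic penalty that keeps adapted weights close to their pretrained values. The diagonal-Fisher dictionary and the `lambda_ewc` weighting below are illustrative assumptions, not the paper's exact recipe.

```python
# Sketch of elastic (EWC-style) regularization against catastrophic collapse
# during feature adaptation. The diagonal-Fisher dictionary and lambda_ewc
# are illustrative assumptions, not the paper's exact recipe.
import torch

def ewc_penalty(model, anchor, fisher):
    """Quadratic penalty pulling each parameter back toward its pretrained
    value, scaled per-parameter by an importance weight (diagonal Fisher)."""
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for name, p in model.named_parameters():
        penalty = penalty + (fisher[name] * (p - anchor[name]) ** 2).sum()
    return penalty

# Usage inside an adaptation loop (adapt_loss is the feature-adaptation
# objective, e.g. a DeepSVDD-style distance-to-center compactness loss):
#
#   anchor = {n: p.detach().clone() for n, p in model.named_parameters()}
#   fisher = {n: g.detach() ** 2 for n, g in grads_on_source_task.items()}
#   loss = adapt_loss + lambda_ewc * ewc_penalty(model, anchor, fisher)
```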