Paper Title

Attention-Aware Noisy Label Learning for Image Classification

Paper Authors

Zhenzhen Wang, Chunyan Xu, Yap-Peng Tan, Junsong Yuan

Paper Abstract

Deep convolutional neural networks (CNNs) learned on large-scale labeled samples have achieved remarkable progress in computer vision, such as image/video classification. The cheapest way to obtain a large body of labeled visual data is to crawl from websites with user-supplied labels, such as Flickr. However, these samples often tend to contain incorrect labels (i.e. noisy labels), which will significantly degrade the network performance. In this paper, the attention-aware noisy label learning approach ($A^2NL$) is proposed to improve the discriminative capability of the network trained on datasets with potential label noise. Specifically, a Noise-Attention model, which contains multiple noise-specific units, is designed to better capture noisy information. Each unit is expected to learn a specific noisy distribution for a subset of images so that different disturbances are more precisely modeled. Furthermore, a recursive learning process is introduced to strengthen the learning ability of the attention network by taking advantage of the learned high-level knowledge. To fully evaluate the proposed method, we conduct experiments from two aspects: manually flipped label noise on large-scale image classification datasets, including CIFAR-10, SVHN; and real-world label noise on an online crawled clothing dataset with multiple attributes. The superior results over state-of-the-art methods validate the effectiveness of our proposed approach.
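The abstract describes the Noise-Attention model only at a high level. Below is a minimal, hypothetical sketch of one way such a component could look, assuming a PyTorch-style implementation in which each noise-specific unit is a learnable label-transition matrix and an attention head mixes the units per image. The class name `NoiseAttentionHead` and parameters such as `num_units` are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoiseAttentionHead(nn.Module):
    """Hypothetical sketch of a noise-attention component (not the authors' exact design):
    K noise-specific units, each a learnable C x C label-transition matrix,
    mixed per image by attention weights predicted from the backbone feature."""

    def __init__(self, feat_dim: int, num_classes: int, num_units: int = 4):
        super().__init__()
        # One unnormalized transition matrix per noise-specific unit, shape (K, C, C).
        self.transitions = nn.Parameter(torch.zeros(num_units, num_classes, num_classes))
        # Attention over the K units, computed from the image feature.
        self.attn = nn.Linear(feat_dim, num_units)

    def forward(self, feat: torch.Tensor, clean_logits: torch.Tensor) -> torch.Tensor:
        # Posterior over clean labels from the base classifier, shape (B, C).
        p_clean = F.softmax(clean_logits, dim=-1)
        # Row-stochastic transition matrices, shape (K, C, C).
        T = F.softmax(self.transitions, dim=-1)
        # Per-image attention weights over the K units, shape (B, K).
        a = F.softmax(self.attn(feat), dim=-1)
        # Each unit maps clean predictions to its own noisy-label distribution: (B, K, C).
        p_noisy_per_unit = torch.einsum('bc,kcd->bkd', p_clean, T)
        # Attention-weighted mixture over units gives the final noisy prediction: (B, C).
        p_noisy = torch.einsum('bk,bkd->bd', a, p_noisy_per_unit)
        return p_noisy  # trained with a negative log-likelihood loss against the observed (possibly noisy) labels
```

In this reading, the attention weights let different subsets of images rely on different transition matrices, which matches the abstract's claim that each unit learns a specific noise distribution for a subset of images; the recursive learning procedure mentioned in the abstract is not modeled here.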
