Paper Title
Learn From All: Erasing Attention Consistency for Noisy Label Facial Expression Recognition
Paper Authors
Paper Abstract
Noisy label Facial Expression Recognition (FER) is more challenging than traditional noisy label classification tasks due to inter-class similarity and annotation ambiguity. Recent works mainly tackle this problem by filtering out large-loss samples. In this paper, we instead address noisy labels from a feature-learning perspective. We find that FER models memorize noisy samples by focusing on a part of the features that can be considered related to the noisy labels, rather than learning from the whole set of features that lead to the latent truth. Inspired by this, we propose a novel Erasing Attention Consistency (EAC) method to automatically suppress noisy samples during training. Specifically, we first utilize the flip semantic consistency of facial images to design an imbalanced framework. We then randomly erase input images and use flip attention consistency to prevent the model from focusing on only part of the features. EAC significantly outperforms state-of-the-art noisy label FER methods and generalizes well to other tasks with a large number of classes, such as CIFAR100 and Tiny-ImageNet. The code is available at https://github.com/zyh-uaiaaaa/Erasing-Attention-Consistency.
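To make the two ingredients named in the abstract concrete, here is a minimal NumPy sketch of (a) random erasing of an input image and (b) a flip attention consistency penalty: the flipped attention map of the original image should match the attention map of the horizontally flipped image. The `toy_attention` function, the erasing parameters, and the loss form are illustrative assumptions for a single-channel toy image, not the paper's actual model or loss.

```python
import numpy as np

def random_erase(img, rng, scale=0.4):
    """Zero out a random rectangular patch of a 2D image.
    `scale` bounds the patch side length as a fraction of the image
    side; both size and position are hypothetical choices here."""
    out = img.copy()
    h, w = img.shape[:2]
    eh = int(rng.integers(1, max(2, int(h * scale))))
    ew = int(rng.integers(1, max(2, int(w * scale))))
    y = int(rng.integers(0, h - eh + 1))
    x = int(rng.integers(0, w - ew + 1))
    out[y:y + eh, x:x + ew] = 0.0
    return out

def toy_attention(img, weights):
    """Stand-in for a CAM-style attention map: a per-pixel weighted
    response, normalized. A real model derives this from features."""
    resp = img * weights
    return resp / (np.abs(resp).sum() + 1e-8)

def flip_consistency_loss(img, weights):
    """Mean-squared difference between the horizontally flipped
    attention of the original image and the attention of the
    horizontally flipped image."""
    att = toy_attention(img, weights)
    att_flip = toy_attention(img[:, ::-1], weights)
    return float(np.mean((att[:, ::-1] - att_flip) ** 2))
```

With left-right symmetric weights the loss is zero, while asymmetric weights (a "model" attending to only one side, i.e., a part of the features) incur a positive penalty, which is the behavior the consistency term is meant to discourage.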