Paper Title

Boosting Facial Expression Recognition by A Semi-Supervised Progressive Teacher

Paper Authors

Jing Jiang, Weihong Deng

Paper Abstract

In this paper, we aim to improve the performance of in-the-wild Facial Expression Recognition (FER) by exploiting semi-supervised learning. Large-scale labeled data and deep learning methods have greatly improved the performance of image recognition. However, the performance of FER is still not ideal due to the lack of training data and incorrect annotations (e.g., label noise). Among existing in-the-wild FER datasets, reliable ones contain insufficient data to train robust deep models, while large-scale ones carry lower-quality annotations. To address this problem, we propose a semi-supervised learning algorithm named Progressive Teacher (PT) to utilize reliable FER datasets as well as large-scale unlabeled expression images for effective training. On the one hand, PT introduces a semi-supervised learning method to relieve the shortage of data in FER. On the other hand, it selects useful labeled training samples automatically and progressively to alleviate label noise. PT uses the selected clean labeled data to compute the supervised classification loss and the unlabeled data to compute an unsupervised consistency loss. Experiments on the widely used databases RAF-DB and FERPlus validate the effectiveness of our method, which achieves state-of-the-art performance with an accuracy of 89.57% on RAF-DB. Additionally, even when the synthetic noise rate reaches 30%, the performance of our PT algorithm only degrades by 4.37%.
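The abstract describes two coupled mechanisms: selecting clean labeled samples for the supervised classification loss, and applying a teacher-student consistency loss to unlabeled images. Below is a minimal PyTorch-style sketch of one such training step, assuming small-loss sample selection and a Mean-Teacher-style EMA teacher; `keep_ratio`, `consistency_weight`, and all function names are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def pt_training_step(student, teacher, labeled_batch, unlabeled_batch,
                     keep_ratio, consistency_weight):
    """One training step in the spirit of Progressive Teacher (a sketch).

    Combines (1) supervised cross-entropy on the labeled samples that
    currently look clean (small-loss selection) with (2) a consistency
    loss between student and teacher predictions on unlabeled images.
    `keep_ratio` and `consistency_weight` are assumed hyperparameters.
    """
    x_l, y = labeled_batch                      # labeled faces, possibly noisy labels
    x_u_student, x_u_teacher = unlabeled_batch  # two augmented views of unlabeled faces

    # Supervised branch: keep only the small-loss (likely clean) samples.
    logits_l = student(x_l)
    per_sample_loss = F.cross_entropy(logits_l, y, reduction="none")
    n_keep = max(1, int(keep_ratio * x_l.size(0)))
    clean_idx = per_sample_loss.topk(n_keep, largest=False).indices
    sup_loss = per_sample_loss[clean_idx].mean()

    # Unsupervised branch: student/teacher consistency on unlabeled data.
    with torch.no_grad():
        teacher_probs = F.softmax(teacher(x_u_teacher), dim=1)
    student_probs = F.softmax(student(x_u_student), dim=1)
    cons_loss = F.mse_loss(student_probs, teacher_probs)

    return sup_loss + consistency_weight * cons_loss

@torch.no_grad()
def update_teacher(student, teacher, ema_decay=0.99):
    # The teacher tracks the student via an exponential moving average.
    for t_p, s_p in zip(teacher.parameters(), student.parameters()):
        t_p.mul_(ema_decay).add_(s_p, alpha=1.0 - ema_decay)
```

In a progressive scheme like the one the abstract describes, the fraction of retained labeled samples would be adjusted over training as loss estimates become more reliable; the fixed `keep_ratio` above stands in for that schedule.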
