Paper Title

ReFace: Real-time Adversarial Attacks on Face Recognition Systems

Authors

Hussain, Shehzeen, Huster, Todd, Mesterharm, Chris, Neekhara, Paarth, An, Kevin, Jere, Malhar, Sikka, Harshvardhan, Koushanfar, Farinaz

Abstract

Deep neural network based face recognition models have been shown to be vulnerable to adversarial examples. However, many of the past attacks require the adversary to solve an input-dependent optimization problem using gradient descent which makes the attack impractical in real-time. These adversarial examples are also tightly coupled to the attacked model and are not as successful in transferring to different models. In this work, we propose ReFace, a real-time, highly-transferable attack on face recognition models based on Adversarial Transformation Networks (ATNs). ATNs model adversarial example generation as a feed-forward neural network. We find that the white-box attack success rate of a pure U-Net ATN falls substantially short of gradient-based attacks like PGD on large face recognition datasets. We therefore propose a new architecture for ATNs that closes this gap while maintaining a 10000x speedup over PGD. Furthermore, we find that at a given perturbation magnitude, our ATN adversarial perturbations are more effective in transferring to new face recognition models than PGD. ReFace attacks can successfully deceive commercial face recognition services in a transfer attack setting and reduce face identification accuracy from 82% to 16.4% for AWS SearchFaces API and Azure face verification accuracy from 91% to 50.1%.
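The core contrast in the abstract, an input-dependent PGD optimization loop versus a single feed-forward pass through an Adversarial Transformation Network, can be sketched in a toy NumPy example. This is a minimal illustration only: the generator weights `W` and the constant toy gradient are hypothetical stand-ins, not the paper's trained ATN or a real face recognition model.

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 8 / 255  # L-inf perturbation budget shared by both attacks

# --- PGD: iterative, per-input optimization (slow at attack time) ---
def pgd_attack(x, grad_fn, steps=10, alpha=2 / 255):
    """Projected gradient ascent inside an L-inf ball of radius eps."""
    x_adv = x.copy()
    for _ in range(steps):
        x_adv = x_adv + alpha * np.sign(grad_fn(x_adv))  # signed gradient step
        x_adv = x + np.clip(x_adv - x, -eps, eps)        # project onto the ball
        x_adv = np.clip(x_adv, 0.0, 1.0)                 # keep valid pixel range
    return x_adv

# --- ATN-style generator: one forward pass (fast at attack time) ---
W = rng.normal(scale=0.1, size=(16, 16))  # hypothetical trained weights

def atn_attack(x):
    """A single feed-forward pass emits an eps-bounded perturbation."""
    delta = eps * np.tanh(x @ W)          # tanh bounds |delta| by eps
    return np.clip(x + delta, 0.0, 1.0)

# Toy demo on flattened 16-dim "images"; the gradient of the attacked
# model is faked as a constant for illustration.
x = rng.uniform(size=(4, 16))
grad_fn = lambda x_adv: np.ones_like(x_adv)

x_pgd = pgd_attack(x, grad_fn)
x_atn = atn_attack(x)
```

The speed difference the abstract reports (a roughly 10000x speedup) comes from exactly this structural difference: PGD must query the model's gradient several times per input, while the ATN amortizes that cost into training and then produces perturbations with one network evaluation.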
