Paper Title

Verification of Deep Convolutional Neural Networks Using ImageStars

Paper Authors

Hoang-Dung Tran, Stanley Bak, Weiming Xiang, Taylor T. Johnson

Paper Abstract

Convolutional Neural Networks (CNN) have redefined the state-of-the-art in many real-world applications, such as facial recognition, image classification, human pose estimation, and semantic segmentation. Despite their success, CNNs are vulnerable to adversarial attacks, where slight changes to their inputs may lead to sharp changes in their output in even well-trained networks. Set-based analysis methods can detect or prove the absence of bounded adversarial attacks, which can then be used to evaluate the effectiveness of neural network training methodology. Unfortunately, existing verification approaches have limited scalability in terms of the size of networks that can be analyzed. In this paper, we describe a set-based framework that successfully deals with real-world CNNs, such as VGG16 and VGG19, that have high accuracy on ImageNet. Our approach is based on a new set representation called the ImageStar, which enables efficient exact and over-approximative analysis of CNNs. ImageStars perform efficient set-based analysis by combining operations on concrete images with linear programming (LP). Our approach is implemented in a tool called NNV, and can verify the robustness of VGG networks with respect to a small set of input states, derived from adversarial attacks, such as the DeepFool attack. The experimental results show that our approach is less conservative and faster than existing zonotope methods, such as those used in DeepZ, and the polytope method used in DeepPoly.
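The abstract's description of the ImageStar representation (an anchor image plus a linear combination of generator images, constrained by a linear predicate and queried with linear programming) can be illustrated with a short sketch. The Python code below is a minimal illustration of that idea under stated assumptions, not NNV's actual API: the `ImageStar` class, the `pixel_range` method, and the use of `scipy.optimize.linprog` are choices made for this example.

```python
# Minimal sketch of an ImageStar-style set: anchor image + generator images,
# with linear constraints C @ alpha <= d on the combination coefficients alpha.
# Illustrative only; names and interface are assumptions, not NNV's API.
import numpy as np
from scipy.optimize import linprog

class ImageStar:
    def __init__(self, anchor, generators, C, d):
        self.anchor = anchor          # (H, W) anchor image
        self.generators = generators  # list of m (H, W) generator images
        self.C = C                    # (k, m) predicate matrix
        self.d = d                    # (k,) predicate bound

    def pixel_range(self, i, j):
        # Min/max value of pixel (i, j) over the set, via two small LPs.
        coeffs = np.array([g[i, j] for g in self.generators])
        lo = linprog(coeffs, A_ub=self.C, b_ub=self.d, bounds=(None, None))
        hi = linprog(-coeffs, A_ub=self.C, b_ub=self.d, bounds=(None, None))
        return self.anchor[i, j] + lo.fun, self.anchor[i, j] - hi.fun

# Example: a 2x2 image with one generator encoding a +/- 0.1 brightness shift.
anchor = np.array([[0.5, 0.6], [0.7, 0.8]])
gen = np.ones((2, 2))
C = np.array([[1.0], [-1.0]])   # alpha <= 0.1 and -alpha <= 0.1
d = np.array([0.1, 0.1])
s = ImageStar(anchor, [gen], C, d)
print(s.pixel_range(0, 0))      # approximately (0.4, 0.6)
```

In the same spirit, propagating such a set through affine layers only transforms the anchor and generators, and LP queries of this kind are what bound the reachable outputs; the paper's exact and over-approximate analyses are, of course, considerably more involved.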
