Paper Title
Abstraction based Output Range Analysis for Neural Networks
Paper Authors
Abstract
In this paper, we consider the problem of output range analysis for feed-forward neural networks with ReLU activation functions. Existing approaches reduce the output range analysis problem to satisfiability and optimization solving, which are NP-hard problems whose computational complexity increases with the number of neurons in the network. To tackle this computational complexity, we present a novel abstraction technique that constructs a simpler neural network with fewer neurons, albeit with interval weights, called an interval neural network (INN), which over-approximates the output range of the given neural network. We reduce output range analysis on INNs to solving a mixed integer linear programming problem. Our experimental results highlight the trade-off between computation time and the precision of the computed output range.
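The paper's central object, a network with interval weights, can be illustrated with a small interval-arithmetic sketch. The helper names below (`interval_affine`, `relu_interval`) are illustrative, not from the paper, and this sketch propagates an input box through one interval-weighted layer; it shows only why interval weights yield an over-approximating output box, not the paper's MILP-based analysis:

```python
import numpy as np

def interval_affine(W_lo, W_hi, b, x_lo, x_hi):
    """Propagate the input box [x_lo, x_hi] through a layer whose
    weight matrix lies in the interval [W_lo, W_hi] (shape (m, n)).

    Each product W[i, j] * x[j] ranges over the min/max of the four
    endpoint products, and interval sums add endpoint-wise, so the
    returned box contains every concrete layer output.
    """
    cands = np.stack([W_lo * x_lo, W_lo * x_hi,
                      W_hi * x_lo, W_hi * x_hi])  # (4, m, n) endpoint products
    y_lo = cands.min(axis=0).sum(axis=1) + b
    y_hi = cands.max(axis=0).sum(axis=1) + b
    return y_lo, y_hi

def relu_interval(y_lo, y_hi):
    """ReLU is monotone, so it applies to the interval endpoint-wise."""
    return np.maximum(y_lo, 0.0), np.maximum(y_hi, 0.0)

# A concrete network (W_lo == W_hi) over the input box [0, 1]^2:
W = np.array([[1.0, -1.0], [2.0, 1.0]])
y_lo, y_hi = interval_affine(W, W, np.zeros(2), np.zeros(2), np.ones(2))
r_lo, r_hi = relu_interval(y_lo, y_hi)
print(r_lo, r_hi)  # output box after ReLU: [0, 0] to [1, 3]
```

Widening the weight intervals (as the abstraction does when it merges neurons of the original network into fewer interval-weighted neurons) can only enlarge the resulting box, which is the source of the time/precision trade-off the abstract mentions.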