Paper Title

Concentration inequalities and optimal number of layers for stochastic deep neural networks

Authors

Michele Caprio, Sayan Mukherjee

Abstract

We state concentration inequalities for the output of the hidden layers of a stochastic deep neural network (SDNN), as well as for the output of the whole SDNN. These results allow us to introduce an expected classifier (EC), and to give a probabilistic upper bound for the classification error of the EC. We also state the optimal number of layers for the SDNN via an optimal stopping procedure. We apply our analysis to a stochastic version of a feedforward neural network with ReLU activation function.
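To make the object of study concrete, below is a minimal sketch of a stochastic feedforward ReLU network of the kind the abstract refers to. The noise model (i.i.d. additive Gaussian perturbations at each hidden layer) and all function and parameter names are illustrative assumptions, not the paper's construction; the layer outputs become random variables, which is what concentration inequalities of this type would bound.

```python
# Illustrative sketch only: the additive Gaussian noise model is an assumption,
# not necessarily the stochastic mechanism used in the paper.
import numpy as np

def stochastic_relu_forward(x, weights, biases, noise_std=0.1, rng=None):
    """Propagate input x through a stack of stochastic ReLU layers.

    weights, biases: lists of per-layer parameters (hypothetical names).
    noise_std: standard deviation of the assumed additive Gaussian noise.
    Returns the outputs of every hidden layer and the final output.
    """
    rng = rng or np.random.default_rng()
    hidden_outputs = []
    h = x
    for W, b in zip(weights, biases):
        pre_act = W @ h + b
        # Injected noise makes each layer output random; concentration
        # inequalities would control its deviation from the expected output.
        h = np.maximum(pre_act + rng.normal(0.0, noise_std, pre_act.shape), 0.0)
        hidden_outputs.append(h)
    return hidden_outputs, h

# Example usage: a 3-layer network on a 4-dimensional input.
rng = np.random.default_rng(0)
dims = [4, 8, 8, 2]
weights = [rng.standard_normal((dims[i + 1], dims[i])) for i in range(3)]
biases = [np.zeros(dims[i + 1]) for i in range(3)]
hidden, out = stochastic_relu_forward(rng.standard_normal(4), weights, biases, rng=rng)
```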
