Paper Title

Deep ReLU neural networks in high-dimensional approximation

Authors

Dinh Dũng, Van Kien Nguyen

Abstract

We study the computational complexity of deep ReLU (Rectified Linear Unit) neural networks for the approximation of functions from the Hölder-Zygmund space of mixed smoothness defined on the $d$-dimensional unit cube, where the dimension $d$ may be very large. The approximation error is measured in the norm of the isotropic Sobolev space. For every function $f$ from the Hölder-Zygmund space of mixed smoothness, we explicitly construct a deep ReLU neural network whose output approximates $f$ with a prescribed accuracy $\varepsilon$, and we prove tight dimension-dependent upper and lower bounds on the computational complexity of this approximation, characterized by the size and the depth of the network, explicitly in $d$ and $\varepsilon$. The proofs of these results rely, in particular, on approximation by sparse-grid sampling recovery based on the Faber series.
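The following is a minimal illustrative sketch, not code from the paper: the univariate hat function, the building block of the Faber series underlying the sparse-grid sampling recovery mentioned in the abstract, is exactly representable as a one-hidden-layer ReLU network with three units. Constructions like the one in the paper compose and sum dilated, shifted copies of such pieces; the function names and the normalization (peak value 1 at $x = 1/2$) are illustrative choices here, not the paper's notation.

```python
# Illustrative sketch: the hat function on [0, 1] written exactly as a
# three-unit ReLU network. This is a standard identity, shown here only
# to make the ReLU/Faber connection in the abstract concrete.

def relu(x: float) -> float:
    """Rectified Linear Unit: max(0, x)."""
    return max(0.0, x)

def hat(x: float) -> float:
    """Hat function: equals 2x on [0, 1/2], 2(1 - x) on [1/2, 1], 0 elsewhere.
    Exactly a linear combination of three ReLU units:
        hat(x) = 2*relu(x) - 4*relu(x - 1/2) + 2*relu(x - 1)."""
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

if __name__ == "__main__":
    # Spot-check the piecewise-linear values: 0.0, 0.5, 1.0, 0.5, 0.0.
    for x in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(f"hat({x}) = {hat(x)}")
```

In this picture, the "size" and "depth" that the paper bounds correspond to the total number of such ReLU units and the number of compositional layers needed to reach accuracy $\varepsilon$ in dimension $d$.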
