Simultaneous Neural Network Approximations in Sobolev Spaces
We establish in this work approximation results of deep neural networks for smooth functions measured in Sobolev norms, motivated by recent developments in numerical solvers for partial differential equations based on deep neural networks. The error bounds are characterized explicitly in terms of both the width and the depth of the networks simultaneously. Namely, for f ∈ C^s([0,1]^d), we show that deep ReLU networks of width 𝒪(N log N) and of depth 𝒪(L log L) can achieve a non-asymptotic approximation rate of 𝒪(N^{-2(s-1)/d} L^{-2(s-1)/d}) with respect to the 𝒲^{1,p}([0,1]^d) norm for p ∈ [1, ∞). If either the ReLU function or its square is applied as the activation function to construct deep neural networks of width 𝒪(N log N) and of depth 𝒪(L log L) to approximate f ∈ C^s([0,1]^d), the non-asymptotic approximation rate is 𝒪(N^{-2(s-n)/d} L^{-2(s-n)/d}) with respect to the 𝒲^{n,p}([0,1]^d) norm for p ∈ [1, ∞).
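To make the scaling behavior of these bounds concrete, the following sketch (not from the paper; the function name and default arguments are illustrative) evaluates the rate N^{-2(s-n)/d} L^{-2(s-n)/d} up to its unspecified constant factor, showing how the bound shrinks as the width parameter N and depth parameter L grow:

```python
def sobolev_rate(N: int, L: int, s: int, d: int, n: int = 1) -> float:
    """Approximation-rate factor N^{-2(s-n)/d} * L^{-2(s-n)/d} for
    f in C^s([0,1]^d), measured in the W^{n,p} norm (constant omitted)."""
    if not 0 <= n < s:
        raise ValueError("rate is meaningful only for 0 <= n < s")
    exponent = -2 * (s - n) / d
    # N^exponent * L^exponent == (N * L)^exponent
    return (N * L) ** exponent

# Example: s = 3, d = 2, n = 1 gives exponent -2, so the bound
# decays quadratically in the product N * L.
print(sobolev_rate(10, 10, s=3, d=2, n=1))  # (10*10)^{-2} = 1e-4
print(sobolev_rate(20, 20, s=3, d=2, n=1))  # (20*20)^{-2} = 6.25e-6
```

Note that smoothness s helps while dimension d hurts: the same width and depth buy a faster rate for smoother targets, and the exponent degrades as 1/d, the usual curse of dimensionality.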