Simultaneous Neural Network Approximations in Sobolev Spaces
Motivated by recent developments in numerical solvers for partial differential equations based on deep neural networks, we establish approximation results for deep neural networks applied to smooth functions, with errors measured in Sobolev norms. The error bounds are characterized explicitly in terms of both the width and the depth of the networks simultaneously. Namely, for f ∈ C^s([0,1]^d), we show that deep ReLU networks of width 𝒪(N log N) and depth 𝒪(L log L) can achieve a non-asymptotic approximation rate of 𝒪(N^{−2(s−1)/d} L^{−2(s−1)/d}) with respect to the 𝒲^{1,p}([0,1]^d) norm for p ∈ [1, ∞). If either the ReLU function or its square is applied as an activation function to construct deep neural networks of width 𝒪(N log N) and depth 𝒪(L log L) approximating f ∈ C^s([0,1]^d), the non-asymptotic approximation rate is 𝒪(N^{−2(s−n)/d} L^{−2(s−n)/d}) with respect to the 𝒲^{n,p}([0,1]^d) norm for p ∈ [1, ∞), where n < s.
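For concreteness, a hedged restatement of the first bound in display form; the multiplicative constant C(s, d, p) and the existential phrasing are our assumption of how such non-asymptotic bounds are conventionally stated, not a quotation of the result itself:

% Sketch of the first result: for f in C^s([0,1]^d), there exists a
% ReLU network \phi of width O(N log N) and depth O(L log L) whose
% W^{1,p} approximation error obeys the stated rate. The constant
% C(s,d,p) is an assumption standard for non-asymptotic bounds.
\[
  \| f - \phi \|_{\mathcal{W}^{1,p}([0,1]^d)}
  \;\le\;
  C(s,d,p)\, N^{-2(s-1)/d}\, L^{-2(s-1)/d},
  \qquad p \in [1, \infty).
\]

Setting n = 1 in the second result recovers this rate, consistent with the ReLU-only construction measured in the 𝒲^{1,p} norm.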