Simultaneous Neural Network Approximations in Sobolev Spaces

09/01/2021 Ā· by Sean Hon, et al.
We establish in this work approximation results of deep neural networks for smooth functions, with errors measured in Sobolev norms, motivated by recent developments in numerical solvers for partial differential equations based on deep neural networks. The error bounds are characterized explicitly in terms of both the width and the depth of the networks simultaneously. Namely, for f ∈ C^s([0,1]^d), we show that deep ReLU networks of width š’Ŗ(N log N) and depth š’Ŗ(L log L) can achieve a non-asymptotic approximation rate of š’Ŗ(N^{-2(s-1)/d} L^{-2(s-1)/d}) with respect to the š’²^{1,p}([0,1]^d) norm for p ∈ [1,∞). If either the ReLU function or its square is used as the activation function to construct deep neural networks of width š’Ŗ(N log N) and depth š’Ŗ(L log L) to approximate f ∈ C^s([0,1]^d), the non-asymptotic approximation rate is š’Ŗ(N^{-2(s-n)/d} L^{-2(s-n)/d}) with respect to the š’²^{n,p}([0,1]^d) norm for p ∈ [1,∞).
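To make the setting concrete, the following toy sketch (not the paper's construction) uses the classical fact that any continuous piecewise-linear function on [0,1] is exactly representable by a one-hidden-layer ReLU network, and numerically estimates the error of such a network in a W^{1,∞}-type norm (sup norm of the function error plus sup norm of the derivative error). For a C^2 target in one dimension, this interpolant gives the classical first-order rate in W^{1,∞}; all function names here are illustrative.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def relu_interpolant(f, N):
    """One-hidden-layer ReLU net realizing the piecewise-linear interpolant
    of f at N+1 uniform knots on [0,1] (toy sketch, not the paper's network).
    Returns the network and its (piecewise-constant) derivative."""
    knots = np.linspace(0.0, 1.0, N + 1)
    vals = f(knots)
    slopes = np.diff(vals) * N                         # uniform spacing h = 1/N
    # net(x) = f(0) + sum_k c_k * relu(x - t_k), with c_k chosen so that the
    # slope on the k-th interval telescopes to slopes[k]
    coeffs = np.concatenate(([slopes[0]], np.diff(slopes)))
    breaks = knots[:-1]                                # hinge locations t_k
    def net(x):
        x = np.asarray(x)[..., None]
        return vals[0] + (coeffs * relu(x - breaks)).sum(axis=-1)
    def net_prime(x):
        x = np.asarray(x)[..., None]
        return (coeffs * (x > breaks)).sum(axis=-1)
    return net, net_prime

def w1inf_error(f, fp, net, netp, M=2000):
    """Crude W^{1,infty}-type error: max of sup|f - net| and sup|f' - net'|,
    sampled away from the knots so the derivative is well defined."""
    x = (np.arange(M) + 0.5) / M
    return max(np.abs(f(x) - net(x)).max(),
               np.abs(fp(x) - netp(x)).max())

# Example target: a smooth (C^infty) function on [0,1]
f  = lambda x: np.sin(2 * np.pi * x)
fp = lambda x: 2 * np.pi * np.cos(2 * np.pi * x)

for N in (8, 16, 32, 64):
    err = w1inf_error(f, fp, *relu_interpolant(f, N))
    print(f"N = {N:3d}   W^(1,inf) error ~ {err:.4f}")
```

Doubling the number of hinges roughly halves the measured error, reflecting the first-order convergence of piecewise-linear interpolation in the W^{1,∞} norm; the paper's networks achieve faster rates by exploiting depth as well as width.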
