Error bounds for approximations with deep ReLU neural networks in W^s,p norms

02/21/2019
by Ingo Gühring et al.

We analyze approximation rates of deep ReLU neural networks for Sobolev-regular functions with respect to weaker Sobolev norms. First, building on a calculus of ReLU networks, we construct neural networks with ReLU activation functions that achieve certain approximation rates. Second, we establish lower bounds for the approximation of classes of Sobolev-regular functions by ReLU neural networks. Our results extend recent advances in the approximation theory of ReLU networks to the regime that is most relevant for applications in the numerical analysis of partial differential equations.
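To make the setting concrete, the following is a minimal numerical sketch (not taken from the paper) of the kind of approximation the abstract describes in one dimension: a one-hidden-layer ReLU network that realizes the piecewise linear interpolant of a smooth target function, together with empirical estimates of the L^2 error and the first-order (H^1-type) seminorm error. The helper `pwl_relu_net` and all parameter choices are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def pwl_relu_net(x, knots, values):
    # One-hidden-layer ReLU network realizing the piecewise linear
    # interpolant through (knots, values):
    #   f(x) = values[0] + sum_k a_k * relu(x - t_k),
    # where the coefficients a_k encode the slope changes between knots.
    slopes = np.diff(values) / np.diff(knots)
    coeffs = np.concatenate(([slopes[0]], np.diff(slopes)))
    return values[0] + sum(a * relu(x - t) for a, t in zip(coeffs, knots[:-1]))

# Illustrative target: f(x) = x^2 on [0, 1], interpolated at N + 1 knots.
N = 16
knots = np.linspace(0.0, 1.0, N + 1)
f = lambda x: x ** 2
net = lambda x: pwl_relu_net(x, knots, f(knots))

# Estimate the L^2 error and an H^1-type seminorm error on a fine grid;
# the derivative gap is computed by finite differences.
xs = np.linspace(0.0, 1.0, 100001)
h = xs[1] - xs[0]
err = f(xs) - net(xs)
l2_err = np.sqrt(np.sum(err ** 2) * h)
d_err = np.diff(err) / h
h1_semi = np.sqrt(np.sum(d_err ** 2) * h)

print(f"L2 error        : {l2_err:.2e}")
print(f"H1 seminorm err : {h1_semi:.2e}")
```

Running this with increasing `N` shows the two errors decaying at different rates (roughly `N^-2` in L^2 versus `N^-1` in the first-order seminorm), which is the phenomenon behind measuring approximation in weaker versus stronger Sobolev norms.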
