Efficient approximation of high-dimensional functions with deep neural networks

12/09/2019
by Patrick Cheridito, et al.

In this paper, we develop an approximation theory for deep neural networks that is based on the concept of a catalog network. Catalog networks are generalizations of standard neural networks in which the nonlinear activation functions can vary from layer to layer as long as they are chosen from a predefined catalog of continuous functions. As such, catalog networks constitute a rich family of continuous functions. We show that under appropriate conditions on the catalog, catalog networks can efficiently be approximated with neural networks and provide precise estimates on the number of parameters needed for a given approximation accuracy. We apply the theory of catalog networks to demonstrate that neural networks can overcome the curse of dimensionality in different high-dimensional approximation problems.
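The abstract does not give code, but the idea of a catalog network can be sketched concretely. Below is a minimal, hypothetical NumPy illustration (not the authors' implementation): each hidden layer applies an affine map followed by an activation drawn from a fixed catalog of continuous functions, which may differ from layer to layer. The catalog contents, function names, and dimensions are illustrative assumptions.

```python
import numpy as np

# Illustrative catalog of continuous activation functions (assumed, not
# taken from the paper): each layer may pick a different one.
CATALOG = {
    "relu": lambda x: np.maximum(x, 0.0),
    "tanh": np.tanh,
    "softplus": lambda x: np.log1p(np.exp(x)),
}

def catalog_network(x, weights, biases, activations):
    """Evaluate a catalog network at input x.

    weights/biases define the affine maps; activations names one catalog
    function per hidden layer. The final layer is affine (no activation).
    """
    for W, b, name in zip(weights[:-1], biases[:-1], activations):
        x = CATALOG[name](W @ x + b)  # affine map, then catalog activation
    return weights[-1] @ x + biases[-1]

# Tiny example: two hidden layers on R^3 with mixed activations.
rng = np.random.default_rng(0)
dims = [3, 5, 5, 1]
Ws = [rng.standard_normal((dims[i + 1], dims[i])) for i in range(3)]
bs = [rng.standard_normal(dims[i + 1]) for i in range(3)]
y = catalog_network(rng.standard_normal(3), Ws, bs, ["relu", "tanh"])
print(y.shape)
```

A standard neural network is recovered when every layer uses the same catalog entry; the paper's approximation results concern replacing the varying catalog activations by a single standard activation at a quantified parameter cost.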
