Landscape analysis for shallow ReLU neural networks: complete classification of critical points for affine target functions

03/19/2021
by Patrick Cheridito, et al.

In this paper, we analyze the landscape of the true loss of a ReLU neural network with one hidden layer. We provide a complete classification of the critical points in the case where the target function is affine. In particular, we prove that local minima and saddle points have to be of a special form and show that there are no local maxima. Our approach is of a combinatorial nature and builds on a careful analysis of the different types of hidden neurons that can occur in a ReLU neural network.
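The setting can be made concrete with a small numerical sketch. The snippet below (an illustration, not the paper's method) implements a one-hidden-layer ReLU network and a Monte Carlo estimate of the true (population) squared loss against an affine target; the choice of scalar inputs and a uniform input distribution on [-1, 1] are assumptions made here for simplicity.

```python
import numpy as np

def shallow_relu(x, w, b, v, c):
    # One-hidden-layer ReLU network: f(x) = sum_j v_j * relu(w_j * x + b_j) + c
    return np.maximum(np.outer(x, w) + b, 0.0) @ v + c

def true_loss_mc(w, b, v, c, a, t0, n=100_000, rng=None):
    # Monte Carlo estimate of the population loss E[(f(X) - (a*X + t0))^2]
    # with X ~ Uniform[-1, 1] (an assumed input distribution).
    rng = np.random.default_rng(0) if rng is None else rng
    x = rng.uniform(-1.0, 1.0, size=n)
    return float(np.mean((shallow_relu(x, w, b, v, c) - (a * x + t0)) ** 2))

# A network that represents the affine target t(x) = 2x + 1 exactly,
# using relu(x) - relu(-x) = x: here f(x) = 2*relu(x) - 2*relu(-x) + 1.
w = np.array([1.0, -1.0])
b = np.zeros(2)
v = np.array([2.0, -2.0])
c = 1.0
loss = true_loss_mc(w, b, v, c, a=2.0, t0=1.0)  # global minimum, loss 0
```

Since relu(x) - relu(-x) = x holds exactly, this parameter choice is a global minimizer with zero loss; the paper's classification concerns all critical points of this landscape, including non-global ones.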
