From deep to Shallow: Equivalent Forms of Deep Networks in Reproducing Kernel Krein Space and Indefinite Support Vector Machines

07/15/2020
by Alistair Shilton, et al.

In this paper we explore a connection between deep networks and learning in reproducing kernel Krein space. Our approach is based on the concept of push-forward: converting a fixed non-linear transform applied to a linear projection into a linear projection applied to the output of a fixed non-linear transform, i.e. pushing the weights forward through the non-linearity. Applying this repeatedly from the input to the output of a deep network, the weights can be progressively "pushed" to the output layer, yielding a flat network consisting of a fixed non-linear map (whose form is determined by the structure of the deep network) followed by a linear projection determined by the weight matrices. In effect, we take a deep network and convert it to an equivalent (indefinite) support vector machine. We then investigate the implications of this transformation for capacity control and generalisation, and provide a bound on the generalisation error of the deep network in terms of generalisation error in reproducing kernel Krein space.
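The push-forward idea can be sketched for a single layer with a quadratic activation, where a closed form exists. This is an illustrative assumption, not the paper's general construction: for sigma(z) = z^2, each output (w_i . x)^2 equals a linear projection of the fixed feature map phi(x) = x (kron) x, with pushed-forward weights w_i (kron) w_i.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m = 3, 2
W = rng.standard_normal((m, d))   # layer weights
x = rng.standard_normal(d)        # input

# Original form: fixed non-linearity applied to a linear projection.
sigma = lambda z: z ** 2          # quadratic activation (assumed for a closed form)
y_deep = sigma(W @ x)

# Pushed-forward form: linear projection applied to a fixed non-linear map.
phi = np.kron(x, x)                                # feature map, independent of W
W_pushed = np.stack([np.kron(w, w) for w in W])    # weights pushed through sigma

y_flat = W_pushed @ phi
assert np.allclose(y_deep, y_flat)                 # the two forms agree
```

Repeating this layer by layer is what produces the flat network described in the abstract: all weights accumulate in a single final linear projection over a fixed feature map.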
