Regularisation of Neural Networks by Enforcing Lipschitz Continuity

04/12/2018
by Henry Gouk, et al.

We investigate the effect of explicitly enforcing the Lipschitz continuity of neural networks. Our main hypothesis is that constraining the Lipschitz constant of a network will have a regularising effect. To this end, we provide a simple technique for computing the Lipschitz constant of a feed-forward neural network composed of commonly used layer types. This technique is then utilised to formulate training a Lipschitz continuous neural network as a constrained optimisation problem, which can be easily solved using projected stochastic gradient methods. Our evaluation study shows that, in isolation, our method performs comparably to state-of-the-art regularisation techniques. Moreover, when combined with existing approaches to regularising neural networks, the performance gains are cumulative.
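As a rough illustration of the constrained-training idea described in the abstract, the sketch below performs a plain SGD update on each weight matrix and then rescales it whenever its spectral norm (its Lipschitz constant with respect to the l2 norm, ignoring biases and activations) exceeds a chosen bound. The function names (`spectral_norm`, `project_weights`, `train_step`), the power-iteration estimate, and the bound `lam` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def spectral_norm(W, n_iters=20):
    """Estimate the largest singular value of W via power iteration.
    (Assumed estimation method; the paper may compute the norm differently.)"""
    v = np.random.randn(W.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(n_iters):
        u = W @ v
        u /= np.linalg.norm(u)
        v = W.T @ u
        v /= np.linalg.norm(v)
    return float(u @ W @ v)

def project_weights(W, max_norm):
    """Rescale W so its spectral norm is at most max_norm."""
    sigma = spectral_norm(W)
    if sigma > max_norm:
        W = W * (max_norm / sigma)
    return W

def train_step(layers, grads, lr=0.1, lam=1.0):
    """One projected SGD step: gradient update, then per-layer projection
    back onto the set of matrices with operator norm <= lam."""
    for i, (W, g) in enumerate(zip(layers, grads)):
        W = W - lr * g                    # plain SGD update
        layers[i] = project_weights(W, lam)  # projection step
    return layers
```

Because each layer's operator norm is kept below `lam`, the Lipschitz constant of the whole network (as a composition of layers) is bounded by the product of the per-layer bounds, which is the quantity the constrained optimisation problem controls.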
