Conservative set valued fields, automatic differentiation, stochastic gradient method and deep learning

09/23/2019
by Jérôme Bolte et al.

The Clarke subdifferential is not suited to tackle nonsmooth deep learning issues: backpropagation, mini-batches and steady states are not properly modelled. As a remedy, we introduce set valued conservative fields as surrogates to standard subdifferential mappings. We study their properties and provide elements of a calculus. Functions having a conservative field are called path differentiable. Convex/concave, semi-algebraic, or Clarke regular Lipschitz continuous functions are path differentiable, as their corresponding subdifferentials are conservative. Another concrete and substantial class of examples of conservative fields, which are not subdifferential mappings, is given by the automatic differentiation oracle: for instance, the "subgradients" provided by the backpropagation algorithm in deep learning. Our differential model is eventually used to ensure subsequential convergence of nonsmooth stochastic gradient methods in the tame Lipschitz continuous setting, offering the possibility of using mini-batches, the actual backpropagation oracle, and o(1/log k) stepsizes.
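Why backpropagation escapes the Clarke subdifferential can be seen on a one-line example: relu(x) - relu(-x) is the identity function, whose derivative is 1 everywhere, yet propagating the common convention relu'(0) = 0 through the chain rule yields 0 at the origin. That output is not a Clarke subgradient (the Clarke subdifferential of the identity at 0 is {1}), but it is a value of a conservative field. A minimal sketch in JAX, whose jax.nn.relu uses the relu'(0) = 0 convention (the function name f and the probe points are illustrative, not from the paper):

```python
import jax

# f(x) = relu(x) - relu(-x) equals the identity, so the true derivative is 1 everywhere.
def f(x):
    return jax.nn.relu(x) - jax.nn.relu(-x)

print(jax.grad(f)(1.0))   # 1.0 -- correct away from the kink
print(jax.grad(f)(-1.0))  # 1.0 -- correct away from the kink
print(jax.grad(f)(0.0))   # 0.0 -- not the derivative, not a Clarke subgradient,
                          #        but an element of a conservative field for f
```

The convergence result concerns the recursion x_{k+1} = x_k - gamma_k v_k, where v_k is whatever the automatic differentiation oracle returns on a mini-batch and the stepsizes gamma_k vanish without being summable. A hedged sketch on a toy nonsmooth problem (the least-absolute-deviation loss, the data, and the stepsize constants are illustrative assumptions, not taken from the paper):

```python
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (256, 3))          # toy data (illustrative)
y = X @ jnp.array([1.0, -2.0, 0.5])

def loss(w, xb, yb):
    # Nonsmooth mini-batch objective: mean absolute deviation.
    return jnp.mean(jnp.abs(xb @ w - yb))

oracle = jax.jit(jax.grad(loss))              # backpropagation "subgradient" oracle
w = jnp.zeros(3)
for k in range(2000):
    key, sub = jax.random.split(key)
    idx = jax.random.choice(sub, 256, (32,))  # mini-batch of size 32
    gamma = 0.1 / jnp.sqrt(k + 1.0)           # vanishing, non-summable; 1/sqrt(k) is o(1/log k)
    w = w - gamma * oracle(w, X[idx], y[idx])
```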
