High Probability Bounds for Stochastic Subgradient Schemes with Heavy Tailed Noise

08/17/2022
by Daniela A. Parletta, et al.

In this work we study high probability bounds for stochastic subgradient methods under heavy-tailed noise. In this setting the noise is only assumed to have finite variance, as opposed to the sub-Gaussian assumption under which standard stochastic subgradient methods are known to enjoy high probability bounds. We analyze a clipped version of the projected stochastic subgradient method, in which subgradient estimates are truncated whenever their norms are large. We show that this clipping strategy leads to near optimal any-time and finite horizon bounds for many classical averaging schemes. Preliminary experiments support the validity of the method.
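The clipping idea described above can be sketched as follows. This is a minimal illustration, not the authors' exact scheme: the function names, the step-size rule, the clipping threshold, and the uniform averaging of iterates are all illustrative assumptions.

```python
import numpy as np

def clipped_projected_subgradient(subgrad, project, x0, steps, lr, clip_threshold):
    """Projected stochastic subgradient method with norm clipping (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    iterates = [x.copy()]
    for _ in range(steps):
        g = subgrad(x)                      # noisy subgradient estimate
        norm = np.linalg.norm(g)
        if norm > clip_threshold:           # truncate estimates with large norms
            g = g * (clip_threshold / norm)
        x = project(x - lr * g)             # projected subgradient step
        iterates.append(x.copy())
    # uniform averaging of iterates; the paper analyzes several averaging schemes
    return np.mean(iterates, axis=0)

# Toy example: minimize |x| over [-1, 1] with heavy-tailed (finite-variance) noise.
rng = np.random.default_rng(0)
subgrad = lambda x: np.sign(x) + rng.standard_t(df=3, size=x.shape)  # Student-t noise, df=3
project = lambda x: np.clip(x, -1.0, 1.0)
x_bar = clipped_projected_subgradient(subgrad, project, x0=np.array([0.9]),
                                      steps=2000, lr=0.05, clip_threshold=5.0)
```

The Student-t noise with 3 degrees of freedom has finite variance but heavy tails, matching the setting of the abstract; the averaged iterate `x_bar` should land near the minimizer at 0.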
