Stochastic Smoothing for Nonsmooth Minimizations: Accelerating SGD by Exploiting Structure

05/21/2012
by Hua Ouyang, et al.

In this work we consider the stochastic minimization of nonsmooth convex loss functions, a central problem in machine learning. We propose a novel algorithm called Accelerated Nonsmooth Stochastic Gradient Descent (ANSGD), which exploits the structure of common nonsmooth loss functions to achieve optimal convergence rates for a class of problems including SVMs. It is the first stochastic algorithm to achieve the optimal O(1/t) rate for minimizing nonsmooth loss functions with strong convexity. The fast rates are confirmed by empirical comparisons in which ANSGD significantly outperforms previous subgradient descent algorithms, including SGD.
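To make the smoothing idea concrete, here is a minimal illustrative sketch, not the paper's actual ANSGD update: the SVM hinge loss is replaced by a Huber-style smooth surrogate with an assumed smoothing width mu, and plain SGD with the classic 1/(lambda*t) step size is run on the resulting strongly convex objective. The function names (smoothed_hinge_grad, smoothed_sgd) and parameters are hypothetical choices for this demo.

```python
# Illustrative sketch only: Huber-smoothed hinge loss + SGD on the
# L2-regularized objective. This is NOT the ANSGD algorithm from the paper.
import numpy as np

def smoothed_hinge_grad(w, x, y, mu=0.1):
    """Gradient of a Huber-smoothed hinge loss at one sample (x, y)."""
    margin = y * np.dot(w, x)
    if margin >= 1.0:
        return np.zeros_like(w)              # flat region: zero gradient
    if margin <= 1.0 - mu:
        return -y * x                        # linear region: hinge subgradient
    return -((1.0 - margin) / mu) * y * x    # quadratic region: smooth interpolation

def smoothed_sgd(X, Y, lam=1e-2, mu=0.1, epochs=5, seed=0):
    """SGD on the smoothed hinge loss + (lam/2)||w||^2 (strongly convex)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)            # O(1/t) step size for strong convexity
            g = smoothed_hinge_grad(w, X[i], Y[i], mu) + lam * w
            w -= eta * g
    return w

if __name__ == "__main__":
    # Tiny usage example on synthetic linearly separable data.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 5))
    w_true = rng.normal(size=5)
    Y = np.sign(X @ w_true)
    w_hat = smoothed_sgd(X, Y)
    print("training accuracy:", np.mean(np.sign(X @ w_hat) == Y))
```

The design choice illustrated here is the one the abstract alludes to: once the nonsmooth loss is replaced by a smooth surrogate whose gradient is cheap to evaluate, standard (and accelerated) stochastic gradient schemes become applicable, which is what allows faster rates than generic subgradient descent.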
