Projection Efficient Subgradient Method and Optimal Nonsmooth Frank-Wolfe Method
We consider the classical setting of optimizing a nonsmooth, Lipschitz-continuous convex function over a convex constraint set, given access to a (stochastic) first-order oracle (FO) for the function and a projection oracle (PO) for the constraint set. It is well known that, to achieve ϵ-suboptimality in high dimensions, Θ(ϵ^-2) FO calls are necessary. This is achieved by the projected subgradient method (PGD). However, PGD also entails O(ϵ^-2) PO calls, which may be computationally costlier than FO calls (e.g., for nuclear-norm constraints). Improving the PO call complexity of PGD has remained largely unexplored, despite the fundamental nature of the problem and an extensive literature. We present the first such improvement. It requires only a mild assumption: that the objective function, when extended to a slightly larger neighborhood of the constraint set, remains Lipschitz and accessible via the FO. In particular, we introduce MOPES, a method that carefully combines Moreau-Yosida smoothing with accelerated first-order schemes and is guaranteed to find a feasible ϵ-suboptimal solution using only O(ϵ^-1) PO calls and the optimal O(ϵ^-2) FO calls. Further, if instead of a PO we only have a linear minimization oracle (LMO, à la Frank-Wolfe) for the constraint set, an extension of our method, MOLES, finds a feasible ϵ-suboptimal solution using O(ϵ^-2) LMO calls and O(ϵ^-2) FO calls; both complexities match known lower bounds, resolving a question left open since White (1993). Our experiments confirm that these methods achieve significant speedups over the state of the art on a problem with costly PO and LMO calls.
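As background for why smoothing can decouple the PO and FO complexities, the following display recalls standard facts about the Moreau-Yosida envelope (textbook material, not the paper's precise construction): for a convex, G-Lipschitz f and smoothing parameter λ > 0,

\[
f_\lambda(x) \;=\; \min_{u}\Big\{ f(u) + \tfrac{1}{2\lambda}\,\|u - x\|^{2} \Big\},
\qquad
\nabla f_\lambda(x) \;=\; \tfrac{1}{\lambda}\big(x - \mathrm{prox}_{\lambda f}(x)\big),
\]

where ∇f_λ is (1/λ)-Lipschitz and f_λ(x) ≤ f(x) ≤ f_λ(x) + λG²/2. Choosing λ = Θ(ϵ/G²) caps the smoothing error at O(ϵ) while making f_λ a (G²/ϵ)-smooth surrogate, so an accelerated gradient scheme reaches ϵ-accuracy in O(√((G²/ϵ)/ϵ)) = O(ϵ^-1) projected steps (suppressing problem-dependent constants). Heuristically, this accounting is consistent with the O(ϵ^-1) PO complexity above, with the remaining FO calls spent approximately solving the prox subproblems via subgradient steps.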