Gradient Descent for Low-Rank Functions

06/16/2022
by Romain Cosson, et al.

Several recent empirical studies demonstrate that important machine learning tasks, e.g., training deep neural networks, exhibit low-rank structure, where the loss function varies significantly in only a few directions of the input space. In this paper, we leverage such low-rank structure to reduce the high computational cost of canonical gradient-based methods such as gradient descent (GD). Our proposed Low-Rank Gradient Descent (LRGD) algorithm finds an ϵ-approximate stationary point of a p-dimensional function by first identifying r ≤ p significant directions, and then estimating the true p-dimensional gradient at every iteration by computing directional derivatives only along those r directions. We establish that the "directional oracle complexities" of LRGD for strongly convex and non-convex objective functions are 𝒪(r log(1/ϵ) + rp) and 𝒪(r/ϵ^2 + rp), respectively. When r ≪ p, these complexities are smaller than the known complexities of 𝒪(p log(1/ϵ)) and 𝒪(p/ϵ^2) of GD in the strongly convex and non-convex settings, respectively. Thus, LRGD significantly reduces the computational cost of gradient-based methods for sufficiently low-rank functions. In the course of our analysis, we also formally define and characterize the classes of exact and approximately low-rank functions.
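The two-phase idea in the abstract (identify r significant directions, then descend using only r directional derivatives per iteration) can be illustrated with a minimal sketch. This is not the paper's exact pseudocode: it assumes the active directions are estimated by an SVD of a batch of sampled gradients and that directional derivatives are approximated by finite differences; the function names, step size, and probe count are illustrative choices.

```python
import numpy as np

def lrgd_sketch(f, grad_f, x0, r, lr=0.1, n_probe=None, n_iters=100, h=1e-6):
    """Minimal sketch of a low-rank gradient descent loop.

    Phase 1 (assumption): estimate an r-dimensional active subspace from
    an SVD of sampled gradients. Phase 2: at each iteration, estimate the
    gradient from r directional derivatives (finite differences) along
    that subspace and take a descent step.
    """
    p = x0.shape[0]
    n_probe = n_probe if n_probe is not None else 2 * r

    # Phase 1: sample gradients near x0 and extract the top-r directions.
    probes = x0 + np.random.randn(n_probe, p)
    G = np.stack([grad_f(z) for z in probes])     # (n_probe, p) gradient matrix
    _, _, Vt = np.linalg.svd(G, full_matrices=False)
    U = Vt[:r].T                                  # (p, r) basis of the active subspace

    # Phase 2: descend using only r directional derivatives per iteration.
    x = x0.copy()
    for _ in range(n_iters):
        fx = f(x)
        # Directional derivative along each basis vector u_j ≈ ∇f(x)·u_j.
        d = np.array([(f(x + h * U[:, j]) - fx) / h for j in range(r)])
        g_hat = U @ d                             # projection of the gradient onto span(U)
        x = x - lr * g_hat
    return x
```

Each iteration of this sketch costs r directional (function) evaluations rather than a full p-dimensional gradient, which is the source of the 𝒪(r · …) terms in the stated complexities; the one-time subspace identification contributes the additive rp term.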
