Acceleration of Descent-based Optimization Algorithms via Carathéodory's Theorem

06/02/2020
by Francesco Cosentino, et al.

We propose a new technique to accelerate Gradient Descent-based algorithms using Carathéodory's Theorem. For the standard Gradient Descent algorithm, we analyse the theoretical convergence of the approach under convexity assumptions and empirically demonstrate its improvements. As a core contribution, we then present an application of the acceleration technique to Block Coordinate Descent methods. Experimental comparisons on least squares regression with a LASSO regularisation term show remarkably improved performance over the ADAM and SAG algorithms.
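The acceleration rests on a measure-reduction step: by Carathéodory's Theorem, the full gradient of a sum of N per-sample losses, a convex combination of N per-sample gradients in R^d, can be rewritten as a convex combination of at most d + 1 of them, so subsequent descent steps can be taken on a much smaller weighted subset. The sketch below is a minimal, unoptimised Python illustration of such a recombination step, not the authors' implementation; the function name `caratheodory_reduce` and the usage example are our own assumptions for illustration.

```python
import numpy as np

def caratheodory_reduce(points, weights, tol=1e-12):
    """Reduce a convex combination of n points in R^d to an equivalent
    convex combination supported on at most d + 1 of the points.

    Returns (indices, new_weights) with
    new_weights @ points[indices] == weights @ points (up to round-off).
    Illustrative only: one SVD per removed point, so O(n) SVDs overall.
    """
    idx = np.arange(len(points))
    w = np.asarray(weights, dtype=float).copy()
    d = points.shape[1]
    while len(idx) > d + 1:
        P = points[idx]                              # (m, d) active points
        # More than d + 1 points are affinely dependent: find alpha != 0
        # with sum(alpha) = 0 and P.T @ alpha = 0.
        A = np.vstack([P.T, np.ones(len(idx))])      # (d + 1, m)
        _, _, Vt = np.linalg.svd(A)                  # last row spans null space
        alpha = Vt[-1]
        pos = alpha > tol
        if not pos.any():                            # ensure some positive entry
            alpha, pos = -alpha, -alpha > tol
        # Shift weight along alpha until one weight hits zero; the move
        # preserves both the total weight and the weighted sum of points.
        t = np.min(w[pos] / alpha[pos])
        w = w - t * alpha
        keep = w > tol
        idx, w = idx[keep], w[keep]
    return idx, w

# Hypothetical usage: per-sample gradients of a least squares loss in R^5.
rng = np.random.default_rng(0)
G = rng.standard_normal((1000, 5))
w0 = np.full(len(G), 1.0 / len(G))                   # uniform weights
idx, w = caratheodory_reduce(G, w0)
assert len(idx) <= G.shape[1] + 1                    # at most d + 1 samples
assert np.allclose(w @ G[idx], w0 @ G)               # same aggregate gradient
```

Once the reduced weighted subset is found, cheap descent iterations can be run on it and the recombination refreshed periodically; the paper develops more efficient variants of this step than the naive loop above.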
