A First Order Free Lunch for SQRT-Lasso

05/25/2016
by Xingguo Li, et al.

Many statistical machine learning techniques sacrifice convenient computational structures to gain estimation robustness and modeling flexibility. In this paper, we study this fundamental tradeoff through the SQRT-Lasso problem for sparse linear regression and sparse precision matrix estimation in high dimensions: the square-root loss removes the need to tune the regularization parameter to the unknown noise level, but it makes the objective nonsmooth and therefore harder to optimize. We show how novel optimization techniques address this computational challenge. Specifically, we propose a pathwise iterative smoothing shrinkage thresholding algorithm for solving the SQRT-Lasso optimization problem, and provide a novel model-based perspective for analyzing the smoothing optimization framework, which allows us to establish a near-linear (R-linear) convergence guarantee for our algorithm without sacrificing statistical accuracy. This implies that solving the SQRT-Lasso optimization problem is almost as easy as solving the Lasso optimization problem, while the former requires much less parameter-tuning effort. Moreover, our algorithm also applies to sparse precision matrix estimation, where it enjoys desirable computational and statistical properties. Numerical experiments support our theory.
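To make the pathwise smoothing idea concrete, below is a minimal NumPy sketch under illustrative assumptions: it smooths the nonsmooth loss ||y - X theta||_2 / sqrt(n) as sqrt(||y - X theta||^2 + mu^2) / sqrt(n) with a fixed smoothing parameter mu, runs ISTA with backtracking line search at each stage of a decreasing lambda path, and warm-starts each stage at the previous solution. The smoothing scheme, step-size rule, lambda grid, and iteration counts here are placeholder choices for illustration, not the authors' exact algorithm or tuning.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def smoothed_loss(X, y, theta, mu):
    """Smoothed SQRT-Lasso loss: sqrt(||y - X theta||^2 + mu^2) / sqrt(n)."""
    r = y - X @ theta
    return np.sqrt(r @ r + mu ** 2) / np.sqrt(X.shape[0])

def smoothed_grad(X, y, theta, mu):
    """Gradient of the smoothed loss with respect to theta."""
    r = y - X @ theta
    return -(X.T @ r) / (np.sqrt(X.shape[0]) * np.sqrt(r @ r + mu ** 2))

def ista_stage(X, y, theta, lam, mu, n_iters=200, L=1.0, eta=2.0):
    """ISTA with backtracking on the smoothed objective for one lambda."""
    for _ in range(n_iters):
        f = smoothed_loss(X, y, theta, mu)
        g = smoothed_grad(X, y, theta, mu)
        while True:  # backtracking: grow L until the quadratic bound holds
            z = soft_threshold(theta - g / L, lam / L)
            d = z - theta
            if smoothed_loss(X, y, z, mu) <= f + g @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        theta = z
    return theta

def pathwise_smoothed_sqrt_lasso(X, y, lams, mu=1e-3):
    """Solve smoothed SQRT-Lasso along a decreasing lambda path,
    warm-starting each stage at the previous stage's solution.
    NOTE: mu, n_iters, and the lambda grid are illustrative choices."""
    theta = np.zeros(X.shape[1])
    path = []
    for lam in sorted(lams, reverse=True):
        theta = ista_stage(X, y, theta, lam, mu)
        path.append(theta.copy())
    return path

# Example usage on synthetic sparse linear regression
rng = np.random.default_rng(0)
n, d, s = 100, 200, 5
X = rng.standard_normal((n, d))
theta_true = np.zeros(d)
theta_true[:s] = 1.0
y = X @ theta_true + 0.5 * rng.standard_normal(n)
# For SQRT-Lasso, lam ~ sqrt(log(d)/n) requires no knowledge of the noise level
lams = np.geomspace(1.0, np.sqrt(np.log(d) / n), num=10)
theta_hat = pathwise_smoothed_sqrt_lasso(X, y, lams)[-1]
```

The warm-started path mirrors the pathwise strategy described in the abstract: early stages with large lambda keep iterates sparse and cheap, and each solution initializes the next stage close to its optimum, which is what makes fast (R-linear-type) convergence plausible in practice.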

