A Theoretical Analysis of Sparse Recovery Stability of Dantzig Selector and LASSO
Dantzig selector (DS) and LASSO problems have attracted considerable attention in statistical learning, sparse data recovery, and mathematical optimization. In this paper, we provide a theoretical analysis of the sparse recovery stability of these optimization problems in more general settings and from a new perspective. We establish recovery error bounds for these optimization problems under a mild assumption called the weak range space property of a transposed design matrix. This assumption is less restrictive than well-known sparse recovery conditions such as the restricted isometry property (RIP), the null space property (NSP), or mutual coherence. In fact, our analysis indicates that this assumption is tight and cannot be relaxed for standard DS problems if their sparse recovery stability is to be maintained. As a result, a series of new stability results for DS and LASSO are established under various matrix properties, including the RIP with constant δ_2k < 1/√2 and the (constant-free) standard NSP of order k. We prove that these matrix properties can yield an identical recovery error bound for DS and LASSO, with stability coefficients measured by the so-called Robinson's constant instead of the conventional RIP or NSP constants. To our knowledge, this is the first time that stability results with such a unified feature have been established for DS and LASSO problems. Unlike the standard analyses in this area of research, our analysis is carried out deterministically, and the key analytic tools we use include the error bound for linear systems due to Hoffman and Robinson and the polytope approximation of symmetric convex bodies due to Barvinok.
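For context, the two problems discussed above are commonly stated as follows; this is a standard sketch with a design matrix A, observation vector y, and noise tolerance ε (the paper's exact formulations and parameter names may differ):

```latex
% Dantzig selector: minimize the l1-norm subject to a bound on the
% correlated residual (the l-infinity norm of A^T applied to the residual).
\min_{x \in \mathbb{R}^n} \ \|x\|_1 \quad \text{s.t.} \quad \|A^{T}(y - Ax)\|_{\infty} \le \varepsilon

% LASSO (constrained form): minimize the l1-norm subject to a bound on
% the l2-norm of the residual.
\min_{x \in \mathbb{R}^n} \ \|x\|_1 \quad \text{s.t.} \quad \|Ax - y\|_{2} \le \varepsilon
```

Sparse recovery stability concerns bounding the distance between a solution of these problems and the true sparse vector in terms of the noise level ε.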