Compressed Sensing with Adversarial Sparse Noise via L1 Regression
We present a simple and effective algorithm for the problem of sparse robust linear regression. In this problem, one would like to estimate a sparse vector w^* ∈ R^n from linear measurements corrupted by sparse noise that can arbitrarily change an adversarially chosen η fraction of the measured responses y, as well as by bounded-norm noise on the responses. For Gaussian measurements, we show that a simple algorithm based on L1 regression can successfully estimate w^* for any η < η_0 ≈ 0.239, and that this threshold is tight for the algorithm. The number of measurements required by the algorithm is O(k log(n/k)) for k-sparse estimation, which is within constant factors of the number needed without any sparse noise. We establish three properties: the ability to estimate sparse, as well as dense, w^*; tolerance of a large constant fraction of outliers; and tolerance of adversarial rather than distributional (e.g., Gaussian) dense noise. To the best of our knowledge, no previous polynomial-time algorithm was known to achieve more than two of these.
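To make the setting concrete, the following is a minimal illustrative sketch, not the authors' reference implementation: it generates Gaussian measurements of a k-sparse vector, corrupts an η fraction of the responses arbitrarily and adds small dense noise, then recovers an estimate by solving an L1-regression objective with an L1 penalty promoting sparsity. The modeling library (cvxpy), the penalty weight lam, and all problem sizes here are illustrative choices, not parameters taken from the paper.

```python
# Sketch of sparse robust linear regression via L1 regression (assumptions noted above).
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, k = 200, 120, 5          # ambient dimension, number of measurements, sparsity
eta = 0.2                      # fraction of adversarially corrupted responses (< 0.239)

# k-sparse ground truth and Gaussian measurement matrix
w_true = np.zeros(n)
w_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
X = rng.standard_normal((m, n))

# Responses: small dense noise everywhere, plus arbitrary corruption of an eta fraction
y = X @ w_true + 0.01 * rng.standard_normal(m)
bad = rng.choice(m, int(eta * m), replace=False)
y[bad] += 10.0 * rng.standard_normal(len(bad))

# L1 regression with an L1 sparsity penalty; lam is an ad hoc illustrative choice
w = cp.Variable(n)
lam = 0.1
cp.Problem(cp.Minimize(cp.norm1(X @ w - y) + lam * cp.norm1(w))).solve()

print("recovery error:", np.linalg.norm(w.value - w_true))
```

The L1 loss on the residual is what provides robustness to the sparsely corrupted responses, while the L1 penalty on w encourages a sparse estimate; the exact constraint or penalty form used in the paper may differ from this sketch.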