Bilinear Compressed Sensing under known Signs via Convex Programming

06/25/2019
by Alireza Aghasi, et al.

We consider the bilinear inverse problem of recovering two vectors, x∈R^L and w∈R^L, from their entrywise product. We consider the case where x and w have known signs and are sparse with respect to known dictionaries of size K and N, respectively; here, K and N may be larger than, smaller than, or equal to L. We introduce ℓ_1-BranchHull, a convex program posed in the natural parameter space that requires neither an approximate solution nor an initialization in order to be stated or solved. Under the assumptions that x and w satisfy a comparable-effective-sparsity condition and are S_1- and S_2-sparse with respect to a random dictionary, we present a recovery guarantee in the noisy case. We show that ℓ_1-BranchHull is robust to small dense noise with high probability provided the number of measurements satisfies L≥Ω((S_1+S_2)^2(K+N)). Numerical experiments indicate that the scaling constant in the theorem is not too large. We also introduce variants of ℓ_1-BranchHull for tolerating noise and outliers and for recovering piecewise constant signals. We provide an ADMM implementation of these variants and show that they can extract piecewise constant behavior from real images.
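For concreteness, below is a minimal sketch, in CVXPY, of the measurement model described in the abstract together with a BranchHull-style ℓ_1 relaxation. The dictionary names B and C, the coefficient vectors h and z, the problem sizes, and the rotated second-order-cone form of the hyperbolic constraint are illustrative assumptions made here, not notation taken from the paper; the exact ℓ_1-BranchHull formulation and its ADMM solver are given in the full text.

import numpy as np
import cvxpy as cp

# Illustrative sizes and synthetic data (assumed, not from the paper).
L, K, N = 100, 20, 25
rng = np.random.default_rng(0)
B = rng.standard_normal((L, K))            # dictionary for w
C = rng.standard_normal((L, N))            # dictionary for x
h_true = np.zeros(K)
h_true[rng.choice(K, 3, replace=False)] = rng.standard_normal(3)
z_true = np.zeros(N)
z_true[rng.choice(N, 3, replace=False)] = rng.standard_normal(3)
w_true, x_true = B @ h_true, C @ z_true
y = w_true * x_true                        # entrywise-product measurements
s, t = np.sign(w_true), np.sign(x_true)    # known entrywise signs of w and x

h, z = cp.Variable(K), cp.Variable(N)
u = cp.multiply(s, B @ h)                  # sign-corrected factor for w
v = cp.multiply(t, C @ z)                  # sign-corrected factor for x
constraints = [u >= 0, v >= 0]
# Convex branch of each hyperbola u_l * v_l >= |y_l|, written as the
# rotated second-order cone ||(2*sqrt(|y_l|), u_l - v_l)||_2 <= u_l + v_l.
for ell in range(L):
    constraints.append(
        cp.SOC(u[ell] + v[ell],
               cp.hstack([2 * np.sqrt(abs(y[ell])), u[ell] - v[ell]])))
prob = cp.Problem(cp.Minimize(cp.norm1(h) + cp.norm1(z)), constraints)
prob.solve()

Because the entrywise product is invariant to the scaling (w, x) ↦ (cw, x/c) for c > 0, recovery can only be expected up to such a scaling; minimizing the ℓ_1 objective selects a balanced representative of that family.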
