Information-Theoretic Lower Bounds for Compressive Sensing with Generative Models

08/28/2019
by   Zhaoqiang Liu, et al.

The goal of standard compressive sensing is to estimate an unknown vector from linear measurements under the assumption of sparsity in some basis. Recently, it has been shown that significantly fewer measurements may be required if the sparsity assumption is replaced by the assumption that the unknown vector lies near the range of a suitably chosen generative model. In particular, in (Bora et al., 2017) it was shown that roughly O(k log L) random Gaussian measurements suffice for accurate recovery when the k-input generative model is bounded and L-Lipschitz, and that O(kd log w) measurements suffice for k-input ReLU networks with depth d and width w. In this paper, we establish corresponding algorithm-independent lower bounds on the sample complexity using tools from minimax statistical analysis. In accordance with the above upper bounds, our results are summarized as follows: (i) We construct an L-Lipschitz generative model capable of generating group-sparse signals, and show that the resulting necessary number of measurements is Ω(k log L); (ii) Using similar ideas, we construct two-layer ReLU networks of high width requiring Ω(k log w) measurements, as well as lower-width deep ReLU networks requiring Ω(kd) measurements. As a result, we establish that the scaling laws derived in (Bora et al., 2017) are optimal or near-optimal in the absence of further assumptions.
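To make the measurement model concrete: the signal x* is assumed to lie in the range of a generative model G, one observes y = Ax* for a random Gaussian matrix A, and recovery (as analyzed in Bora et al., 2017) minimizes ||AG(z) − y||² over the latent code z. The sketch below is illustrative only, not the paper's construction; the one-hidden-layer ReLU generator, the dimensions, and the step size are all assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed, not taken from the paper):
k, h, n, m = 10, 64, 200, 60  # latent dim, hidden width, signal dim, measurements

# Fixed one-hidden-layer ReLU generator G(z) = W2 @ relu(W1 @ z).
W1 = rng.standard_normal((h, k)) / np.sqrt(k)
W2 = rng.standard_normal((n, h)) / np.sqrt(h)
relu = lambda t: np.maximum(t, 0.0)
G = lambda z: W2 @ relu(W1 @ z)

# Signal in the range of G, observed through random Gaussian measurements.
z_true = rng.standard_normal(k)
x_true = G(z_true)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

# Recover by gradient descent on f(z) = ||A G(z) - y||^2,
# using the hand-derived gradient 2 * W1^T diag(mask) W2^T A^T r.
z = rng.standard_normal(k)
lr = 0.005
for _ in range(5000):
    pre = W1 @ z                            # pre-activations
    r = A @ (W2 @ relu(pre)) - y            # measurement residual
    mask = (pre > 0).astype(float)          # ReLU derivative
    z -= lr * 2 * W1.T @ (mask * (W2.T @ (A.T @ r)))

print("relative recovery error:",
      np.linalg.norm(G(z) - x_true) / np.linalg.norm(x_true))
```

The objective is nonconvex, so gradient descent can stall in a poor local minimum; in practice a few random restarts of z help. Note also that the lower bounds established in this paper are algorithm-independent, so they apply to any recovery procedure, not just this one.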
