A Concise Tutorial on Approximate Message Passing

01/19/2022
by Qiuyun Zou, et al.

High-dimensional signal recovery in standard linear regression is a key challenge in many engineering fields, such as communications, compressed sensing, and image processing. The approximate message passing (AMP) algorithm proposed by Donoho et al. is a computationally efficient method for such problems that attains Bayes-optimal performance when the measurement matrix has independent and identically distributed (IID) sub-Gaussian entries. A notable feature of AMP is that its dynamical behavior can be fully predicted by a scalar recursion termed state evolution (SE). Although AMP is optimal in the IID sub-Gaussian regime, it may fail to converge when the measurement matrix goes beyond this class. To broaden the class of admissible measurement matrices, an expectation propagation (EP)-related algorithm, orthogonal AMP (OAMP), was proposed; its iteration coincides with those of EP, expectation consistency (EC), and vector AMP (VAMP). This paper reviews these algorithms. We begin with the worst case, i.e., the least absolute shrinkage and selection operator (LASSO) inference problem, and give a detailed derivation of AMP from message passing. In the Bayes-optimal setting, we then present Bayes-optimal AMP, which differs slightly from the LASSO version. Finally, we review several AMP-related algorithms, OAMP, VAMP, and Memory AMP (MAMP), that apply to more general random matrices.
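For concreteness, here is a minimal NumPy sketch of the AMP iteration for the LASSO problem mentioned above, assuming an IID Gaussian sensing matrix. The soft-thresholding denoiser and the Onsager correction follow the standard AMP recursion; the threshold rule `alpha * tau` and all names (`amp_lasso`, `soft_threshold`) are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def soft_threshold(v, theta):
    """Soft-thresholding denoiser: eta(v; theta) = sign(v) * max(|v| - theta, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp_lasso(y, A, n_iter=30, alpha=1.5):
    """AMP for LASSO with an IID Gaussian sensing matrix A (M x N).

    The per-iteration threshold is set to alpha times the estimated noise
    level of the effective observation (a common heuristic choice).
    """
    M, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(n_iter):
        # Effective (decoupled) observation: x plus Gaussian noise, per SE.
        r = x + A.T @ z
        # Estimate the effective noise standard deviation from the residual.
        tau = np.linalg.norm(z) / np.sqrt(M)
        x_new = soft_threshold(r, alpha * tau)
        # Onsager correction: (N/M) * average derivative of the denoiser,
        # which for soft thresholding is the fraction of nonzero outputs.
        onsager = (np.count_nonzero(x_new) / M) * z
        z = y - A @ x_new + onsager
        x = x_new
    return x

# Usage on a synthetic sparse recovery problem.
rng = np.random.default_rng(0)
N, M, k = 1000, 500, 50
A = rng.standard_normal((M, N)) / np.sqrt(M)   # IID Gaussian columns
x_true = np.zeros(N)
x_true[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true + 0.01 * rng.standard_normal(M)
x_hat = amp_lasso(y, A)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The Onsager term is what distinguishes AMP from plain iterative soft thresholding: it cancels the correlation between the iterate and the residual, which is what makes the scalar SE prediction of the dynamics accurate.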
