Memory Approximate Message Passing

12/20/2020
by Lei Liu, et al.

Approximate message passing (AMP) is a low-cost iterative parameter-estimation technique for certain high-dimensional linear systems with non-Gaussian distributions. However, AMP applies only to independent identically distributed (IID) transform matrices and may become unreliable (e.g., perform poorly or even diverge) for other matrix ensembles, especially ill-conditioned ones. To handle this difficulty, orthogonal/vector AMP (OAMP/VAMP) was proposed for general bi-unitarily-invariant matrices. However, Bayes-optimal OAMP/VAMP requires a high-complexity linear minimum mean square error (MMSE) estimator, which limits its application to large-scale systems. To overcome the drawbacks of AMP and OAMP/VAMP, this paper proposes a low-complexity memory AMP (MAMP) for unitarily-invariant matrices. MAMP consists of an orthogonal non-linear estimator (NLE) for denoising (the same as in OAMP/VAMP) and an orthogonal long-memory matched filter (MF) for interference suppression. The orthogonality principle is used to guarantee the asymptotic Gaussianity of the estimation errors in MAMP. A state evolution is derived to asymptotically characterize the performance of MAMP. The relaxation parameters and damping vector in MAMP are analytically optimized based on the state evolution to guarantee and improve convergence. MAMP has complexity comparable to that of AMP. Furthermore, for all unitarily-invariant matrices, the optimized MAMP converges to the high-complexity OAMP/VAMP and is thus Bayes-optimal if it has a unique fixed point. Finally, simulations are provided to verify the validity and accuracy of the theoretical results.
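To give a feel for the matched-filter-plus-denoiser structure the abstract describes, the following is a highly simplified, hypothetical sketch in Python/NumPy: a matched-filter (MF) update followed by a soft-thresholding non-linear estimator (NLE) on a sparse recovery problem. It is an ISTA-style illustration only; it omits the paper's long-memory terms, orthogonalization, state evolution, and analytically optimized damping, and all parameter values here (`tau`, matrix sizes, sparsity) are arbitrary choices for the example.

```python
import numpy as np

def soft_threshold(v, tau):
    # Elementwise soft-thresholding denoiser (a simple NLE for sparse signals).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def mf_denoise_iterations(A, y, tau=0.05, num_iters=200):
    # Illustrative matched-filter + denoiser loop (an ISTA-style sketch,
    # NOT the paper's MAMP: no memory, orthogonalization, or damping).
    m, n = A.shape
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # step size for stable iterations
    x_hat = np.zeros(n)
    for _ in range(num_iters):
        r = x_hat + step * A.T @ (y - A @ x_hat)  # matched-filter (MF) update
        x_hat = soft_threshold(r, tau * step)     # non-linear estimator (NLE)
    return x_hat

# Toy instance: IID Gaussian sensing matrix and a sparse signal.
rng = np.random.default_rng(0)
m, n, k = 200, 400, 20
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x + 0.01 * rng.standard_normal(m)

x_hat = mf_denoise_iterations(A, y)
nmse = float(np.linalg.norm(x_hat - x) / np.linalg.norm(x))
print(f"relative error: {nmse:.3f}")
```

The MF step keeps the per-iteration cost at one pair of matrix-vector products, which is the complexity regime the abstract contrasts with the LMMSE estimator in OAMP/VAMP.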
