A Maximum Likelihood-Based Minimum Mean Square Error Separation and Estimation of Stationary Gaussian Sources from Noisy Mixtures

10/22/2018
by Amir Weiss, et al.

In the context of Independent Component Analysis (ICA), noisy mixtures pose a dilemma regarding the desired objective. On one hand, a "maximally separating" solution, providing the minimal attainable Interference-to-Source Ratio (ISR), would often suffer from significant residual noise. On the other hand, optimal Minimum Mean Square Error (MMSE) estimation would yield estimates that are the "closest possible" to the true sources, often at the cost of a compromised ISR. In this work, we consider noisy mixtures of temporally-diverse stationary Gaussian sources in a semi-blind scenario, which conveniently lends itself to either one of these objectives. We begin by deriving the Maximum Likelihood (ML) estimates (MLEs) of the unknown (deterministic) parameters of the model: the mixing matrix and the (possibly different) noise variances in each sensor. We derive the likelihood equations for these parameters, as well as the corresponding Cramér-Rao lower bound, and propose an iterative solution for obtaining the MLEs. Based on these MLEs, the asymptotically-optimal "maximally separating" solution can be readily obtained. However, we also present the ML-based MMSE estimate of the sources, alongside a computationally efficient frequency-domain scheme that exploits their stationarity. We show that this estimate is asymptotically optimal and attains the (oracle) MMSE lower bound. Furthermore, for non-Gaussian signals, we show that this estimate serves as a Quasi-ML (QML)-based Linear MMSE (LMMSE) estimate and asymptotically attains the (oracle) LMMSE lower bound. Empirical results of three simulation experiments are presented, corroborating our analytical derivations.
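For stationary sources, the ML-based MMSE estimate described above can be implemented efficiently with a per-frequency (Wiener-type) filter. The following is a minimal, illustrative sketch of such a frequency-domain LMMSE separator for the noisy instantaneous-mixture model x[t] = A s[t] + v[t]; it assumes the mixing matrix (e.g., its MLE), the source power spectra, and the per-sensor noise variances are available. It is not the paper's exact algorithm, and all function and variable names are hypothetical.

```python
import numpy as np

def lmmse_separate(X, A, source_psd, noise_var):
    """Frequency-domain LMMSE (Wiener) estimate of stationary sources from a
    noisy instantaneous mixture x[t] = A s[t] + v[t].

    X          : (M, T) sensor observations (M sensors, T samples)
    A          : (M, N) mixing matrix (e.g., its ML estimate)
    source_psd : (N, T) power spectral density of each source at the T DFT bins
    noise_var  : (M,) per-sensor noise variances
    Returns an (N, T) array of estimated source signals.
    """
    M, T = X.shape
    N = A.shape[1]
    Xf = np.fft.fft(X, axis=1)             # per-sensor DFTs
    Sigma_v = np.diag(noise_var)            # diagonal noise covariance
    Sf = np.zeros((N, T), dtype=complex)
    for k in range(T):                      # per-frequency Wiener filter
        Ps = np.diag(source_psd[:, k])      # diagonal source spectra at bin k
        Rx = A @ Ps @ A.conj().T + Sigma_v  # observation covariance at bin k
        Wk = Ps @ A.conj().T @ np.linalg.inv(Rx)
        Sf[:, k] = Wk @ Xf[:, k]
    return np.real(np.fft.ifft(Sf, axis=1))
```

In a per-frequency Gaussian model this filter coincides with the (L)MMSE estimate of the source DFT coefficients; with estimated rather than true mixing and noise parameters it is only asymptotically optimal, as stated in the abstract.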
