Optimally Weighted PCA for High-Dimensional Heteroscedastic Data

10/30/2018
by David Hong, et al.

Modern applications increasingly involve high-dimensional and heterogeneous data, e.g., datasets formed by combining numerous measurements from myriad sources. Principal Component Analysis (PCA) is a classical method for reducing dimensionality by projecting such data onto a low-dimensional subspace capturing most of their variation, but PCA does not robustly recover underlying subspaces in the presence of heteroscedastic noise. Specifically, PCA suffers from treating all data samples as if they are equally informative. This paper analyzes a weighted variant of PCA that accounts for heteroscedasticity by giving samples with larger noise variance less influence. The analysis provides expressions for the asymptotic recovery of underlying low-dimensional components from samples with heteroscedastic noise in the high-dimensional regime, i.e., for sample dimension on the order of the number of samples. Surprisingly, it turns out that whitening the noise by using inverse noise variance weights is sub-optimal. We derive optimal weights, characterize the performance of weighted PCA, and consider the problem of optimally collecting samples under budget constraints.
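To make the idea concrete, here is a minimal sketch of weighted PCA on synthetic heteroscedastic data, assuming per-sample scalar weights applied before an SVD. The function name, synthetic setup, and recovery metric are illustrative assumptions, not the paper's exact estimator; the inverse-variance (whitening) weights shown are the baseline the paper proves sub-optimal.

```python
import numpy as np

def weighted_pca(Y, w, k):
    """Rank-k weighted PCA: principal components of the samples (columns of Y),
    each sample scaled by its weight, obtained from an SVD of the weighted data."""
    Yw = Y * w[np.newaxis, :]                  # scale each column by its weight
    U, s, _ = np.linalg.svd(Yw, full_matrices=False)
    return U[:, :k]                            # estimated k-dimensional subspace basis

# Synthetic example: two groups of samples with different noise variances.
rng = np.random.default_rng(0)
d, k = 200, 3                                  # ambient dimension, subspace dimension
U_true = np.linalg.qr(rng.standard_normal((d, k)))[0]

n1, n2 = 300, 300                              # sample counts per noise level
sig1, sig2 = 0.5, 3.0                          # two noise standard deviations
scores = rng.standard_normal((k, n1 + n2))
noise = np.concatenate([sig1 * rng.standard_normal((d, n1)),
                        sig2 * rng.standard_normal((d, n2))], axis=1)
Y = U_true @ scores + noise

# Compare unweighted PCA with inverse noise variance (whitening) weights.
var = np.concatenate([np.full(n1, sig1**2), np.full(n2, sig2**2)])
for name, w in [("unweighted", np.ones(n1 + n2)), ("1/variance", 1.0 / var)]:
    U_hat = weighted_pca(Y, w, k)
    # Subspace recovery: average squared cosine of the principal angles (1 = perfect).
    rec = np.linalg.norm(U_true.T @ U_hat, "fro") ** 2 / k
    print(f"{name:>11s} PCA recovery: {rec:.3f}")
```

Down-weighting the noisier samples typically recovers the subspace more accurately than treating all samples equally; the paper's contribution is characterizing this recovery asymptotically and deriving weights that improve on the inverse-variance choice.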
