Diagonally-Dominant Principal Component Analysis

05/31/2019
by Zheng Tracy Ke, et al.

We consider the problem of decomposing a large covariance matrix into the sum of a low-rank matrix and a diagonally dominant matrix, a problem we call Diagonally-Dominant Principal Component Analysis (DD-PCA). DD-PCA is an effective tool for designing statistical methods for strongly correlated data. We showcase the use of DD-PCA in two statistical problems: covariance matrix estimation, and global detection in multiple testing. Using the output of DD-PCA, we propose a new estimator for estimating a large covariance matrix with factor structure. Thanks to a useful property of diagonally dominant matrices, this estimator simultaneously yields good estimates of the covariance matrix and, by plain inversion, the precision matrix. Plugging this estimator into linear discriminant analysis and portfolio optimization yields appealing performance on real data. We also propose two new tests for testing the global null hypothesis in multiple testing when the z-scores have a factor covariance structure. Both tests first use DD-PCA to adjust the individual p-values and then feed the adjusted p-values into the Higher Criticism (HC) test. These new tests significantly improve over the HC test and compare favorably with other existing tests. To compute DD-PCA, we propose an iterative projection algorithm and an ADMM algorithm.
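To make the decomposition concrete, below is a minimal Python sketch of an alternating-projection heuristic in the spirit of the iterative projection idea described in the abstract. It is not the paper's algorithm: the function names (low_rank_psd, make_diag_dominant, dd_pca), the choice of rank r, the iteration count, and especially the diagonal-inflation surrogate for the projection onto diagonally dominant matrices are illustrative assumptions.

```python
# Sketch of an alternating-projection heuristic for DD-PCA: split a
# covariance matrix S into a rank-r PSD part L plus a symmetric
# diagonally dominant part A.  The diagonally dominant step here simply
# inflates the diagonal, which guarantees dominance but is not the
# Euclidean projection a principled algorithm would compute.

import numpy as np

def low_rank_psd(M, r):
    """Best rank-r PSD approximation of a symmetric matrix (truncated eigendecomposition)."""
    w, V = np.linalg.eigh((M + M.T) / 2)
    idx = np.argsort(w)[::-1][:r]          # indices of the r largest eigenvalues
    w_top = np.clip(w[idx], 0.0, None)     # keep only the nonnegative part
    return (V[:, idx] * w_top) @ V[:, idx].T

def make_diag_dominant(M):
    """Raise the diagonal of a symmetric matrix until every row is diagonally dominant."""
    A = (M + M.T) / 2
    off = np.abs(A).sum(axis=1) - np.abs(np.diag(A))   # off-diagonal row sums
    np.fill_diagonal(A, np.maximum(np.diag(A), off))
    return A

def dd_pca(S, r, n_iter=50):
    """Alternate between a low-rank fit and a diagonally dominant residual."""
    A = np.diag(np.diag(S))                # start from the diagonal of S
    for _ in range(n_iter):
        L = low_rank_psd(S - A, r)         # low-rank step
        A = make_diag_dominant(S - L)      # diagonally dominant step (heuristic)
    return L, A

# Usage: decompose a covariance with a rank-2 factor part.
rng = np.random.default_rng(0)
B = rng.normal(size=(20, 2))
S = B @ B.T + np.diag(1.0 + rng.random(20))
L, A = dd_pca(S, r=2)
print(np.linalg.norm(S - L - A))           # size of the remaining residual
```

A plug-in covariance estimate along the lines described above would then be L + A, whose inverse can serve as a precision-matrix estimate; the residual printed at the end measures how much the heuristic diagonal inflation departs from an exact split.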
