Asymmetry Helps: Eigenvalue and Eigenvector Analyses of Asymmetrically Perturbed Low-Rank Matrices

11/30/2018
by   Yuxin Chen, et al.

This paper is concerned with a curious phenomenon in spectral estimation. Suppose we are interested in a rank-1, symmetric matrix M⋆ ∈ ℝ^(n×n), yet only a randomly perturbed version M is observed. The perturbation/noise matrix M − M⋆ is composed of independent, zero-mean entries and is not symmetric. This might arise, for example, when we have two independent samples for each entry of M⋆ and arrange them into an asymmetric data matrix M. The aim is to estimate the leading eigenvalue and eigenvector of M⋆. Somewhat unexpectedly, our findings reveal that the leading eigenvalue of the data matrix M can be √n times more accurate than its leading singular value for eigenvalue estimation. Moreover, the perturbation of any linear form of the leading eigenvector of M (e.g. entrywise eigenvector perturbation) is provably well controlled. We further provide partial theory for the more general rank-r case; this allows us to accommodate the case where M⋆ is rank-1 but asymmetric, by considering the eigen-decomposition of the associated rank-2 dilation matrix. The takeaway message is this: arranging the data samples in an asymmetric manner and performing eigen-decomposition (as opposed to SVD) can sometimes be quite beneficial.
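The phenomenon the abstract describes can be observed in a quick numerical sketch (illustrative only; the matrix size, noise level, and random seed below are arbitrary choices, not taken from the paper). We build a rank-1 symmetric ground truth M⋆ = λ·uuᵀ, add independent asymmetric noise to every entry, and compare how well the leading eigenvalue of M versus its leading singular value recovers λ:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Ground truth: rank-1 symmetric matrix M_star = lam * u u^T
u = rng.standard_normal(n)
u /= np.linalg.norm(u)
lam = 1.0
M_star = lam * np.outer(u, u)

# Asymmetric perturbation: independent zero-mean noise in every entry
# (sigma chosen so the noise spectral norm is well below lam)
sigma = 0.5 / np.sqrt(n)
H = sigma * rng.standard_normal((n, n))
M = M_star + H  # observed data matrix, not symmetric

# Estimate 1: leading eigenvalue of the asymmetric matrix M
eigvals = np.linalg.eigvals(M)
lam_eig = eigvals[np.argmax(np.abs(eigvals))].real

# Estimate 2: leading singular value of M
lam_svd = np.linalg.svd(M, compute_uv=False)[0]

print(f"eigenvalue error:     {abs(lam_eig - lam):.3e}")
print(f"singular-value error: {abs(lam_svd - lam):.3e}")
```

The singular value is biased upward by the noise (its error scales like the noise spectral norm, of order σ√n), while the leading eigenvalue of the asymmetric matrix enjoys a much smaller error, consistent with the √n gap the paper establishes.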
