Optimal Subspace Expansion for Matrix Eigenvalue Problems

04/10/2020
by Zhongxiao Jia, et al.

In this paper, we consider the optimal subspace expansion problem for the matrix eigenvalue problem Ax = λx: which vector w in the current subspace V, after being multiplied by A, provides an optimal subspace expansion for approximating a desired eigenvector x, in the sense that x has the smallest angle with the expanded subspace V_w = V + span{Aw}? Our motivation is that many iterative methods construct nested subspaces that successively expand V to V_w. Ye (Linear Algebra Appl., 428 (2008), pp. 911–918) studies the maximization characterization of the cosine of the angle between x and V_w but does not obtain the maximizer. He shows how to approximately maximize the cosine so as to find approximate solutions of the subspace expansion problem for Hermitian A. However, his approach and analysis do not extend to the non-Hermitian case. We study the optimal expansion problem in the general case and derive explicit expressions for the optimal expansion vector w_opt. By a careful analysis of the theoretical results, we obtain computable, nearly optimal choices of w_opt for the standard, harmonic, and refined (harmonic) Rayleigh–Ritz methods.
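
To make the objective concrete, the following is a minimal NumPy sketch (not taken from the paper) of the quantity being optimized: for a candidate w = Vc in the current subspace, it orthonormalizes [V, Aw] and measures the angle between x and the expanded subspace V_w. The toy matrix construction, the brute-force search over random coefficient vectors c, and the names angle_to_subspace and expanded_angle are all illustrative assumptions; the paper instead derives explicit expressions for the optimal expansion vector w_opt rather than searching for it.

```python
import numpy as np

def angle_to_subspace(x, Q):
    """Angle (radians) between a vector x and the subspace spanned by the
    orthonormal columns of Q, computed from the orthogonal projection of x."""
    x = x / np.linalg.norm(x)
    cos_angle = np.linalg.norm(Q.conj().T @ x)   # ||Q^H x|| = norm of the projection
    return np.arccos(np.clip(cos_angle, 0.0, 1.0))

def expanded_angle(A, V, c, x):
    """Angle between x and V_w = V + span{Aw} for w = Vc, where the
    columns of V form an orthonormal basis of the current subspace."""
    w = V @ c
    Q, _ = np.linalg.qr(np.column_stack([V, A @ w]))  # orthonormal basis of V_w
    return angle_to_subspace(x, Q)

# Hypothetical toy problem: a non-symmetric A with known real eigenpairs,
# built as A = P D P^{-1} so that the columns of P are its eigenvectors.
rng = np.random.default_rng(0)
n, k = 60, 6
P = rng.standard_normal((n, n))
D = np.diag(np.arange(1.0, n + 1.0))
A = P @ D @ np.linalg.inv(P)
x = P[:, -1]                                        # eigenvector of the largest eigenvalue
V, _ = np.linalg.qr(rng.standard_normal((n, k)))    # current subspace basis

# Brute-force search over random coefficient vectors c, purely to illustrate
# the objective; the paper's contribution is the exact maximizer, not a search.
candidates = [rng.standard_normal(k) for _ in range(500)]
best_c = min(candidates, key=lambda c: expanded_angle(A, V, c, x))
print("angle(x, V)   =", angle_to_subspace(x, V))
print("angle(x, V_w) =", expanded_angle(A, V, best_c, x))
```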
