The conjugate gradient method with various viewpoints

10/08/2019
by Xuping Zhang, et al.

Connections of the conjugate gradient (CG) method with other methods in computational mathematics are surveyed, including the connections with the conjugate direction method, the subspace optimization method, and the quasi-Newton method BFGS in numerical optimization, and with the Lanczos method in numerical linear algebra. Two sequences of polynomials related to the residual vectors and the conjugate vectors are reviewed, where the residual polynomials are analogous to orthogonal polynomials in approximation theory and the roots of the polynomials reveal certain information about the coefficient matrix. The convergence rates of the steepest descent method and CG are reconsidered from a viewpoint different from that of textbooks. The connection of infinite-dimensional CG with finite-dimensional preconditioned CG is also reviewed via the numerical solution of an elliptic equation.
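For readers unfamiliar with the method being surveyed, the following is a minimal Python/NumPy sketch of the standard (unpreconditioned) CG iteration for a symmetric positive definite system Ax = b. It is an illustration of the textbook algorithm only, not code from the paper; the function name, tolerance, and test problem are assumptions made here for demonstration.

import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    # Solve A x = b for symmetric positive definite A using CG.
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.copy()
    r = b - A @ x            # initial residual
    p = r.copy()             # initial search (conjugate) direction
    rs_old = r @ r
    max_iter = n if max_iter is None else max_iter
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)        # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p    # next direction, A-conjugate to previous ones
        rs_old = rs_new
    return x

# Example on a random SPD system (hypothetical test data)
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)            # symmetric positive definite
b = rng.standard_normal(50)
x = conjugate_gradient(A, b)
print(np.linalg.norm(A @ x - b))         # residual norm should be near tol

The short recurrence for the new direction p is what ties CG to the residual and conjugate polynomials discussed in the abstract: each residual is a polynomial in A applied to the initial residual.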
