Toward the optimal preconditioned eigensolver: Locally optimal block preconditioned conjugate gradient method

Authors
Knyazev, A.V.
Citation
A.V. Knyazev, Toward the optimal preconditioned eigensolver: Locally optimal block preconditioned conjugate gradient method, SIAM J. Sci. Comput., 23(2), 2001, pp. 517-541
Citations number
44
Subject categories
Mathematics
Journal title
SIAM JOURNAL ON SCIENTIFIC COMPUTING
ISSN journal
1064-8275
Volume
23
Issue
2
Year of publication
2001
Pages
517 - 541
Database
ISI
SICI code
1064-8275(20010815)23:2<517:TTOPEL>2.0.ZU;2-1
Abstract
We describe new algorithms of the locally optimal block preconditioned conjugate gradient (LOBPCG) method for symmetric eigenvalue problems, based on a local optimization of a three-term recurrence, and suggest several other new methods. To be able to compare numerically different methods in the class, with different preconditioners, we propose a common system of model tests, using random preconditioners and initial guesses. As the "ideal" control algorithm, we advocate the standard preconditioned conjugate gradient method for finding an eigenvector as an element of the null-space of the corresponding homogeneous system of linear equations under the assumption that the eigenvalue is known. We recommend that every new preconditioned eigensolver be compared with this "ideal" algorithm on our model test problems in terms of the speed of convergence, costs of every iteration, and memory requirements. We provide such comparison for our LOBPCG method. Numerical results establish that our algorithm is practically as efficient as the "ideal" algorithm when the same preconditioner is used in both methods. We also show numerically that the LOBPCG method provides approximations to first eigenpairs of about the same quality as those by the much more expensive global optimization method on the same generalized block Krylov subspace. We propose a new version of block Davidson's method as a generalization of the LOBPCG method. Finally, direct numerical comparisons with the Jacobi-Davidson method show that our method is more robust and converges almost two times faster.
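
A minimal single-vector sketch of the locally optimal preconditioned iteration described in the abstract: each step performs Rayleigh-Ritz over the trial subspace spanned by the current iterate, the preconditioned residual, and the previous search direction (the three-term recurrence). This is an illustrative simplification, not the paper's block algorithm; the function name lobpcg_single, the diagonal test matrix, the Jacobi preconditioner, and the stopping rule are assumptions made for the example.

import numpy as np
from scipy.linalg import eigh

def lobpcg_single(A, T, x0, maxit=200, tol=1e-8):
    # Approximate the smallest eigenpair of a symmetric matrix A by
    # Rayleigh-Ritz over span{x, w, p}, where w = T(A x - lam x) is the
    # preconditioned residual and p is the previous search direction.
    x = x0 / np.linalg.norm(x0)
    lam = x @ A @ x                     # Rayleigh quotient of the iterate
    p = None                            # no previous direction on step 1
    for _ in range(maxit):
        r = A @ x - lam * x             # eigenvalue residual
        if np.linalg.norm(r) < tol:
            break
        w = T(r)                        # apply the preconditioner
        cols = [x, w] if p is None else [x, w, p]
        S, _ = np.linalg.qr(np.column_stack(cols))  # orthonormal trial basis
        theta, C = eigh(S.T @ A @ S)    # Rayleigh-Ritz on the small problem
        x_new = S @ C[:, 0]             # Ritz vector for the smallest Ritz value
        p = x_new - x * (x @ x_new)     # part of the update orthogonal to x
        x = x_new / np.linalg.norm(x_new)
        lam = theta[0]                  # Ritz value = Rayleigh quotient of x
    return lam, x

# Usage on a simple SPD test matrix with a Jacobi (diagonal) preconditioner.
rng = np.random.default_rng(0)
n = 200
A = np.diag(np.arange(1.0, n + 1))
T = lambda r: r / np.diag(A)
lam, x = lobpcg_single(A, T, rng.standard_normal(n))
print(lam)   # converges toward the smallest eigenvalue, 1.0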