We describe new algorithms of the locally optimal block preconditioned conjugate gradient (LOBPCG) method for symmetric eigenvalue problems, based on local optimization of a three-term recurrence, and suggest several other new methods. To compare different methods in this class numerically, with different preconditioners, we propose a common system of model tests using random preconditioners and initial guesses. As the "ideal" control algorithm, we advocate the standard preconditioned conjugate gradient method for finding an eigenvector as an element of the null space of the corresponding homogeneous system of linear equations, under the assumption that the eigenvalue is known. We recommend that every new preconditioned eigensolver be compared with this "ideal" algorithm on our model test problems in terms of speed of convergence, cost per iteration, and memory requirements. We provide such a comparison for our LOBPCG method. Numerical results establish that our algorithm is practically as efficient as the "ideal" algorithm when the same preconditioner is used in both methods. We also show numerically that the LOBPCG method provides approximations to the first eigenpairs of about the same quality as those obtained by the much more expensive global optimization method on the same generalized block Krylov subspace. We propose a new version of block Davidson's method as a generalization of the LOBPCG method. Finally, direct numerical comparisons with the Jacobi-Davidson method show that our method is more robust and converges almost twice as fast.
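To make the three-term recurrence concrete, here is a rough single-vector sketch of the locally optimal idea: at each step, a Rayleigh-Ritz procedure is applied on the span of the current iterate, the preconditioned residual, and an implicitly maintained conjugate direction. This is an illustrative sketch under our own implementation choices (explicit orthogonalization via QR, a dense preconditioner matrix, the helper name `lobpcg_smallest`), not the paper's block algorithm:

```python
import numpy as np

def lobpcg_smallest(A, T, x0, tol=1e-8, maxit=200):
    """Sketch of single-vector LOBPCG for the smallest eigenpair of symmetric A.

    A  : symmetric (n, n) matrix
    T  : preconditioner, an approximation of inv(A), as an (n, n) matrix
    x0 : initial guess vector
    """
    x = x0 / np.linalg.norm(x0)
    p = None                                  # previous search direction
    for _ in range(maxit):
        lam = x @ A @ x                       # Rayleigh quotient (x has unit norm)
        r = A @ x - lam * x                   # eigenvalue residual
        if np.linalg.norm(r) < tol:
            break
        w = T @ r                             # preconditioned residual
        # Rayleigh-Ritz on the trial subspace span{x, w, p}
        basis = [x, w] if p is None else [x, w, p]
        Q, _ = np.linalg.qr(np.column_stack(basis))  # orthonormalize for stability
        theta, Y = np.linalg.eigh(Q.T @ A @ Q)
        x_new = Q @ Y[:, 0]                   # Ritz vector of the smallest Ritz value
        # New conjugate direction: the part of x_new not along the old x
        p = x_new - (x @ x_new) * x
        p = None if np.linalg.norm(p) < 1e-14 else p / np.linalg.norm(p)
        x = x_new / np.linalg.norm(x_new)
    return float(x @ A @ x), x
```

A standard preconditioned steepest descent step would use only span{x, w}; carrying the extra direction p is what turns the method into a locally optimal three-term recurrence.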