The convergence behaviour of a number of algorithms based on minimizing residual norms over Krylov subspaces is not well understood. Residual or error bounds currently available are either too loose or depend on unknown constants that can be very large. In this paper we take another look at traditional as well as alternative ways of obtaining upper bounds on residual norms. In particular, we derive inequalities that utilize Chebyshev polynomials and compare them with standard inequalities. Copyright © 2000 John Wiley & Sons, Ltd.
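As background for the standard inequalities the abstract refers to, the classical Chebyshev-based bound for the conjugate gradient method on a symmetric positive definite system can be sketched as below. This is the textbook bound, not a result from this paper; the function name and the sample condition number are illustrative only.

```python
import math

def chebyshev_cg_bound(kappa: float, k: int) -> float:
    """Classical upper bound on the relative A-norm error after k CG steps:

        ||e_k||_A / ||e_0||_A <= 2 * ((sqrt(kappa) - 1) / (sqrt(kappa) + 1))**k

    where kappa is the 2-norm condition number of the SPD matrix A.
    This bound comes from scaled Chebyshev polynomials on [lambda_min, lambda_max];
    the paper's point is that such bounds can be loose in practice.
    """
    r = (math.sqrt(kappa) - 1.0) / (math.sqrt(kappa) + 1.0)
    return 2.0 * r**k

# Example: for a modestly conditioned problem (kappa = 4) the bound
# decays geometrically with the iteration count k.
for k in (0, 5, 10):
    print(k, chebyshev_cg_bound(4.0, k))
```

Note that the bound depends only on the extreme eigenvalues through `kappa`, which is one reason it can badly overestimate the actual residual when the spectrum is clustered.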