We develop a framework employing scaling functions for the construction of multistep quasi-Newton methods for unconstrained optimization. These methods utilize values of the objective function: they are constructed via interpolants of the m+1 most recent iterates/gradient evaluations, and possess a free parameter which introduces an additional degree of flexibility. This permits the interpolating functions to assimilate information, in the form of function-values, which is readily available at each iteration. Motivated by previous experience [1] with the use of function-values in multistep methods, we investigate the incorporation of this information into the construction of the Hessian approximation at each iteration, in an attempt to accelerate convergence. We concentrate on a specific example from the general family of methods, corresponding to a particular choice of the scaling function, and from it derive three new algorithms. The relative numerical performance of these methods is assessed, and the most successful of them is then compared with the standard BFGS method and with an earlier algorithm utilizing function-values, also developed by the authors [1]. (C) 2001 Elsevier Science Ltd. All rights reserved.
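For orientation, a minimal sketch of the standard (single-step) BFGS method, which serves as the comparison baseline mentioned above, is given below. It is written in Python/NumPy with an illustrative backtracking line search; the function and variable names are assumptions for this sketch, and it does not reproduce the authors' multistep, function-value-based algorithms, whose update formulas are not given in this abstract.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    """Minimise f using the standard BFGS update on the inverse Hessian.

    f    : objective function R^n -> R
    grad : gradient of f
    x0   : starting point (1-D NumPy array)
    """
    n = x0.size
    x = x0.astype(float)
    H = np.eye(n)                  # inverse Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                 # quasi-Newton search direction
        # Backtracking line search enforcing a simple Armijo condition
        alpha, c = 1.0, 1e-4
        while f(x + alpha * p) > f(x) + c * alpha * (g @ p) and alpha > 1e-12:
            alpha *= 0.5
        s = alpha * p              # step: x_{k+1} - x_k
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g              # gradient difference
        sy = s @ y
        if sy > 1e-12:             # curvature condition; skip update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: minimise the Rosenbrock function from a standard starting point
if __name__ == "__main__":
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(bfgs(f, grad, np.array([-1.2, 1.0])))   # converges to [1, 1]
```

The multistep methods of the abstract differ from this baseline in that the step and gradient-difference vectors s and y are replaced by quantities derived from interpolants of the m+1 most recent iterates and gradients, optionally incorporating function-values; those constructions are developed in the paper itself.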