In this paper we consider several algorithms for reducing the storage required when using a quasi-Newton method in a dogleg trust region setting to minimize functions of many variables. Secant methods require O(n^2) locations to store an approximate Hessian and O(n^2) operations per iteration when minimizing a function of n variables; this storage requirement becomes impractical when n is large. Our algorithms use a BFGS update and require kn storage and 4kn + O(k^2) operations per iteration, although they may require more iterations than standard trust region techniques. Typically k is between 10 and 100. Our dogleg trust region strategies involve expressions containing matrix products with both the Hessian and its inverse. Our techniques for updating expressions for the Hessian and its inverse can also be used to improve the performance of line search, limited memory algorithms.
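As a rough illustration of the kn-storage idea (not the paper's specific dogleg algorithm), a limited memory BFGS inverse-Hessian product can be computed with the standard two-loop recursion over the k stored update pairs. The function and variable names below are illustrative, and the demo quadratic is an assumption for testing, not from the paper:

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Return -H*grad, where H is the L-BFGS inverse-Hessian
    approximation built from the k stored (s, y) pairs.
    Storage is k pairs of n-vectors (kn locations); the two loops
    cost O(kn) operations, matching the limited memory idea."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    # Scale by the usual initial inverse-Hessian guess gamma*I,
    # with gamma = s_k^T y_k / y_k^T y_k.
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    # Second loop: oldest pair to newest (alphas reversed to match).
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos),
                              reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return -q

# Illustrative demo on a diagonal quadratic f(x) = 0.5 x^T A x,
# whose exact curvature pairs satisfy y = A s.
rng = np.random.default_rng(0)
A = np.diag([1.0, 2.0, 3.0, 4.0, 5.0])
s_list = [rng.standard_normal(5) for _ in range(3)]   # k = 3 stored steps
y_list = [A @ s for s in s_list]
g = A @ np.ones(5)                                    # gradient at x = (1,...,1)
d = lbfgs_direction(g, s_list, y_list)
```

Because H is positive definite whenever every stored pair has s^T y > 0, the returned direction is a descent direction, and the most recent pair satisfies the secant condition H y_k = s_k exactly.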