We study the self-scaling BFGS method of Oren and Luenberger (1974) for solving unconstrained optimization problems. For general convex functions, we prove that the method is globally convergent with inexact line searches. We also show that the directions generated by the self-scaling BFGS method approach Newton's direction asymptotically. This would ensure superlinear convergence if, in addition, the search directions were well-scaled, but we show that this is not always the case. We find that the method has a major drawback: to achieve superlinear convergence it may be necessary to evaluate the function twice per iteration, even very near the solution. An example is constructed to show that the step-sizes required to achieve a superlinear rate converge to 2 and 0.5 alternately.
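To make the update concrete, the following is a minimal sketch of the Oren–Luenberger self-scaling BFGS iteration on a 2-D convex quadratic. It scales the Hessian approximation B by the factor gamma = (y^T s)/(s^T B s) before the usual BFGS update. This is an illustration under simplifying assumptions (an exact line search on a quadratic, not the inexact line searches analyzed here), and all helper names are ours.

```python
# Illustrative self-scaling BFGS sketch on f(x) = 0.5 * x^T A x (hypothetical
# test problem); helper functions and names are ours, not from the paper.

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

A = [[1.0, 0.0], [0.0, 10.0]]      # Hessian of the quadratic test function
grad = lambda x: matvec(A, x)

x = [1.0, 1.0]                     # starting point
B = [[1.0, 0.0], [0.0, 1.0]]       # initial Hessian approximation

for k in range(20):
    g = grad(x)
    if dot(g, g) < 1e-20:          # gradient essentially zero: stop
        break
    # Solve B d = -g for the 2x2 case by Cramer's rule.
    det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
    d = [(-g[0] * B[1][1] + g[1] * B[0][1]) / det,
         (-g[1] * B[0][0] + g[0] * B[1][0]) / det]
    # Exact line search for a quadratic: alpha = -g^T d / (d^T A d).
    alpha = -dot(g, d) / dot(d, matvec(A, d))
    x_new = [x[i] + alpha * d[i] for i in range(2)]
    s = [x_new[i] - x[i] for i in range(2)]
    y = [grad(x_new)[i] - g[i] for i in range(2)]
    Bs = matvec(B, s)
    gamma = dot(y, s) / dot(s, Bs)  # Oren-Luenberger scaling factor
    # Self-scaling BFGS: scale B by gamma, then apply the BFGS update.
    B = [[gamma * (B[i][j] - Bs[i] * Bs[j] / dot(s, Bs))
          + y[i] * y[j] / dot(y, s)
          for j in range(2)] for i in range(2)]
    x = x_new

print(x)  # converges toward the minimizer [0, 0]
```

With gamma fixed at 1 this reduces to the ordinary BFGS update; the scaling step is the method's distinguishing feature.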