This paper provides several new properties of the nonlinear conjugate gradient method in [5]. Firstly, the method is proved to have a certain self-adjusting property that is independent of the line search and the function convexity. Secondly, under mild assumptions on the objective function, the method is shown to be globally convergent with a variety of line searches. Thirdly, we find that, instead of the negative gradient direction, the search direction defined by the nonlinear conjugate gradient method in [5] can be used to restart any optimization method while guaranteeing the global convergence of the method. Some numerical results are also presented.
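To illustrate the kind of iteration the abstract refers to, the following is a minimal sketch of a generic nonlinear conjugate gradient method with a backtracking Armijo line search. The specific method of [5] and its update formula are not reproduced here; the Fletcher–Reeves choice of the parameter beta and the steepest-descent restart safeguard below are illustrative assumptions, not the construction analyzed in the paper.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=5000):
    """Generic nonlinear CG sketch with Armijo backtracking.

    Uses the Fletcher-Reeves beta as an illustrative choice; the
    method of [5] uses its own formula, which is not shown here.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial search direction: negative gradient
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search along d
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * g.dot(d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        # Fletcher-Reeves parameter (illustrative assumption)
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        # Safeguard: restart with steepest descent if d is
        # not a descent direction for g_new
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strongly convex quadratic, for example `f(x) = 0.5 x^T A x - b^T x` with a small symmetric positive definite `A`, this sketch drives the gradient norm below the tolerance in a modest number of iterations.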