In the present work, we explore a general framework for designing new minimization algorithms with desirable characteristics, namely, supervisor-searcher cooperation. We propose a class of algorithms within this framework and examine a gradient algorithm in the class. Global convergence is established for the deterministic, noise-free case, and the convergence rate is studied. Both theoretical analysis and numerical tests show that the algorithm is efficient in the deterministic case. Furthermore, the absence of a line search procedure in the algorithm appears to strengthen its robustness, so that it effectively tackles test problems with strong stochastic noise. The numerical results for both deterministic and stochastic test problems illustrate the appealing attributes of the algorithm.