Theoretical procedures are developed for comparing the performance of arbitrarily selected admissible feedback controls among themselves and with the optimal solution of a nonlinear stochastic optimal control problem. Iterative design schemes are proposed for successively improving the performance of a controller until a satisfactory design is achieved. Specifically, the exact design procedure is based on the generalized Hamilton-Jacobi-Bellman equation for the cost function of nonlinear stochastic systems, while the approximate design procedure for the infinite-time nonlinear stochastic regulator problem is developed using upper and lower bounds on the cost function. Stability of this problem is also considered. For a given controller, both upper and lower bounds on its cost function can be obtained by solving a partial differential inequality. These bounds, constructed without knowledge of the optimal controller, serve as measures for evaluating the acceptability of suboptimal controllers. These results establish an approximation theory of optimal stochastic control and provide a practical procedure for selecting effective controls for nonlinear stochastic systems. An entropy reformulation of the generalized Hamilton-Jacobi-Bellman equation is also presented.
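To make the construction concrete, the following is a minimal sketch of the generalized Hamilton-Jacobi-Bellman equation and the bounding inequality described above; the specific model form, symbols, and regularity assumptions below are illustrative assumptions, not notation taken from the abstract:

```latex
% Assumed controlled diffusion model:
%   dx = f(x, u(x))\,dt + g(x)\,dw
% Cost of a fixed admissible feedback control u (infinite-time regulator case):
%   V^{u}(x) = \mathbb{E}\!\left[\int_{0}^{\infty} L\big(x(t), u(x(t))\big)\,dt \;\middle|\; x(0) = x\right]
% Generalized Hamilton-Jacobi-Bellman equation satisfied by V^{u}:
\nabla V^{u}(x)^{\top} f\big(x, u(x)\big)
  + \tfrac{1}{2}\,\operatorname{tr}\!\big[g(x)\,g(x)^{\top}\,\nabla^{2} V^{u}(x)\big]
  + L\big(x, u(x)\big) = 0.
% Bounds via the associated partial differential inequality: any smooth
% candidate \bar{V} satisfying
%   \nabla\bar{V}^{\top} f + \tfrac{1}{2}\operatorname{tr}\big[g g^{\top} \nabla^{2}\bar{V}\big] + L \le 0
% majorizes the true cost, \bar{V}(x) \ge V^{u}(x); reversing the
% inequality yields a lower bound, without knowledge of the optimal controller.
```

Evaluating a candidate controller then reduces to exhibiting such a pair of functions and comparing the gap between the bounds against a design tolerance.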