We propose a convex optimization approach to the nonparametric regression estimation problem when the underlying regression function is Lipschitz continuous. The approach minimizes the sum of empirical squared errors subject to the constraints implied by Lipschitz continuity. The resulting optimization problem has a convex objective function and linear constraints, and is therefore efficiently solvable. The estimator computed by this technique is proven to converge to the underlying regression function uniformly and almost surely as the sample size grows to infinity, thus providing a very strong form of consistency. We also propose a convex optimization approach to the maximum likelihood estimation of unknown parameters in statistical models where the parameters depend continuously on some observable input variables. For a number of classical distributional forms, the objective function of the underlying optimization problem is convex and the constraints are linear; these problems are therefore also efficiently solvable.
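As a concrete illustration of the first formulation, the sketch below fits Lipschitz-constrained least squares in one dimension, where it suffices to constrain consecutive sample points after sorting by the input. The synthetic data, the Lipschitz constant `L`, and the use of SciPy's SLSQP solver are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

# Hypothetical data: noisy samples of the 1-Lipschitz function |x - 0.5|.
rng = np.random.default_rng(0)
n = 30
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.abs(x - 0.5) + 0.05 * rng.standard_normal(n)

L = 1.0  # assumed (known) Lipschitz constant

# Decision variables: fitted values f_i at the sample points.
# Linear constraints: |f_{i+1} - f_i| <= L * (x_{i+1} - x_i) for consecutive
# sorted points; by the triangle inequality this implies all pairwise bounds.
D = np.diff(np.eye(n), axis=0)          # rows are e_{i+1} - e_i
gaps = L * np.diff(x)
lipschitz = LinearConstraint(D, -gaps, gaps)

# Convex objective: sum of empirical squared errors.
res = minimize(lambda f: np.sum((f - y) ** 2), x0=y, constraints=[lipschitz])
f_hat = res.x
```

The result `f_hat` is the constrained least-squares fit at the sample points; between samples, any Lipschitz interpolant (e.g. piecewise linear) preserves the constraint.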
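For the maximum likelihood formulation, one distributional form with a convex objective is the Poisson model, whose negative log-likelihood is convex in the rate parameters. The sketch below is a hedged illustration under that assumption: the rate function, data, Lipschitz bound, and solver are all choices made for the example, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

# Hypothetical data: Poisson counts whose rate depends Lipschitz-continuously
# on an observable input x.
rng = np.random.default_rng(1)
n = 40
x = np.sort(rng.uniform(0.0, 1.0, n))
y = rng.poisson(2.0 + x)  # true rate 2 + x is 1-Lipschitz in x

L = 2.0  # assumed bound on the Lipschitz constant of the rate function

# Linear constraints on consecutive rates: |lam_{i+1} - lam_i| <= L*(x_{i+1}-x_i).
D = np.diff(np.eye(n), axis=0)
gaps = L * np.diff(x)
lipschitz = LinearConstraint(D, -gaps, gaps)

def nll(lam):
    # Poisson negative log-likelihood (dropping the constant log(y!) term),
    # convex in each rate lam_i; clip guards the log during line searches.
    return np.sum(lam - y * np.log(np.clip(lam, 1e-12, None)))

res = minimize(nll, x0=np.full(n, y.mean() + 1e-3),
               bounds=[(1e-6, None)] * n, constraints=[lipschitz])
lam_hat = res.x
```

The fitted rates `lam_hat` maximize the likelihood subject to the same kind of linear Lipschitz constraints as the regression estimator.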