In this paper, a new gradient-based neural network is constructed on the basis of duality theory, optimization theory, convex analysis, Lyapunov stability theory, and the LaSalle invariance principle to solve linear and quadratic programming problems. In particular, a new function F(x, y) is introduced into the energy function E(x, y) so that E(x, y) is convex and differentiable, and the resulting network is more efficient. This network incorporates all the relevant necessary and sufficient optimality conditions for convex quadratic programming problems. For linear programming (LP) and quadratic programming (QP) problems with either a unique solution or infinitely many solutions, we prove strictly that, for any initial point, every trajectory of the neural network converges to an optimal solution of the QP and its dual problem. The proposed network differs from existing networks that use the penalty method or the Lagrange method, and it properly handles inequality (including nonnegativity) constraints. The theory of the proposed network is rigorous and its performance is much better. Simulation results also show that the proposed neural network is feasible and efficient.
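To illustrate the general idea of a gradient-based neural network for convex QP, the following is a minimal sketch of a standard projection-type gradient dynamical system. Note that this is a generic construction, not the paper's specific network: the abstract does not give the concrete forms of E(x, y) and F(x, y), so the energy function, the projection dynamics dx/dt = -x + P(x - alpha * grad), and all numerical values below are illustrative assumptions.

```python
import numpy as np

# Illustrative convex QP: minimize 0.5 * x^T Q x + c^T x  subject to x >= 0.
# (Problem data chosen for the example; not from the paper.)
Q = np.array([[2.0, 0.0],
              [0.0, 2.0]])
c = np.array([-2.0, 2.0])
alpha = 0.1  # step parameter inside the projection dynamics

def dynamics(x):
    """Generic projection-type gradient network:
       dx/dt = -x + P(x - alpha * (Q x + c)),
       where P(.) projects onto the nonnegative orthant."""
    grad = Q @ x + c
    return -x + np.maximum(0.0, x - alpha * grad)

# Forward-Euler integration of one trajectory from an arbitrary initial point;
# for a convex QP, the trajectory settles at the constrained optimum.
x = np.array([0.5, 0.5])
h = 0.05
for _ in range(4000):
    x = x + h * dynamics(x)

print(x)  # approaches the optimal solution x* = [1, 0]
```

At the fixed point, x = P(x - alpha * (Qx + c)) is exactly the projection form of the KKT optimality conditions for this QP, which is the kind of "necessary and sufficient optimality conditions" the network above is built to encode.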