At a given point $\bar{p}$, a convex function $f$ is differentiable in a certain subspace $U$ (the subspace along which the subdifferential $\partial f(\bar{p})$ has 0-breadth). This property opens the way to defining a suitably restricted second derivative of $f$ at $\bar{p}$. We do this via an intermediate function, convex on $U$. We call this function the $U$-Lagrangian; it coincides with the ordinary Lagrangian in composite cases: exact penalty, semidefinite programming. We also use this new theory to design a conceptual pattern for superlinearly convergent minimization algorithms. Finally, we establish a connection with the Moreau--Yosida regularization.
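
For context, the Moreau--Yosida regularization mentioned above has a standard definition in convex analysis; the following is that standard form (with a regularization parameter $\lambda > 0$, a common but not universal convention), not necessarily the exact normalization used in this work:
\[
F_\lambda(x) \;=\; \min_{y \in \mathbb{R}^n} \left\{ f(y) + \frac{1}{2\lambda}\,\|y - x\|^2 \right\},
\]
whose unique minimizer is the proximal point of $x$. The function $F_\lambda$ is convex and differentiable with Lipschitz gradient even when $f$ is not smooth, which is what makes a connection to restricted second-order objects for $f$ natural.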