We consider conceptual optimization methods combining two ideas: the Moreau-Yosida regularization in convex analysis, and quasi-Newton approximations of smooth functions. We outline several approaches based on this combination and establish their global convergence. We then study theoretically the local convergence properties of one of these approaches, which uses quasi-Newton updates of the objective function itself. We also obtain a globally and superlinearly convergent BFGS proximal method. At each step of our study, we single out the assumptions that are needed to derive the result concerned.
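To fix ideas on the first ingredient, the following is a minimal numerical sketch (not taken from the paper) of the Moreau-Yosida regularization for the simple convex function f(x) = |x|, whose proximal operator is soft-thresholding; the function names `prox_abs` and `moreau_envelope_abs` and the parameter `lam` are illustrative choices, not notation from the text.

```python
import numpy as np

def prox_abs(x, lam):
    # Proximal operator of f(x) = |x|: soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_envelope_abs(x, lam):
    # Moreau-Yosida regularization F_lam(x) = min_y |y| + (y - x)^2 / (2*lam),
    # with the minimum attained at y = prox_abs(x, lam).
    p = prox_abs(x, lam)
    return np.abs(p) + (p - x) ** 2 / (2.0 * lam)

# The envelope of |x| is the Huber function: smooth (differentiable),
# with the same minimizer x = 0 as the original nonsmooth f.
x = np.linspace(-2.0, 2.0, 5)
print(moreau_envelope_abs(x, 1.0))
```

The envelope is as smooth as a quadratic near the kink of f, which is what makes quasi-Newton approximations of it meaningful.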