This paper presents a decomposition method for solving convex minimization problems. At each iteration, the algorithm computes two proximal steps in the dual variables and one proximal step in the primal variables. We derive the algorithm from Rockafellar's proximal method of multipliers, which involves an augmented Lagrangian with an additional quadratic proximal term. The algorithm preserves the good features of the proximal method of multipliers, with the additional advantage that it leads to a decoupling of the constraints and is thus suitable for parallel implementation. We allow the proximal minimization steps to be computed approximately and prove that, under mild assumptions on the problem's data, the method is globally convergent at a linear rate. The method is compared with alternating direction type methods and applied to the particular case of minimizing a convex function over a finite intersection of closed convex sets.
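To illustrate the kind of decoupling the abstract describes, the following is a minimal sketch of a consensus-splitting scheme of alternating direction type (not the paper's exact two-dual/one-primal algorithm) for minimizing a convex function over a finite intersection of closed convex sets. Each set C_i receives its own copy y_i of the primal variable x and its own multiplier u_i, so the per-set proximal steps reduce to independent projections that can run in parallel; all function and variable names here are illustrative assumptions.

```python
import numpy as np

# Sketch: minimize f(x) = 0.5*||x - a||^2 over the intersection of m closed
# convex sets. Reformulate with copies y_i = x; the projections onto the
# individual sets then decouple and could be computed in parallel.

def proj_ball(z, radius=1.0):
    """Euclidean projection onto the ball ||z|| <= radius."""
    n = np.linalg.norm(z)
    return z if n <= radius else z * (radius / n)

def proj_halfspace(z, g, b):
    """Euclidean projection onto the halfspace {x : g.x <= b}."""
    viol = g @ z - b
    return z if viol <= 0 else z - viol * g / (g @ g)

def decomposed_min(a, projections, rho=1.0, iters=500):
    """Consensus-splitting sketch for min 0.5*||x - a||^2 s.t. x in all sets."""
    m = len(projections)
    x = a.copy()
    u = [np.zeros_like(a) for _ in range(m)]
    for _ in range(iters):
        # Decoupled proximal (projection) steps, one per constraint set.
        y = [P(x - u[i]) for i, P in enumerate(projections)]
        # Primal proximal step: prox of f at the average of y_i + u_i,
        # which for f(x) = 0.5*||x - a||^2 is available in closed form.
        v = sum(y[i] + u[i] for i in range(m)) / m
        x = (a + m * rho * v) / (1.0 + m * rho)
        # Multiplier (dual) updates.
        u = [u[i] + y[i] - x for i in range(m)]
    return x

a = np.array([2.0, 2.0])
# Two sets: the unit ball and the halfspace {x : x1 >= 0.5}; their
# intersection is nonempty, so the minimizer is the projection of a onto it.
sets = [proj_ball, lambda z: proj_halfspace(z, np.array([-1.0, 0.0]), -0.5)]
x_star = decomposed_min(a, sets)
```

For this instance the minimizer is the projection of a = (2, 2) onto the intersection, namely (1/sqrt(2), 1/sqrt(2)), since the projection onto the unit ball already satisfies the halfspace constraint.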