The Monte Carlo wave-function method has recently proved to be an efficient tool in the analysis of linear dissipative quantum systems, i.e., systems with linear equations of motion for their density matrix. We generalize this method to systems with nonlinear master equations of a parametrized Lindblad form, which includes master equations obtained by Hartree-Fock approximations. Convergence properties of the algorithm are discussed in detail. The method is illustrated by a numerical analysis of the bosonic enhancement of laser cooling of trapped particles.
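To fix ideas, the linear Monte Carlo wave-function method that the abstract takes as its starting point can be sketched as follows. This is a minimal illustration for the simplest possible case, a two-level atom decaying at rate gamma with a single Lindblad operator C = sqrt(gamma)|g><e|; all parameter names and values here are illustrative and not taken from the paper, and the paper's nonlinear (parametrized Lindblad) generalization is not shown.

```python
import numpy as np

# Quantum-jump (Monte Carlo wave-function) sketch for a LINEAR master
# equation: spontaneous decay of a two-level atom at rate gamma.
# Parameters below are illustrative assumptions, not from the paper.
rng = np.random.default_rng(0)
gamma, dt = 1.0, 0.002          # decay rate, time step (gamma*dt << 1)
n_steps, n_traj = 1500, 400     # evolution length, number of trajectories

C = np.sqrt(gamma) * np.array([[0.0, 1.0],
                               [0.0, 0.0]])       # jump operator |g><e|
CdagC = C.conj().T @ C
H_eff = -0.5j * CdagC           # non-Hermitian effective Hamiltonian (H = 0)

pop_e = np.zeros(n_steps)       # trajectory-averaged excited-state population
for _ in range(n_traj):
    psi = np.array([0.0, 1.0], dtype=complex)    # start in |e>
    for k in range(n_steps):
        pop_e[k] += abs(psi[1]) ** 2
        # jump probability over one step: dp = dt <psi|C^dag C|psi>
        dp = dt * np.real(psi.conj() @ (CdagC @ psi))
        if rng.random() < dp:
            psi = C @ psi                        # quantum jump to |g>
        else:
            psi = psi - 1j * dt * (H_eff @ psi)  # no-jump evolution
        psi /= np.linalg.norm(psi)               # renormalize
pop_e /= n_traj

t = np.arange(n_steps) * dt
# Averaged over trajectories, pop_e should follow the master-equation
# solution exp(-gamma * t), up to statistical and O(dt) errors.
```

The key point is that each trajectory evolves a state vector (dimension N) rather than a density matrix (dimension N^2), and the master-equation solution is recovered as an ensemble average; the paper's contribution is extending this unraveling to master equations whose Lindblad operators depend on the state itself.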