We study probabilistic generative models parameterized by feedforward neural networks. An attractor dynamics for probabilistic inference in these models is derived from a mean field approximation for large, layered sigmoidal networks. Fixed points of the dynamics correspond to solutions of the mean field equations, which relate the statistics of each unit to those of its Markov blanket. We establish global convergence of the dynamics by providing a Lyapunov function and show that the dynamics generate the signals required for unsupervised learning. Our results for feedforward networks provide a counterpart to those of Cohen-Grossberg and Hopfield for symmetric networks.
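
As a rough illustration of the kind of attractor dynamics described here, the following is a minimal sketch of damped fixed-point iteration of mean field equations for a two-layer sigmoid belief network with clamped visible units. The network sizes, the weights J, the biases, and the specific form of the effective field are illustrative assumptions for this sketch, not the paper's exact mean field equations.

    import numpy as np

    # Hedged sketch, not the paper's derivation: damped fixed-point
    # iteration of simplified mean field equations for a two-layer
    # sigmoid belief network (hidden causes -> visible units). The
    # weights, biases, and effective field below are assumptions.

    rng = np.random.default_rng(0)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    n_hidden, n_visible = 4, 6
    J = rng.normal(scale=0.5, size=(n_visible, n_hidden))  # hidden -> visible weights
    b_h = np.zeros(n_hidden)                               # hidden biases
    b_v = np.zeros(n_visible)                              # visible biases

    s_v = rng.integers(0, 2, size=n_visible).astype(float) # clamped evidence

    mu = np.full(n_hidden, 0.5)  # mean field parameters mu_i = E[S_i]
    eta = 0.2                    # damping / step size of the relaxation dynamics

    for t in range(200):
        # Effective field on each hidden unit: its bias plus feedback
        # from its clamped children, here the mismatch between observed
        # values and current top-down predictions (a simplified stand-in
        # for the exact Markov-blanket terms).
        pred_v = sigmoid(J @ mu + b_v)      # top-down predictions of visible units
        field = b_h + J.T @ (s_v - pred_v)  # effective field on hidden units
        mu_new = (1.0 - eta) * mu + eta * sigmoid(field)  # damped (Euler) update
        if np.max(np.abs(mu_new - mu)) < 1e-8:  # reached a fixed point
            break
        mu = mu_new

    print(f"converged after {t} iterations; mu = {np.round(mu, 3)}")

At a fixed point, each mu_i is consistent with the statistics of its Markov blanket, mirroring the role the mean field equations play above; the damping factor eta plays the part of the time constant in the continuous attractor dynamics.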