We discuss a novel strategy for training neural networks using sequential Monte Carlo algorithms and propose a new hybrid gradient descent/sampling importance resampling algorithm (HySIR). In terms of computational time and accuracy, the hybrid SIR is a clear improvement over conventional sequential Monte Carlo techniques. The new algorithm may be viewed as a global optimization strategy that allows us to learn the probability distributions of the network weights and outputs in a sequential framework. It is well suited to applications involving on-line, nonlinear, and non-Gaussian signal processing. We show how the new algorithm outperforms extended Kalman filter training on several problems. In particular, we address the problem of pricing option contracts traded in financial markets. In this context, we are able to estimate the one-step-ahead probability density functions of the option prices.
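To make the sequential training idea concrete, the following is a minimal sketch of plain sampling importance resampling (SIR) applied to the weights of a tiny network. It is not the HySIR algorithm itself (it omits the gradient descent step); the network architecture, noise levels, and particle count are illustrative assumptions. Each observation triggers a predict step (random-walk evolution of the weight particles), an importance-weighting step under a Gaussian likelihood, and a resampling step; the surviving particle cloud approximates the posterior over the weights, from which a one-step-ahead predictive density of the outputs can be formed.

```python
import numpy as np

rng = np.random.default_rng(0)

def net(w, x):
    # Hypothetical tiny network with one tanh hidden unit:
    # y = w2 * tanh(w0 * x + w1) + w3
    return w[..., 2] * np.tanh(w[..., 0] * x + w[..., 1]) + w[..., 3]

def sir_train(xs, ys, n_particles=500, q_std=0.05, r_std=0.1):
    """Plain SIR over the network weights (illustrative, not HySIR)."""
    d = 4  # number of weights in the toy network
    particles = rng.normal(0.0, 1.0, size=(n_particles, d))
    for x, y in zip(xs, ys):
        # Predict: evolve weight particles with process noise.
        particles = particles + rng.normal(0.0, q_std, size=particles.shape)
        # Update: importance weights from the Gaussian likelihood p(y | w).
        preds = net(particles, x)
        logw = -0.5 * ((y - preds) / r_std) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Resample: multinomial resampling proportional to the weights.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
    return particles

# Synthetic data from a known weight vector, then sequential estimation.
true_w = np.array([1.5, -0.3, 2.0, 0.5])
xs = rng.uniform(-2, 2, size=200)
ys = net(true_w, xs) + rng.normal(0.0, 0.1, size=xs.shape)
particles = sir_train(xs, ys)

# The particle cloud approximates p(weights | data); pushing it through
# the network at a test input gives a predictive density over outputs.
test_pred = net(particles, 1.0)
print(test_pred.mean(), test_pred.std())
```

The predictive density obtained from `test_pred` is the particle analogue of the one-step-ahead densities mentioned in the abstract: its spread reflects remaining weight uncertainty rather than a single point estimate.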