Recurrent neural networks have the potential to develop internal representations that provide a useful encoding of the dynamics behind a sequence of inputs. In this paper we present a model of time-varying probability distributions that uses a two-layered columnar recurrent neural network in which each hidden unit has recurrent connections from context units representing the delayed outputs of that hidden unit. The probability distribution model can provide predictions in terms of a given probability density function of the context units, instead of the single guess usually provided by Elman-type recurrent neural networks. The advantage of this approach is the interpretable relationship between the context units and the dynamics behind the inputs. Computer simulations of a stochastic grammar and a discrete-trend time sequence demonstrate the capability of the model.
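To make the columnar recurrence concrete, the following is a minimal NumPy sketch of the architecture as described above: each hidden unit receives the shared input plus recurrent connections only from its own delay line of past outputs (its private context units), with no cross-unit recurrence. The dimensions, the tanh nonlinearity, and the softmax readout standing in for the paper's density-based prediction are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper):
N_IN, N_HIDDEN, N_OUT, N_DELAYS = 4, 8, 4, 3

# Input-to-hidden weights, shared across the hidden layer.
W_in = rng.normal(scale=0.1, size=(N_HIDDEN, N_IN))
# Columnar recurrence: each hidden unit i sees only its OWN
# N_DELAYS delayed outputs, so the recurrent weights form a
# per-unit vector rather than a full hidden-to-hidden matrix.
W_ctx = rng.normal(scale=0.1, size=(N_HIDDEN, N_DELAYS))
W_out = rng.normal(scale=0.1, size=(N_OUT, N_HIDDEN))

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def run(sequence):
    """Run the columnar RNN over a list of input vectors,
    returning a predictive distribution at each time step."""
    # context[i, k] holds the k-th delayed output of hidden unit i.
    context = np.zeros((N_HIDDEN, N_DELAYS))
    outputs = []
    for x in sequence:
        # Each unit combines the shared input drive with its own
        # delay line; there is no recurrence between units.
        h = np.tanh(W_in @ x + np.sum(W_ctx * context, axis=1))
        # Shift each unit's delay line and insert the new output.
        context = np.roll(context, 1, axis=1)
        context[:, 0] = h
        # Distribution over outcomes rather than a single guess.
        outputs.append(softmax(W_out @ h))
    return outputs

seq = [rng.normal(size=N_IN) for _ in range(5)]
for t, p in enumerate(run(seq)):
    print(t, np.round(p, 3))
```

In contrast to an Elman network, where a full hidden-to-hidden matrix mixes every unit's state, this columnar wiring keeps each context unit tied to a single hidden unit, which is what makes the relationship between context units and the input dynamics easier to interpret.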