The paper proposes a new inference mechanism based on iterative application of the Bayesian inference scheme. The procedure iteratively computes the optimal component weights of a distribution mixture from a class called a generalized knowledge base. It is proved that the iterative process converges to a unique limit, whereby the resulting probability distribution can be characterized as the information-divergence projection of the input distribution onto the generalized knowledge base. The iterative inference mechanism resembles the natural process of cognition as an iteratively improving understanding of the input information.
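As a rough illustration of the kind of iteration described above (not the paper's own algorithm), the following sketch re-estimates mixture component weights by repeated Bayesian updating over discrete component distributions. The function name, the toy components `F`, and the input distribution `p` are all hypothetical; the update is the standard EM-type fixed point whose limit is the information-divergence projection of the input distribution onto the mixture family.

```python
import numpy as np

def iterate_weights(p, F, n_iter=200):
    """Iteratively re-estimate mixture weights by Bayesian inference.

    p: input distribution over states, shape (S,)
    F: fixed component distributions, one row per component, shape (C, S)
    Returns the converged component weights, shape (C,).
    """
    C = F.shape[0]
    w = np.full(C, 1.0 / C)                # start from uniform weights
    for _ in range(n_iter):
        mix = w @ F                        # mixture probability of each state
        # Bayesian posterior of component j given state x: w_j F[j, x] / mix[x]
        post = (w[:, None] * F) / mix[None, :]
        # new weight = posterior of the component averaged under the input p
        w = post @ p
    return w

# toy example: two hypothetical components over three states
F = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.3, 0.6]])
p = np.array([0.4, 0.25, 0.35])
w = iterate_weights(p, F)
```

Each pass is one application of the Bayes scheme: compute the posterior of each component given the data distribution, then take the expected posterior as the new weight vector. The weights remain a probability vector at every step, and the iteration converges monotonically in information divergence.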