We propose a Bayesian framework for regression problems, covering areas usually dealt with by function approximation. An online learning algorithm is derived which solves regression problems with a Kalman filter. Its solution always improves with increasing model complexity, without the risk of over-fitting. In the infinite-dimension limit it approaches the true Bayesian posterior. The issues of prior selection and over-fitting are also discussed, showing that some commonly held beliefs are misleading. The practical implementation is summarised. Simulations using 13 popular publicly available data sets demonstrate the method and highlight important issues concerning the choice of priors.
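As a minimal illustration of the kind of algorithm referred to above (not the paper's specific method), online Bayesian linear regression can be run as a sequence of Kalman-filter updates: each new example refines a Gaussian posterior over the weights. All names and hyperparameter values here (`kalman_regression_step`, `noise_var`, the prior scale) are illustrative assumptions.

```python
import numpy as np

def kalman_regression_step(theta, P, x, y, noise_var=1.0):
    """One Kalman-filter-style update of Bayesian linear regression.

    theta: current posterior mean of the weights
    P: current posterior covariance of the weights
    x: feature vector of the new example
    y: scalar target
    noise_var: assumed observation-noise variance (a hyperparameter)
    """
    Px = P @ x
    s = x @ Px + noise_var                 # predictive variance of y
    k = Px / s                             # Kalman gain
    theta = theta + k * (y - x @ theta)    # posterior mean update
    P = P - np.outer(k, Px)                # posterior covariance update
    return theta, P

# Stream synthetic data from y = 2*x0 - x1 + noise and fit online.
rng = np.random.default_rng(0)
theta = np.zeros(2)
P = 10.0 * np.eye(2)                       # broad Gaussian prior on the weights
true_w = np.array([2.0, -1.0])
for _ in range(500):
    x = rng.normal(size=2)
    y = true_w @ x + 0.1 * rng.normal()
    theta, P = kalman_regression_step(theta, P, x, y, noise_var=0.01)
```

After the stream, `theta` should be close to the generating weights and `P` reflects the remaining posterior uncertainty; the update is the standard recursive-least-squares recursion, here read as exact Bayesian inference under a Gaussian prior and Gaussian noise.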