The present paper proposes a new learning model, called stochastic finite learning, and shows the whole class of pattern languages to be learnable within this model.
This main result is achieved by providing a new and improved average-case analysis of the Lange-Wiehagen algorithm (New Generation Computing, 8, 361-370), which learns the class of all pattern languages in the limit from positive data. The complexity measure chosen is the total learning time, i.e., the overall time taken by the algorithm until convergence. The expectation of the total learning time is carefully analyzed, and exponentially shrinking tail bounds for it are established for a large class of probability distributions. For every pattern π containing k different variables it is shown that Lange and Wiehagen's algorithm possesses an expected total learning time of O(α̂^k · E[Λ] · log_{1/β}(k)), where α̂ and β are two easily computable parameters arising naturally from the underlying probability distributions, and E[Λ] is the expected example string length.
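To make the object of the analysis concrete, the core update step of a Lange-Wiehagen-style learner can be sketched as follows. This is an illustrative Python toy, not the paper's formal presentation: the representation of variables as tuples, the helper name `lw_update`, and the exact variable-reuse rule for same-length merges are assumptions made for this sketch.

```python
def lw_update(hyp, w):
    """One update step of a Lange-Wiehagen-style learner (illustrative sketch).

    hyp: current hypothesis pattern as a list of symbols, where constants are
         single characters and variables are tuples ('x', j); None before the
         first example is seen.
    w:   a new positive example string.
    """
    if hyp is None or len(w) < len(hyp):
        # A strictly shorter example: restart with it as a constant pattern,
        # since the target pattern can be no longer than its shortest example.
        return list(w)
    if len(w) > len(hyp):
        # A longer example leaves a minimal-length hypothesis unchanged.
        return hyp
    # Same length: keep positions where the constant agrees with the example;
    # every other position becomes a variable.  Positions showing the same
    # (hypothesis-symbol, example-symbol) pair reuse the same variable.
    var_of = {}
    merged = []
    for p_i, w_i in zip(hyp, w):
        if p_i == w_i and not isinstance(p_i, tuple):
            merged.append(p_i)          # agreeing constant survives
        else:
            key = (p_i, w_i)
            if key not in var_of:
                var_of[key] = ('x', len(var_of))   # fresh variable
            merged.append(var_of[key])
    return merged
```

For instance, feeding the positive examples "aaa" and "bab" (both generable from the pattern x0 a x0) yields the hypothesis [('x', 0), 'a', ('x', 0)], i.e., the pattern x0 a x0. The average-case analysis in the paper concerns how long, in expectation, such a learner takes to converge under the stated class of distributions.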
Finally, assuming a bit of domain knowledge concerning the underlying class of probability distributions, it is shown how to convert learning in the limit into stochastic finite learning.