SMOOTH ONLINE LEARNING ALGORITHMS FOR HIDDEN MARKOV MODELS

Authors
P. Baldi, Y. Chauvin
Citation
P. Baldi and Y. Chauvin, SMOOTH ONLINE LEARNING ALGORITHMS FOR HIDDEN MARKOV MODELS, Neural Computation, 6(2), 1994, pp. 307-318
Number of citations
15
Subject Categories
Computer Sciences","Computer Science Artificial Intelligence",Neurosciences
Journal title
Neural Computation
ISSN journal
0899-7667
Volume
6
Issue
2
Year of publication
1994
Pages
307 - 318
Database
ISI
SICI code
0899-7667(1994)6:2<307:SOLAFH>2.0.ZU;2-Q
Abstract
A simple learning algorithm for Hidden Markov Models (HMMs) is presented together with a number of variations. Unlike other classical algorithms such as the Baum-Welch algorithm, the algorithms described are smooth and can be used online (after each example presentation) or in batch mode, with or without the usual Viterbi most likely path approximation. The algorithms have simple expressions that result from using a normalized-exponential representation for the HMM parameters. All the algorithms presented are proved to be exact or approximate gradient optimization algorithms with respect to likelihood, log-likelihood, or cross-entropy functions, and as such are usually convergent. These algorithms can also be cast in the more general EM (Expectation-Maximization) framework, where they can be viewed as exact or approximate GEM (Generalized Expectation-Maximization) algorithms. The mathematical properties of the algorithms are derived in the appendix.
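For a concrete picture of the idea the abstract describes, the following is a minimal NumPy sketch: the HMM transition and emission matrices are stored as unconstrained weights and mapped to probabilities through a normalized exponential (softmax), so a gradient step on the log-likelihood after each example presentation keeps the parameters valid and changes them smoothly. The function names, learning rate, and the particular scaled forward-backward routine are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(w):
    # Normalized-exponential representation: each row of unconstrained
    # weights becomes a valid probability row.
    e = np.exp(w - w.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def forward_backward(A, B, pi, obs):
    # Scaled forward-backward pass for one sequence: returns the
    # log-likelihood, state posteriors gamma[t, i], and expected
    # transition counts xi[i, j].
    T, n = len(obs), len(pi)
    alpha, beta, scale = np.zeros((T, n)), np.zeros((T, n)), np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]
    gamma = alpha * beta
    xi = np.zeros((n, n))
    for t in range(T - 1):
        xi += np.outer(alpha[t], B[:, obs[t + 1]] * beta[t + 1]) * A / scale[t + 1]
    return np.log(scale).sum(), gamma, xi

def online_step(wA, wB, pi, obs, lr=0.1):
    # One smooth online update after a single example presentation.
    A, B = softmax(wA), softmax(wB)
    ll, gamma, xi = forward_backward(A, B, pi, obs)
    # With the softmax parameterization, the log-likelihood gradient
    # reduces to expected counts minus their model-predicted share.
    wA += lr * (xi - xi.sum(axis=1, keepdims=True) * A)
    emit = np.zeros_like(B)
    for t, o in enumerate(obs):
        emit[:, o] += gamma[t]
    wB += lr * (emit - emit.sum(axis=1, keepdims=True) * B)
    return ll

# Toy usage: 2 hidden states, 3 symbols; weights updated after each sequence.
rng = np.random.default_rng(0)
wA, wB = rng.normal(size=(2, 2)), rng.normal(size=(2, 3))
pi = np.array([0.5, 0.5])
for seq in ([0, 1, 2, 1], [2, 2, 0], [1, 0, 1, 2]):
    ll = online_step(wA, wB, pi, np.array(seq), lr=0.1)
```

Replacing the expected counts from forward-backward with hard counts along the single most likely state sequence would give the Viterbi-approximation variant the abstract mentions.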