Relative loss bounds for on-line density estimation with the exponential family of distributions

Citation
K.S. Azoury and M.K. Warmuth, Relative loss bounds for on-line density estimation with the exponential family of distributions, MACH LEARN, 43(3), 2001, pp. 211-246
Number of citations
45
Subject Categories
AI, Robotics and Automatic Control
Journal title
MACHINE LEARNING
Journal ISSN
0885-6125
Volume
43
Issue
3
Year of publication
2001
Pages
211 - 246
Database
ISI
SICI code
0885-6125(200106)43:3<211:RLBFOD>2.0.ZU;2-S
Abstract
We consider on-line density estimation with a parameterized density from the exponential family. The on-line algorithm receives one example at a time and maintains a parameter that is essentially an average of the past examples. After receiving an example the algorithm incurs a loss, which is the negative log-likelihood of the example with respect to the current parameter of the algorithm. An off-line algorithm can choose the best parameter based on all the examples. We prove bounds on the additional total loss of the on-line algorithm over the total loss of the best off-line parameter. These relative loss bounds hold for an arbitrary sequence of examples. The goal is to design algorithms with the best possible relative loss bounds. We use a Bregman divergence to derive and analyze each algorithm. These divergences are relative entropies between two exponential distributions. We also use our methods to prove relative loss bounds for linear regression.
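In the notation the abstract suggests (our symbols, not necessarily the paper's), the quantity being bounded after T trials is the relative loss

    \sum_{t=1}^{T} -\ln p(x_t \mid \theta_t) \;-\; \min_{\theta} \sum_{t=1}^{T} -\ln p(x_t \mid \theta)

where \theta_t is the on-line algorithm's parameter before trial t. Below is a minimal sketch of this setting, not the paper's exact algorithm or bounds: a unit-variance Gaussian (an exponential-family member) whose on-line parameter is the running average of the past examples. The initial pseudo-example mu0, its weight w0, and the example stream are illustrative choices, not taken from the paper.

    import numpy as np

    # Sketch of the on-line density estimation setting from the abstract.
    # Model: N(mu, 1). The learner's parameter is the average of past
    # examples; one pseudo-example at mu0 (weight w0) is an illustrative
    # regularization choice.

    def nll(x, mu):
        # Negative log-likelihood of x under a Gaussian with mean mu, variance 1.
        return 0.5 * np.log(2.0 * np.pi) + 0.5 * (x - mu) ** 2

    rng = np.random.default_rng(0)
    xs = rng.normal(loc=1.5, scale=1.0, size=200)  # arbitrary example stream

    mu0, w0 = 0.0, 1.0   # hypothetical prior mean and its weight
    s, n = mu0 * w0, w0  # running weighted sum and count
    online_loss = 0.0
    for x in xs:
        mu_t = s / n                 # current parameter: average of past examples
        online_loss += nll(x, mu_t)  # loss is charged before the update
        s += x                       # then fold the example into the average
        n += 1.0

    # The best off-line parameter for this loss is the mean of all examples.
    mu_star = xs.mean()
    offline_loss = nll(xs, mu_star).sum()

    print(f"on-line total loss : {online_loss:.2f}")
    print(f"off-line total loss: {offline_loss:.2f}")
    print(f"relative loss      : {online_loss - offline_loss:.2f}")

Charging the loss before the update is what makes the comparison to the best off-line parameter non-trivial: the on-line learner must predict each example using only the past, while the off-line comparator sees the whole sequence.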