STATISTICAL PHYSICS THEORY OF QUERY LEARNING BY AN ENSEMBLE OF HIGHER-ORDER NEURAL NETWORKS

Citation
G. Deco and D. Obradovic, STATISTICAL PHYSICS THEORY OF QUERY LEARNING BY AN ENSEMBLE OF HIGHER-ORDER NEURAL NETWORKS, Physical Review E, Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics, 52(2), 1995, pp. 1953-1957
Citations number
18
Subject Categories
Physics, Mathematical; Physics, Fluids & Plasmas
ISSN journal
1063-651X
Volume
52
Issue
2
Year of publication
1995
Pages
1953 - 1957
Database
ISI
SICI code
1063-651X(1995)52:2<1953:SPTOQL>2.0.ZU;2-F
Abstract
Query learning aims to improve the generalization ability of a network that learns continuously by actively selecting nonredundant data, i.e., data that contain new information about the process. In this paper, we formulate the problem of query learning in the statistical mechanical framework. We define an information-theoretic measure of the informativeness of newly presented data in order to decide whether they should be used for the model update. Only data that carry new information about the underlying process are selected for learning. The informativeness of the new data is defined as the Kullback-Leibler distance between the likelihoods of the a posteriori parameter distributions obtained before and after the inclusion of the new data point. To make the problem analytically solvable, we formulate the theory for an ensemble of higher-order neural networks, i.e., for the case of polynomial models. A comparison with other theoretical approaches is included, as are simulations that validate the proposed theory.
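The selection criterion described in the abstract can be illustrated with a minimal sketch. This is not the authors' derivation: it assumes a Bayesian polynomial regression model (a stand-in for the "higher-order network") with Gaussian prior and noise, so the a posteriori parameter distribution is Gaussian in closed form, and it scores a candidate point by the Kullback-Leibler distance between the posteriors before and after including it. The degree, prior precision `alpha`, and noise precision `beta` are illustrative choices, not values from the paper.

```python
import numpy as np

def polynomial_features(x, degree):
    # Higher-order (polynomial) model: features 1, x, x^2, ..., x^degree
    return np.vander(x, degree + 1, increasing=True)

def posterior(Phi, y, alpha=1.0, beta=25.0):
    # Gaussian posterior over weights for Bayesian linear regression:
    # prior N(0, alpha^-1 I), observation noise precision beta
    k = Phi.shape[1]
    S = np.linalg.inv(alpha * np.eye(k) + beta * Phi.T @ Phi)
    m = beta * S @ Phi.T @ y
    return m, S

def kl_gaussian(m0, S0, m1, S1):
    # KL( N(m0, S0) || N(m1, S1) ) between two multivariate Gaussians
    k = len(m0)
    S1_inv = np.linalg.inv(S1)
    diff = m1 - m0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def informativeness(x_train, y_train, x_new, y_new, degree=3):
    # KL distance between the posteriors obtained before and after
    # including the candidate point (x_new, y_new)
    m0, S0 = posterior(polynomial_features(x_train, degree), y_train)
    m1, S1 = posterior(polynomial_features(np.append(x_train, x_new), degree),
                       np.append(y_train, y_new))
    return kl_gaussian(m0, S0, m1, S1)

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 20)
y = np.sin(np.pi * x) + 0.05 * rng.normal(size=20)

# A redundant point inside the densely sampled region vs. a point outside it:
kl_redundant = informativeness(x, y, 0.0, np.sin(0.0))
kl_novel = informativeness(x, y, 2.0, np.sin(2.0 * np.pi))
```

Under these assumptions the point far outside the sampled region shifts the posterior much more than the redundant one, so its KL score is larger; thresholding this score implements the "select only informative data" rule the abstract describes.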