CONVERGENCE PROPERTIES OF HIGH-ORDER BOLTZMANN MACHINES

Citation
F.X. Albizuri et al., CONVERGENCE PROPERTIES OF HIGH-ORDER BOLTZMANN MACHINES, Neural Networks, 9(9), 1996, pp. 1561-1567
Citations number
12
Categorie Soggetti
Mathematical Methods, Biology & Medicine; Computer Sciences, Special Topics; Computer Science, Artificial Intelligence; Neurosciences; Physics, Applied
Journal title
Neural Networks
ISSN journal
08936080
Volume
9
Issue
9
Year of publication
1996
Pages
1561 - 1567
Database
ISI
SICI code
0893-6080(1996)9:9<1561:CPOHBM>2.0.ZU;2-9
Abstract
The high-order Boltzmann machine (HOBM) approximates probability distributions defined on a set of binary variables, through a learning algorithm that uses Monte Carlo methods. The approximation distribution is a normalized exponential of a consensus function formed by high-degree terms, and the structure of the HOBM is given by the set of weighted connections. We prove the convexity of the Kullback-Leibler divergence between the distribution to learn and the approximation distribution of the HOBM. We prove the convergence of the learning algorithm to the strict global minimum of the divergence, which corresponds to the maximum likelihood estimate of the connection weights, establishing the uniqueness of the solution. These theoretical results do not hold in the conventional Boltzmann machine, where the consensus function has first- and second-degree terms and hidden units are used. Copyright (C) 1996 Elsevier Science Ltd.
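Illustrative sketch
The abstract describes the HOBM as a normalized exponential of a consensus function built from high-degree terms, trained by minimizing the Kullback-Leibler divergence between a target distribution and the model, which is equivalent to maximum likelihood estimation of the connection weights. The following minimal Python sketch is not the paper's algorithm (which uses Monte Carlo estimates of the model expectations); it enumerates all states of a small machine so the divergence and its gradient can be computed exactly. The subset-based parameterization, the learning rate, and the toy target distribution are assumptions chosen purely for illustration.

import itertools
import numpy as np

n = 3                                         # number of binary units
subsets = [s for r in range(1, n + 1)         # one connection per non-empty subset
           for s in itertools.combinations(range(n), r)]
rng = np.random.default_rng(0)
w = {s: 0.0 for s in subsets}                 # connection weights w_A

states = np.array(list(itertools.product([0, 1], repeat=n)))

def consensus(x, w):
    """Consensus function C(x) = sum over subsets A of w_A * prod_{i in A} x_i."""
    return sum(w[s] * np.prod(x[list(s)]) for s in subsets)

def model_distribution(w):
    """Normalized exponential of the consensus function (exact, for small n)."""
    c = np.array([consensus(x, w) for x in states])
    p = np.exp(c - c.max())
    return p / p.sum()

def kl(P, Q):
    """Kullback-Leibler divergence between target P and model Q."""
    return float(np.sum(P * np.log(P / Q)))

# Hypothetical target distribution over the 2^n states; any strictly positive
# distribution would do for this toy example.
P = rng.dirichlet(np.ones(len(states)))

# Gradient descent on the divergence: dKL/dw_A = <prod x_i>_model - <prod x_i>_target,
# so w_A grows when the target activates subset A more often than the model does.
eta = 0.5
for step in range(2000):
    Q = model_distribution(w)
    for s in subsets:
        prod = np.prod(states[:, list(s)], axis=1)
        grad = float(np.sum(Q * prod) - np.sum(P * prod))
        w[s] -= eta * grad

print("final KL divergence:", kl(P, model_distribution(w)))

Because the divergence is convex in the weights (the paper's main result for HOBMs without hidden units), the gradient descent above approaches the unique global minimum; in the full-order toy model that minimum is essentially zero divergence.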