The high-order Boltzmann machine (HOBM) approximates probability distributions defined on a set of binary variables through a learning algorithm that uses Monte Carlo methods. The approximation distribution is a normalized exponential of a consensus function formed by high-degree terms, and the structure of the HOBM is given by the set of weighted connections.
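For concreteness, this distribution has the standard exponential-family form (the notation below is ours, not the paper's): with binary state $x \in \{0,1\}^n$ and one weight $w_\alpha$ per weighted connection, i.e. per subset $\alpha$ of jointly connected units,
\[
  p_w(x) \;=\; \frac{e^{C_w(x)}}{Z_w},
  \qquad
  C_w(x) \;=\; \sum_{\alpha} w_\alpha \prod_{i \in \alpha} x_i,
  \qquad
  Z_w \;=\; \sum_{x \in \{0,1\}^n} e^{C_w(x)},
\]
where $C_w$ is the consensus function and "high degree" means the subsets $\alpha$ may contain more than two units.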
We prove the convexity of the Kullback-Leibler divergence between the distribution to learn and the approximation distribution of the HOBM. We prove the convergence of the learning algorithm to the strict global minimum of the divergence, which corresponds to the maximum-likelihood estimate of the connection weights, thereby establishing the uniqueness of the solution. These theoretical results do not hold for the conventional Boltzmann machine, where the consensus function has only first- and second-degree terms and hidden units are used.
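As a sketch of the objective being minimized (standard definitions; again the notation is ours): with $q$ the distribution to learn,
\[
  D(q \,\|\, p_w) \;=\; \sum_x q(x) \ln \frac{q(x)}{p_w(x)},
  \qquad
  \frac{\partial D}{\partial w_\alpha}
  \;=\; \Big\langle \prod_{i \in \alpha} x_i \Big\rangle_{p_w}
  \;-\; \Big\langle \prod_{i \in \alpha} x_i \Big\rangle_{q},
\]
so gradient descent on $D$ raises $w_\alpha$ when the data moment exceeds the model moment (the classical Boltzmann learning rule), with the model expectation $\langle \cdot \rangle_{p_w}$ estimated by Monte Carlo. Since $-\sum_x q(x)\,C_w(x)$ is linear in $w$ and $\ln Z_w$ is a convex log-partition function, $D$ is convex in $w$ when all units are visible; with hidden units the model expectation involves a marginalization that destroys this structure, consistent with the abstract's remark about the conventional machine.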