LONG-TERM ATTRACTION IN HIGHER-ORDER NEURAL NETWORKS

Authors
D. Burshtein
Citation
D. Burshtein, LONG-TERM ATTRACTION IN HIGHER-ORDER NEURAL NETWORKS, IEEE Transactions on Neural Networks, 9(1), 1998, pp. 42-50
Citations number
20
Subject Categories
Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture; Computer Science, Theory & Methods
ISSN journal
1045-9227
Volume
9
Issue
1
Year of publication
1998
Pages
42 - 50
Database
ISI
SICI code
1045-9227(1998)9:1<42:LAIHNN>2.0.ZU;2-F
Abstract
Recent results on the memory storage capacity of higher-order neural networks indicate a significant improvement over the limited capacity of the Hopfield model. However, such results have so far been obtained under the restriction that convergence is required in a single iteration (direct convergence). This paper presents a nondirect-convergence (long-term attraction) analysis of higher-order neural networks. Our main result is that for any kappa_d < d!*2^(d-1)/(2d)! and 0 <= rho < 1/2, a Hebbian higher-order neural network of order d with n neurons can store a random set of kappa_d*n^d/log n fundamental memories such that almost all memories have an attraction radius of size rho*n. If kappa_d < d!*2^(d-1)/((2d)!*(d+1)), then all memories possess this property simultaneously. This indicates that the lower bounds on the long-term attraction capacities are larger than the corresponding direct-convergence capacities by a factor of 1/(1-2*rho)^(2d). In addition, we upper bound the convergence rate (the number of iterations required to converge); this bound is asymptotically independent of n. Similar results are obtained for zero-diagonal higher-order neural networks.
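
To make the setting concrete, below is a minimal numerical sketch (ours, not the paper's construction) of Hebbian storage and retrieval in an order-d network. It uses the identity that the local field h_i = sum_mu x_i^mu (x^mu . s)^d is the contraction of the Hebbian weight tensor sum_mu x_i^mu x_{j1}^mu ... x_{jd}^mu with d copies of the state s, so the n^(d+1) tensor is never materialized. All names and parameter choices (n, d, m, rho) are illustrative; m is set well below the kappa_d*n^d/log n bound, and updates are synchronous with a simple fixed-point check rather than the paper's convergence-rate analysis.

import numpy as np

rng = np.random.default_rng(0)

def retrieve(X, probe, d, max_iters=50):
    # Synchronous retrieval in a Hebbian network of order d.
    # h_i = sum_mu x_i^mu (x^mu . s)^d  -- the order-d Hebbian weight
    # tensor contracted with d copies of the current state s.
    s = probe.copy()
    for _ in range(max_iters):
        overlaps = X @ s                   # overlap of s with each memory
        h = X.T @ (overlaps ** d)          # order-d local fields
        s_new = np.where(h >= 0, 1, -1)
        if np.array_equal(s_new, s):       # reached a fixed point
            break
        s = s_new
    return s

n, d, m = 200, 2, 30                       # m is illustrative, far below capacity
X = rng.choice([-1, 1], size=(m, n))       # m random fundamental memories

rho = 0.2                                  # corrupt rho*n coordinates of memory 0
probe = X[0].copy()
flip = rng.choice(n, size=int(rho * n), replace=False)
probe[flip] *= -1

recovered = retrieve(X, probe, d)
print("overlap with stored memory:", (recovered @ X[0]) / n)   # 1.0 on success

Note that each synchronous sweep costs O(m*n) operations in this form, whereas an explicit contraction of the order-d weight tensor would cost O(n^(d+1)); this is what makes a small demonstration of rho*n-radius attraction practical.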