Recent results on the memory storage capacity of higher order neural networks indicate a significant improvement compared to the limited capacity of the Hopfield model. However, such results have so far been obtained under the restriction that only a single iteration is allowed to converge. This paper presents a nondirect convergence (long-term attraction) analysis of higher order neural networks. Our main result is that for any $\kappa_d < d!\,2^{d-1}/(2d)!$ and $0 \le \rho < 1/2$, a Hebbian higher order neural network of order $d$ with $n$ neurons can store a random set of $\kappa_d n^d/\log n$ fundamental memories such that almost all memories have an attraction radius of size $\rho n$.
If $\kappa_d < d!\,2^{d-1}/\big((2d)!\,(d+1)\big)$, then all memories possess this property simultaneously. This indicates that the lower bounds on the long-term attraction capacities are larger than the corresponding direct convergence capacities by a factor of $1/(1-2\rho)^{2d}$. In addition, we upper bound the convergence rate (the number of iterations required to converge). This bound is asymptotically independent of $n$. Similar results are obtained for zero diagonal higher order neural networks.
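To make the setting concrete, the following is a minimal sketch of a second order ($d = 2$) Hebbian higher order network with iterated (long-term) retrieval. It is illustrative only: the function names and the parameters $n = 40$, $m = 30$, $\rho = 0.2$ are assumptions chosen for the demo, not values from the analysis.

```python
import numpy as np

def hebbian_weights(patterns, d=2):
    """Order-d Hebbian rule: W[i, j1, ..., jd] = sum_mu x_i x_j1 ... x_jd.
    Diagonal entries (repeated indices) are kept here; the zero diagonal
    variant discussed above would null them out instead."""
    n = patterns.shape[1]
    W = np.zeros((n,) * (d + 1))
    for x in patterns:
        outer = x
        for _ in range(d):
            outer = np.multiply.outer(outer, x)  # build the rank-(d+1) tensor
        W += outer
    return W

def retrieve(W, probe, max_iters=50):
    """Synchronous updates s_i <- sgn(sum_{j1..jd} W[i,j1..jd] s_j1 ... s_jd),
    iterated until a fixed point (long-term attraction) or max_iters."""
    s = probe.copy()
    d = W.ndim - 1
    for t in range(max_iters):
        field = W
        for _ in range(d):
            field = field @ s  # contract one memory index per step
        s_new = np.where(field >= 0, 1, -1)
        if np.array_equal(s_new, s):
            return s, t  # converged
        s = s_new
    return s, max_iters

rng = np.random.default_rng(0)
n, m = 40, 30                        # n neurons, m fundamental memories
X = rng.choice([-1, 1], size=(m, n))
W = hebbian_weights(X, d=2)

# Flip rho*n bits of one memory and test whether it is recovered.
rho = 0.2
probe = X[0].copy()
flip = rng.choice(n, size=int(rho * n), replace=False)
probe[flip] *= -1
recovered, iters = retrieve(W, probe)
print("recovered:", np.array_equal(recovered, X[0]), "iterations:", iters)
```

The printed iteration count corresponds to the quantity bounded above: under the capacity conditions it stays small and, asymptotically, does not grow with $n$.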