ARTIFICIAL MEMORIES - CAPACITY, BASIS RATE AND INFERENCE

Authors
P. Whittle
Citation
P. Whittle, ARTIFICIAL MEMORIES - CAPACITY, BASIS RATE AND INFERENCE, Neural networks, 10(9), 1997, pp. 1619-1626
Citations number
30
Journal title
Neural Networks
ISSN journal
08936080
Volume
10
Issue
9
Year of publication
1997
Pages
1619 - 1626
Database
ISI
SICI code
0893-6080(1997)10:9<1619:AM-CBR>2.0.ZU;2-9
Abstract
We study associative and storage memories for memory traces of size N and aim to establish that both the size of the system (as measured by, e.g., the number of nodes in a network) can be of order N and the number of traces consistent with reliable operation can be exponentially large in N, so that a positive capacity (in bits per node) can be achieved. It is well known that, if the traces are generated as M random vectors, then reliability imposes a linear bound on M, in that it implies an upper bound on the asymptotic (large N) value of alpha = M/N. For the noise-free Hopfield net this critical bound is about 0.138. We show that, if superposition of traces is allowed, so that the M given traces constitute the random basis of a linear code, then exponential memory size and a positive capacity can be achieved. However, there is still a critical upper bound on the basis rate alpha = M/N, implied now, not by the condition of reliability, but by the necessity that the recursion realising the calculation should be stable. For our model we determine this critical value exactly as alpha_c = 3 - sqrt(8), approximately 0.172. Our model is based upon inference concepts and differs in slight but important respects from the Hopfield model. We do not use replica methods, but appeal to a generalised version of the Wigner semi-circle theorem on the asymptotic distribution of eigenvalues. (C) 1997 Elsevier Science Ltd. All rights reserved.
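
Illustration (not part of the original record): the load parameter alpha = M/N discussed in the abstract can be explored numerically. The Python sketch below, assuming NumPy, is not Whittle's inference-based model; it is a generic noise-free Hopfield demonstration in which the helper names (hebbian_weights, one_step_bit_stability) are illustrative only. It shows how one-step recall quality of M random traces of size N degrades as the load grows, and prints the two critical values quoted in the abstract (~0.138 for the Hopfield net, 3 - sqrt(8) ~ 0.172 for the superposed-trace model); it does not reproduce those thresholds exactly.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 1000  # nodes per trace

    def hebbian_weights(patterns):
        # Standard Hebbian (outer-product) weight matrix with zero diagonal.
        _, n = patterns.shape
        W = patterns.T @ patterns / n
        np.fill_diagonal(W, 0.0)
        return W

    def one_step_bit_stability(patterns, W):
        # Fraction of stored bits left unchanged by one noise-free update
        # x_i <- sign(sum_j W_ij x_j); a rough proxy for recall quality.
        fields = patterns @ W
        return np.mean(np.where(fields >= 0, 1.0, -1.0) == patterns)

    for alpha in (0.05, 0.10, 0.138, 0.172, 0.25):
        M = max(1, round(alpha * N))                      # number of traces
        patterns = rng.choice([-1.0, 1.0], size=(M, N))   # random +/-1 traces
        W = hebbian_weights(patterns)
        stable = one_step_bit_stability(patterns, W)
        print(f"load alpha = M/N = {M / N:.3f}: fraction of stable bits = {stable:.4f}")

    print("Hopfield critical load quoted in the abstract: ~0.138")
    print("critical basis rate from the paper: 3 - sqrt(8) =", 3 - np.sqrt(8))

Running the sketch shows the fraction of stable bits dropping as alpha = M/N increases past the quoted critical region, which is the qualitative behaviour behind the linear bound on M described in the abstract.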