DEVELOPMENT OF LOW ENTROPY CODING IN A RECURRENT NETWORK

Citation
G.F. Harpur and R.W. Prager, Development of low entropy coding in a recurrent network, Network, 7(2), 1996, pp. 277-284
Citations number
15
Subject Categories
Mathematical Methods, Biology & Medicine; Neurosciences; Engineering, Electrical & Electronic; Computer Science, Artificial Intelligence
Journal title
Network
ISSN journal
0954-898X
Volume
7
Issue
2
Year of publication
1996
Pages
277 - 284
Database
ISI
SICI code
0954-898X(1996)7:2<277:DOLECI>2.0.ZU;2-6
Abstract
In this paper we present an unsupervised neural network which exhibits competition between units via inhibitory feedback. The operation is such as to minimize reconstruction error, both for individual patterns and over the entire training set. A key difference from networks which perform principal components analysis, or one of its variants, is the ability to converge to non-orthogonal weight values. We discuss the network's operation in relation to the twin goals of maximizing information transfer and minimizing code entropy, and show how the assignment of prior probabilities to network outputs can help to reduce entropy. We present results from two binary coding problems, and from experiments with image coding.
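
Illustrative sketch (not from the paper)
The abstract describes an unsupervised network in which output units compete through inhibitory feedback of the current reconstruction, activities settle so as to reduce reconstruction error, and weights adapt over the training set, without being constrained to orthogonality. The Python sketch below illustrates that general scheme under our own assumptions (gradient-descent settling of activities, a simple Hebbian update on the residual, arbitrary learning rates and iteration counts); it is not the authors' exact formulation and omits the prior-probability terms the paper discusses.

import numpy as np

def settle_activities(x, W, n_steps=50, lr=0.1):
    """Iteratively settle output activities a to reduce ||x - W @ a||^2.

    The feedback term W @ a acts as inhibition: each unit is driven only by
    the residual e = x - W @ a, so units compete to explain the input.
    """
    a = np.zeros(W.shape[1])
    for _ in range(n_steps):
        e = x - W @ a          # residual after subtracting current reconstruction
        a += lr * (W.T @ e)    # each unit moves toward explaining the residual
    return a

def train(patterns, n_outputs, n_epochs=100, w_lr=0.01):
    """Learn a weight matrix W (inputs x outputs) that reduces reconstruction
    error over the training set; columns of W need not end up orthogonal."""
    rng = np.random.default_rng(0)
    n_inputs = patterns.shape[1]
    W = rng.normal(scale=0.1, size=(n_inputs, n_outputs))
    for _ in range(n_epochs):
        for x in patterns:
            a = settle_activities(x, W)
            e = x - W @ a                 # residual for this pattern
            W += w_lr * np.outer(e, a)    # Hebbian update on residual and activity
    return W

if __name__ == "__main__":
    # Toy binary coding problem: reconstruct random 4-bit patterns with 4 outputs.
    rng = np.random.default_rng(1)
    patterns = rng.integers(0, 2, size=(20, 4)).astype(float)
    W = train(patterns, n_outputs=4)
    x = patterns[0]
    a = settle_activities(x, W)
    print("input:", x, "reconstruction:", np.round(W @ a, 2))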