Information complexity of neural networks

Citation
M.A. Kon and L. Plaskota, Information complexity of neural networks, NEURAL NETW, 13(3), 2000, pp. 365-375
Citations number
18
Subject categories
AI Robotics and Automatic Control
Journal title
NEURAL NETWORKS
ISSN journal
0893-6080
Volume
13
Issue
3
Year of publication
2000
Pages
365 - 375
Database
ISI
SICI code
0893-6080(200004)13:3<365:ICONN>2.0.ZU;2-G
Abstract
This paper studies the question of lower bounds on the number of neurons and examples necessary to program a given task into feedforward neural networks. We introduce the notion of information complexity of a network to complement that of neural complexity. Neural complexity deals with lower bounds for neural resources (numbers of neurons) needed by a network to perform a given task within a given tolerance. Information complexity measures lower bounds for the information (i.e. number of examples) needed about the desired input-output function. We study the interaction of the two complexities, and so lower bounds for the complexity of building and then programming feedforward nets for given tasks. We show something unexpected a priori: the interaction of the two can be simply bounded, so that they can be studied essentially independently. We construct radial basis function (RBF) algorithms of order n^3 that are information-optimal, and give example applications. (C) 2000 Elsevier Science Ltd. All rights reserved.
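The order-n^3 cost the abstract mentions arises because fitting an RBF interpolant to n examples requires solving an n x n linear system. As a rough illustration only (not the authors' construction), the following sketch fits a Gaussian RBF interpolant to n samples; the `width` parameter and the Gaussian kernel choice are assumptions for the example.

```python
import numpy as np

def rbf_interpolate(x_train, y_train, x_query, width=0.2):
    """Fit a Gaussian RBF interpolant to n examples and evaluate it.

    Solving the dense n x n linear system below is the O(n^3) step,
    which is where the order-n^3 cost of RBF algorithms comes from.
    The Gaussian kernel and the width value are illustrative choices.
    """
    # Pairwise squared distances between training points (the RBF centers)
    d2 = (x_train[:, None] - x_train[None, :]) ** 2
    phi = np.exp(-d2 / (2 * width ** 2))        # n x n interpolation matrix
    weights = np.linalg.solve(phi, y_train)     # O(n^3) dense solve
    # Evaluate the interpolant at the query points
    d2q = (x_query[:, None] - x_train[None, :]) ** 2
    return np.exp(-d2q / (2 * width ** 2)) @ weights
```

By construction the interpolant reproduces the training values exactly (up to conditioning of the kernel matrix), so the n examples are the only information about the target function the algorithm consumes.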