Chaotic resonance - Methods and applications for robust classification of noisy and variable patterns

Citation
R. Kozma and W.J. Freeman, Chaotic resonance - Methods and applications for robust classification of noisy and variable patterns, INT J B CH, 11(6), 2001, pp. 1607-1629
Citation count
65
Subject categories
Multidisciplinary
Journal title
INTERNATIONAL JOURNAL OF BIFURCATION AND CHAOS
ISSN journal
0218-1274
Volume
11
Issue
6
Year of publication
2001
Pages
1607 - 1629
Database
ISI
SICI code
0218-1274(200106)11:6<1607:CR-MAA>2.0.ZU;2-Y
Abstract
A fundamental tenet of the theory of deterministic chaos holds that infinitesimal variation in the initial conditions of a network that is operating in the basin of a low-dimensional chaotic attractor causes the various trajectories to diverge from each other quickly. This "sensitivity to initial conditions" might seem to hold promise for signal detection, owing to an implied capacity for distinguishing small differences in patterns. However, this sensitivity is incompatible with pattern classification, because it amplifies irrelevant differences in incomplete patterns belonging to the same class, and it renders the network easily corrupted by noise. Here a theory of stochastic chaos is developed, in which aperiodic outputs with 1/f^2 spectra are formed by the interaction of globally connected nodes that are individually governed by point attractors under perturbation by continuous white noise. The interaction leads to a high-dimensional global chaotic attractor that governs the entire array of nodes. An example is our spatially distributed KIII network that is derived from studies of the olfactory system, and that is stabilized by additive noise modeled on biological noise sources. Systematic parameterization of the interaction strengths corresponding to synaptic gains among nodes representing excitatory and inhibitory neuron populations enables the formation of a robust high-dimensional global chaotic attractor. Reinforcement learning from examples of patterns to be classified using habituation and association creates lower dimensional local basins, which form a global attractor landscape with one basin for each class. Thereafter, presentation of incomplete examples of a test pattern leads to confinement of the KIII network in the basin corresponding to that pattern, which constitutes many-to-one generalization. The capture after learning is expressed by a stereotypical spatial pattern of amplitude modulation of a chaotic carrier wave. Sensitivity to initial conditions is no longer an issue. Scaling of the additive noise as a parameter optimizes the classification of data sets in a manner that is comparable to stochastic resonance. The local basins constitute dynamical memories that solve difficult problems in classifying data sets that are not linearly separable. New local basins can be added quickly from very few examples without loss of existing basins. The attractor landscape enables the KIII set to provide an interface between noisy, unconstrained environments and conventional pattern classifiers. Examples given here of its robust performance include fault detection in small machine parts and the classification of spatiotemporal EEG patterns from rabbits trained to discriminate visual stimuli.
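To make the abstract's core mechanism concrete, the sketch below is an illustrative toy only, not the published KIII equations: an array of globally coupled nodes, each of which would relax to a point attractor in isolation, driven by continuous additive white noise whose amplitude (noise_scale) is the tunable parameter the abstract refers to. All parameter names, values, and the coupling scheme here are assumptions chosen for demonstration.

```python
# Illustrative sketch (assumed parameters; not the authors' KIII model).
import numpy as np

rng = np.random.default_rng(0)

N = 16             # number of nodes
decay = 1.0        # each isolated node relaxes to a point attractor at 0
gain = 2.5         # global coupling strength (mixed-sign "synaptic gains")
noise_scale = 0.1  # additive-noise amplitude, the parameter that is scaled
dt = 1e-3
steps = 20000

# Mixed-sign coupling matrix standing in for excitatory/inhibitory gains.
W = gain * rng.standard_normal((N, N)) / np.sqrt(N)
np.fill_diagonal(W, 0.0)

x = np.zeros(N)
trace = np.empty((steps, N))
for t in range(steps):
    drive = np.tanh(W @ x)                        # global interaction among nodes
    noise = noise_scale * rng.standard_normal(N)  # continuous white noise
    x = x + dt * (-decay * x + drive) + np.sqrt(dt) * noise
    trace[t] = x

# Power spectrum of one node: the interacting noisy array produces a broad,
# aperiodic spectrum, unlike the quiescent fixed point of an isolated node.
freqs = np.fft.rfftfreq(steps, dt)
power = np.abs(np.fft.rfft(trace[:, 0])) ** 2
print(power[1:6])
```

In this toy, sweeping noise_scale plays the role of the noise scaling discussed in the abstract; the learning of class-specific basins via habituation and association is part of the paper's method and is not reproduced here.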