N. Pican and F. Alexandre, How OWE architectures encode contextual effects in artificial neural networks, Mathematics and Computers in Simulation, 41(1-2), 1996, pp. 63-74
Artificial neural networks (ANNs) are widely used for classification tasks in which both discriminant cues and contextual parameters are supplied as ANN inputs. When the input space is too large to allow robust, time-limited learning, a classical solution is to design a set of ANNs, one per context domain. We have proposed a new learning algorithm, the lateral contribution learning algorithm (LCLA), based on the backpropagation learning algorithm, which supports such a solution with reduced learning time and better performance thanks to lateral influences between networks. This attractive but heavy solution has been improved with the orthogonal weight estimator (OWE) technique, an original architectural technique which, under light constraints, merges the set of ANNs into a single ANN whose weights are dynamically estimated, for each example, by other ANNs fed with the context. This architecture yields a very rich and interesting interpretation of the weight landscape. We illustrate this interpretation with two examples: estimation of a mathematical function and modeling of a process used in neurocontrol.
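The core idea described above can be sketched in a few lines: a context-fed estimator network produces, per example, the weights of a single main network, so that the same input is processed differently in different context domains. This is a minimal illustrative sketch, assuming plain NumPy, a single linear estimator layer, and hypothetical layer sizes; it is not the authors' exact OWE implementation or training procedure.

```python
# Minimal OWE-style sketch (illustrative assumption, not the paper's code):
# a context network estimates the main network's weight matrix for each
# example, merging what would otherwise be one ANN per context domain.
import numpy as np

rng = np.random.default_rng(0)

n_ctx, n_in, n_out = 2, 3, 1       # context dim, main input dim, output dim
n_weights = n_in * n_out           # number of main-network weights to estimate

# Context (estimator) network: one tanh layer with hypothetical sizes.
Wc = rng.normal(size=(n_weights, n_ctx))
bc = np.zeros(n_weights)

def estimate_weights(context):
    """Map the context vector to the main network's weight matrix."""
    return np.tanh(Wc @ context + bc).reshape(n_out, n_in)

def owe_forward(x, context):
    """Main network forward pass; its weights are re-estimated per example."""
    W = estimate_weights(context)
    return np.tanh(W @ x)

x = np.array([0.5, -0.2, 0.1])
y1 = owe_forward(x, np.array([1.0, 0.0]))  # context domain 1
y2 = owe_forward(x, np.array([0.0, 1.0]))  # context domain 2
print(y1, y2)  # same input processed under two different contexts
```

In training, gradients would flow through the estimated weights back into the estimator network's parameters, so a single backpropagation pass can adapt the context-to-weight mapping; that step is omitted here for brevity.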