The "weight smoothing" regularization of MLP for Jacobian stabilization

Citation
F. Aires et al., The "weight smoothing" regularization of MLP for Jacobian stabilization, IEEE NEURAL, 10(6), 1999, pp. 1502-1510
Number of citations
22
Subject categories
AI Robotics and Automatic Control
Journal title
IEEE TRANSACTIONS ON NEURAL NETWORKS
Journal ISSN
10459227 → ACNP
Volume
10
Issue
6
Year of publication
1999
Pages
1502 - 1510
Database
ISI
SICI code
1045-9227(199911)10:6<1502:T"SROM>2.0.ZU;2-T
Abstract
In an approximation problem with a neural network, a low output root mean square (rms) error is not always a universal criterion. In this paper, we investigate problems where the Jacobians (first derivatives of an output value with respect to an input value) of the approximation model are needed, and propose to add a quality criterion on these Jacobians during the learning step. More specifically, we focus here on the approximation of functionals A, from a space of continuous functions (discretized in practice) to a scalar space. In this case, the approximation is confronted with the compensation phenomenon: a lower contribution of one input can be compensated by a larger one of its neighboring inputs. The profiles (with respect to the input index) of the neural Jacobians are then very irregular instead of smooth, and the approximation of A becomes an ill-posed problem because many solutions can be chosen by the learning process. We propose to introduce the smoothness of Jacobian profiles as a priori information via a regularization technique, and develop a new and efficient learning algorithm called "weight smoothing." We assess the robustness of the weight smoothing algorithm by testing it on a real and complex problem stemming from meteorology: the neural approximation of the forward model of the radiative transfer equation in the atmosphere. The stabilized Jacobians of this model are then used in an inversion process to illustrate the improvement of the Jacobians after weight smoothing.
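To make the idea concrete, here is a minimal sketch (not the authors' implementation) of the kind of quantity being regularized: for a one-hidden-layer MLP, the Jacobian profile over the input index is computed analytically, and a penalty on the squared differences between neighboring Jacobian components measures its roughness. All sizes, the target value, and the regularization weight `lam` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 8, 5          # hypothetical layer sizes
W1 = rng.normal(size=(n_hid, n_in)) * 0.1
b1 = np.zeros(n_hid)
w2 = rng.normal(size=n_hid) * 0.1

def forward(x):
    """Scalar output of a one-hidden-layer tanh MLP."""
    h = np.tanh(W1 @ x + b1)
    return w2 @ h

def jacobian(x):
    """Analytic Jacobian profile: dy/dx_i = sum_k w2_k (1 - h_k^2) W1[k, i]."""
    h = np.tanh(W1 @ x + b1)
    return (w2 * (1.0 - h**2)) @ W1   # shape (n_in,)

def smoothness_penalty(x):
    """Roughness of the Jacobian profile along the input index:
    sum of squared first differences between neighboring components."""
    J = jacobian(x)
    return np.sum(np.diff(J) ** 2)

# Regularized loss for one sample: rms-type error plus Jacobian smoothness.
x = rng.normal(size=n_in)
lam = 1e-2                  # hypothetical regularization weight
loss = (forward(x) - 1.0) ** 2 + lam * smoothness_penalty(x)
```

Minimizing such a loss by gradient descent pushes the network toward solutions whose Jacobian profiles are smooth, which is the a priori information the paper injects to select among the many interpolating solutions.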