A learning algorithm with activation function manipulation for fault tolerant neural networks

Citation
N. Kamiura et al., A learning algorithm with activation function manipulation for fault tolerant neural networks, IEICE T INF, E84D(7), 2001, pp. 899-905
Citations number
16
Subject categories
Information Technology & Communication Systems
Journal title
IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS
ISSN journal
09168532
Volume
E84D
Issue
7
Year of publication
2001
Pages
899 - 905
Database
ISI
SICI code
0916-8532(200107)E84D:7<899:ALAWAF>2.0.ZU;2-0
Abstract
In this paper we propose a learning algorithm to enhance the fault tolerance of feedforward neural networks (NNs for short) by manipulating the gradient of the sigmoid activation function of each neuron. We assume stuck-at-0 and stuck-at-1 faults of the connection links. For the output layer, we employ a function with a relatively gentle gradient to enhance its fault tolerance. For enhancing the fault tolerance of the hidden layer, we steepen the gradient of the function after convergence. The experimental results for a character recognition problem show that our NN is superior in fault tolerance, learning cycles and learning time to other NNs trained with algorithms employing fault injection, forcible weight limits and the calculation of the relevance of each weight to the output error. Moreover, the gradient manipulation incorporated in our algorithm never spoils the generalization ability.
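The core idea of the abstract can be illustrated with a minimal sketch: a sigmoid whose gain parameter controls its gradient, plus a stuck-at fault injected on one connection link. A gentler gain makes the neuron's output less sensitive to the input shift a fault causes. The function names, gain values, and toy weights below are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def sigmoid(x, gain=1.0):
    """Sigmoid activation; `gain` scales the gradient (slope) at the origin.
    A small gain gives the 'relatively gentle gradient' used for the
    output layer; a large gain gives the steepened hidden-layer function."""
    return 1.0 / (1.0 + np.exp(-gain * x))

def inject_stuck_at(weights, index, value):
    """Return a copy of the weight vector with one connection link
    stuck at `value` (0.0 models a stuck-at-0 fault)."""
    faulty = weights.copy()
    faulty[index] = value
    return faulty

# Toy single neuron: compare output deviation under a stuck-at-0 fault
# for a gentle vs. a steep activation gradient (values are illustrative).
rng = np.random.default_rng(0)
w = rng.normal(size=4)
x = rng.normal(size=4)
w_faulty = inject_stuck_at(w, 0, 0.0)

dev_gentle = abs(sigmoid(w @ x, gain=0.5) - sigmoid(w_faulty @ x, gain=0.5))
dev_steep = abs(sigmoid(w @ x, gain=4.0) - sigmoid(w_faulty @ x, gain=4.0))
```

For pre-activations near the origin, `dev_gentle` comes out smaller than `dev_steep`, which is the intuition behind using a gentle gradient at the output layer; steepening the hidden-layer gradient only after convergence pushes hidden outputs toward saturation without disturbing training.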