The superior fault tolerance of artificial neural network training with a fault/noise injection-based genetic algorithm.
- Authors:
Feng SU (1);
Peijiang YUAN (1);
Yangzhen WANG (2);
Chen ZHANG (3)
Author Information
1. Robotics Institute, Beihang University, Beijing, 100191, China.
2. State Key Laboratory of Membrane Biology, School of Life Sciences, Beijing, 100871, China.
3. State Key Laboratory of Membrane Biology, School of Life Sciences, Beijing, 100871, China. ch.zhang@pku.edu.cn.
- Publication Type:Journal Article
- Keywords:
artificial neural networks;
fault tolerance;
genetic algorithm
- MeSH:
Algorithms;
Humans;
Models, Genetic;
Neural Networks (Computer)
- From:
Protein & Cell
2016;7(10):735-748
- Country:China
- Language:English
- Abstract:
Artificial neural networks (ANNs) are powerful computational tools that are designed to mimic the human brain and are applied to a variety of problems in many different fields. Fault tolerance (FT), an important property of ANNs, ensures their reliability when significant portions of a network are lost. In this paper, a fault/noise injection-based (FIB) genetic algorithm (GA) is proposed to construct fault-tolerant ANNs. The FT performance of the FIB-GA was compared with that of a common genetic algorithm, the back-propagation algorithm, and the modification-of-weights algorithm. The FIB-GA showed a slower fitting speed when solving the exclusive OR (XOR) problem and the overlapping classification problem, but it significantly reduced the errors in cases of single or multiple faults in ANN weights or nodes. Further analysis revealed that the fitted weights showed no correlation with the fitting errors in ANNs constructed with the FIB-GA, suggesting a relatively even distribution of the various fitting parameters. In contrast, the output weights of ANNs trained with the other three algorithms demonstrated a positive correlation with the errors. Our findings therefore indicate that combining the fault/noise injection-based method with a GA can introduce FT into ANNs, and imply that distributed ANNs demonstrate superior FT performance.
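The abstract does not give implementation details, so the following is only a minimal sketch of the general fault/noise-injection idea it describes: during a GA's fitness evaluation, each candidate network is scored on randomly faulted copies of itself (here, stuck-at-zero weight faults), so evolution favors genomes whose performance degrades gracefully. The network size, fault model, and all hyperparameters below are illustrative assumptions, not the authors' settings; the benchmark is the XOR problem mentioned in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR problem, one of the benchmarks named in the abstract
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])

GENOME = 2 * 4 + 4 + 4 + 1  # W1 (2x4), b1 (4), W2 (4), b2 (1) -> 17 parameters

def forward(genome, inputs):
    """Assumed 2-4-1 network: tanh hidden layer, sigmoid output."""
    W1 = genome[:8].reshape(2, 4)
    b1 = genome[8:12]
    W2 = genome[12:16]
    b2 = genome[16]
    h = np.tanh(inputs @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def fib_fitness(genome, n_trials=8, fault_rate=0.1):
    """Fault-injected fitness: MSE averaged over randomly faulted copies."""
    errs = []
    for _ in range(n_trials):
        faulty = genome.copy()
        faulty[rng.random(GENOME) < fault_rate] = 0.0  # stuck-at-zero weight faults
        errs.append(np.mean((forward(faulty, X) - y) ** 2))
    return float(np.mean(errs))

def evolve(pop, sigma=0.3):
    """One GA generation: truncation selection, uniform crossover, Gaussian mutation."""
    fit = np.array([fib_fitness(g) for g in pop])
    order = np.argsort(fit)
    elite = pop[order[: len(pop) // 4]]  # keep the best quarter
    children = []
    while len(elite) + len(children) < len(pop):
        pa, pb = elite[rng.integers(len(elite), size=2)]
        child = np.where(rng.random(GENOME) < 0.5, pa, pb)  # uniform crossover
        children.append(child + rng.normal(0.0, sigma, GENOME))  # mutation
    return np.vstack([elite, children]), float(fit[order[0]])

pop = rng.normal(0.0, 1.0, size=(60, GENOME))
history = []
for _ in range(300):
    pop, best = evolve(pop)
    history.append(best)

# pop[0] is the elite genome from the final generation
clean_err = float(np.mean((forward(pop[0], X) - y) ** 2))
print(f"fault-injected fitness: {history[0]:.3f} -> {history[-1]:.3f}")
print(f"clean-network MSE on XOR: {clean_err:.3f}")
```

Because each genome is scored under random faults rather than on its intact weights, selection pressure pushes toward redundant, distributed solutions, which is the mechanism the abstract credits for the FIB-GA's even weight distribution.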