- Authors: UH-YUN SHER; WEN-SHONG HSIEH
- Abstract: Fault tolerance is one of the key performance measures of artificial neural networks (ANNs) and is often viewed as an inherent feature of ANNs. Without careful design, however, the degree of fault tolerance cannot be guaranteed. This paper presents an extensive study of the fault-tolerance properties of feedforward neural networks. We propose a constrained backpropagation (CBP) training method that guarantees a high degree of fault tolerance when one or two hidden nodes fail. To achieve fault tolerance, we define an energy term, called the constraint energy, that measures the performance degradation when some hidden nodes fail. During training, both the normal energy and the constraint energy are minimized. We also develop a simple technique called output node saturation (ONS). By combining CBP with ONS, we can find a network that maintains exactly the same performance as a normal network when some hidden nodes fail. Experimental results show that a network trained by CBP also possesses better generalization than one trained by normal backpropagation (BP).
- Keywords: feedforward networks, fault tolerance, backpropagation, output node saturation, constrained backpropagation
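The training objective described in the abstract can be sketched as follows: the total energy is the normal sum-squared error plus a constraint energy, obtained by summing the error of the network with each single hidden node forced to fail (here modeled as stuck at zero). This is a minimal illustrative sketch, not the authors' implementation; the toy XOR data, network sizes, stuck-at-zero fault model, and the crude finite-difference gradient descent are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset (assumption): XOR, 2 inputs -> 1 output.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

H = 6  # number of hidden nodes (assumption)

def unpack(theta):
    """Split the flat parameter vector into layer weights and biases."""
    W1 = theta[:2 * H].reshape(2, H)
    b1 = theta[2 * H:3 * H]
    W2 = theta[3 * H:4 * H].reshape(H, 1)
    b2 = theta[4 * H:]
    return W1, b1, W2, b2

def forward(theta, fault=None):
    """Forward pass; if `fault` is a hidden-node index, that node is
    stuck at zero (one possible model of a hidden-node failure)."""
    W1, b1, W2, b2 = unpack(theta)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))  # sigmoid hidden layer
    if fault is not None:
        h = h.copy()
        h[:, fault] = 0.0
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def energy(theta):
    """Normal energy plus the constraint energy: the summed error of the
    network under every single-hidden-node fault."""
    e = np.sum((forward(theta) - Y) ** 2)
    for j in range(H):
        e += np.sum((forward(theta, fault=j) - Y) ** 2)
    return e

theta = rng.normal(scale=0.5, size=4 * H + 1)
e_init = energy(theta)

# Crude finite-difference gradient descent on the combined energy
# (a stand-in for backpropagation, kept simple for the sketch).
for step in range(500):
    g = np.zeros_like(theta)
    for i in range(theta.size):
        d = np.zeros_like(theta)
        d[i] = 1e-5
        g[i] = (energy(theta + d) - energy(theta - d)) / 2e-5
    theta -= 0.1 * g

e_final = energy(theta)
print(f"combined energy: {e_init:.3f} -> {e_final:.3f}")
# Ideally, predictions degrade little under any single hidden-node fault;
# this toy run does not guarantee that outcome.
for j in [None] + list(range(H)):
    print(j, (forward(theta, fault=j) > 0.5).astype(int).ravel())
```

Minimizing the faulted-network errors jointly with the normal error is what pushes the learned representation toward redundancy across hidden nodes, which is the intuition behind the constraint energy.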