The local minima-free condition of feedforward neural networks for outer-supervised learning

Author: De-Shuang Huang

DOI: 10.1109/3477.678658


Abstract: In this paper, the local minima-free conditions of outer-supervised feedforward neural networks (FNNs) based on batch-style learning are studied by means of an embedded-subspace method. It is proven that if the condition that the number of hidden neurons is not less than the number of training samples, which is sufficient but not necessary, is satisfied, the network will necessarily converge to a global minimum with null cost; moreover, the condition that the range space of the outer-supervised signal matrix is included in the range space of the hidden-layer output matrix is necessary for the error cost surface to be free of local minima. In addition, when the number of training samples is greater than the number of hidden neurons, it is demonstrated that local minima-free cost surfaces also exist, provided the first-layer weights are adequately selected.
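The abstract's first condition can be illustrated numerically. The sketch below (not from the paper; all variable names and sizes are chosen for illustration) builds a one-hidden-layer network with N ≥ P hidden neurons for P training samples. With randomly chosen first-layer weights, the P × N hidden output matrix H generically has full row rank P, so output weights solving H W = T exactly exist and the batch cost reaches the null-cost global minimum:

```python
import numpy as np

rng = np.random.default_rng(0)

# P samples, d_in inputs, N hidden neurons (N >= P), d_out outputs.
P, d_in, N, d_out = 20, 5, 25, 3
X = rng.standard_normal((P, d_in))
T = rng.standard_normal((P, d_out))   # arbitrary outer-supervised signal matrix

W1 = rng.standard_normal((d_in, N))   # randomly chosen first-layer weights
H = np.tanh(X @ W1)                   # hidden output matrix, shape (P, N)

# Since rank(H) = P generically, H @ W2 = T is exactly solvable:
# the range space of T is contained in the range space of H.
W2, *_ = np.linalg.lstsq(H, T, rcond=None)
cost = np.linalg.norm(H @ W2 - T)     # batch error at the least-squares solution

print(f"rank(H) = {np.linalg.matrix_rank(H)}, residual = {cost:.2e}")
```

When P exceeds N the system H W = T is generically overdetermined and the residual is nonzero, which is why the paper's second result needs the first-layer weights to be adequately selected rather than arbitrary.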

References (7)
Roger A. Horn, Charles R. Johnson, Matrix Analysis, Cambridge University Press, 1985. DOI: 10.1017/CBO9780511810817
M. Gori, A. Tesi, "On the problem of local minima in backpropagation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, pp. 76-86, 1992. DOI: 10.1109/34.107014
D. Hush, J. Salas, "Improving the learning rate of back-propagation with the gradient reuse algorithm," IEEE 1988 International Conference on Neural Networks, pp. 441-447, 1988. DOI: 10.1109/ICNN.1988.23877
M. A. Sartori, P. J. Antsaklis, "A simple method to derive bounds on the size and to train multilayer neural networks," IEEE Transactions on Neural Networks, vol. 2, pp. 467-471, 1991. DOI: 10.1109/72.88168
Xiao-Hu Yu, Guo-An Chen, "On the local minima free condition of backpropagation learning," IEEE Transactions on Neural Networks, vol. 6, pp. 1300-1303, 1995. DOI: 10.1109/72.410380
Xiao-Hu Yu, "Can backpropagation error surface not have local minima," IEEE Transactions on Neural Networks, vol. 3, pp. 1019-1021, 1992. DOI: 10.1109/72.165604
M. R. Azimi-Sadjadi, R.-J. Liou, "Fast learning process of multilayer neural networks using recursive least squares method," IEEE Transactions on Signal Processing, vol. 40, pp. 446-450, 1992. DOI: 10.1109/78.124956