Objective functions for training new hidden units in constructive neural networks

Authors: Tin-Yau Kwok, Dit-Yan Yeung

DOI: 10.1109/72.623214

Abstract: In this paper, we study a number of objective functions for training new hidden units in constructive algorithms for multilayer feedforward networks. The aim is to derive a class of objective functions whose computation, together with the corresponding weight updates, can be done in O(N) time, where N is the number of patterns. Moreover, even though input-weight freezing is applied during the process for computational efficiency, the convergence property of constructive algorithms using these objective functions is still preserved. We also propose a few tricks that can be used to improve optimization under practical situations. Their relative performance on a set of two-dimensional regression problems is also discussed.
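As a rough illustration of the idea in the abstract, the sketch below trains one candidate hidden unit against the current residual error using a squared-correlation objective, one plausible member of the O(N)-computable family (the exact objectives and update rules in the paper differ; all names here are hypothetical). Input weights of previously installed units are frozen, so only the candidate's weight vector `w` is updated, and each objective evaluation costs a single O(N) pass over the patterns.

```python
import numpy as np

def candidate_objective(w, X, residual):
    """Squared covariance between the candidate unit's output and the
    current residual, normalized by the output energy. This is an
    illustrative stand-in for the paper's class of objectives; each
    evaluation is O(N) in the number of patterns N."""
    h = np.tanh(X @ w)            # candidate hidden-unit output, one pass over N patterns
    s = h @ residual              # O(N) correlation with the residual
    return (s * s) / (h @ h + 1e-12)

def train_candidate(X, residual, steps=100, lr=0.05, seed=0):
    """Grow one hidden unit by gradient ascent on the objective.
    Earlier units' input weights stay frozen: only `w` changes.
    Uses a numerical gradient for brevity; an analytic gradient
    would also be O(N) per step."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.1, size=X.shape[1])
    eps = 1e-5
    for _ in range(steps):
        f0 = candidate_objective(w, X, residual)
        g = np.zeros_like(w)
        for i in range(w.size):
            w2 = w.copy()
            w2[i] += eps
            g[i] = (candidate_objective(w2, X, residual) - f0) / eps
        w += lr * g               # ascend: make the unit track the residual
    return w
```

After training, the unit's output is maximally correlated with the residual, so installing it (with a suitable output weight) reduces the remaining error; the next candidate is then trained against the new residual.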
