Initialization of adaptive parameters in density networks

Author: Norbert Jankowski

DOI:

Keywords:

Abstract: Initialization of adaptive parameters in neural networks is of crucial importance for the speed of convergence of the learning procedure. Methods of initialization for density networks are reviewed and two new methods, based on decision trees and on dendrograms, are presented. These methods were applied within the Feature Space Mapping framework to artificial and real-world datasets. The results show the superiority of the dendrogram-based method that includes rotation.
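The abstract does not spell out the algorithmic details, but a dendrogram-based initialization of Gaussian density nodes can be sketched roughly as follows: build a hierarchical clustering (dendrogram) of the training data, cut it into the desired number of nodes, and place one Gaussian per cluster, taking the center from the cluster mean and the per-direction widths and rotation from an eigendecomposition of the cluster covariance. This is a hedged illustration under stated assumptions, not the paper's exact procedure; the choice of Ward linkage, the function names, and the regularization term are assumptions.

```python
# Hedged sketch: dendrogram-based initialization of Gaussian density nodes.
# Illustrative reconstruction only, NOT the exact procedure from the paper.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster


def dendrogram_init(X, n_nodes):
    """Initialize (center, widths, rotation) for n_nodes Gaussian units.

    X       : (n_samples, n_features) training data
    n_nodes : number of density nodes to create
    Returns : list of (center, widths, rotation) tuples
    """
    # Build the dendrogram (Ward linkage is an assumption, not from the paper).
    Z = linkage(X, method="ward")
    # Cut the tree into n_nodes flat clusters.
    labels = fcluster(Z, t=n_nodes, criterion="maxclust")

    params = []
    for c in np.unique(labels):
        cluster = X[labels == c]
        center = cluster.mean(axis=0)
        # Eigendecomposition of the cluster covariance yields per-direction
        # widths and a rotation matrix (the "including rotation" variant
        # mentioned in the abstract); a small ridge keeps it well conditioned.
        cov = np.cov(cluster, rowvar=False) + 1e-6 * np.eye(X.shape[1])
        eigvals, eigvecs = np.linalg.eigh(cov)
        widths = np.sqrt(eigvals)   # dispersions along principal axes
        rotation = eigvecs          # columns are principal directions
        params.append((center, widths, rotation))
    return params


# Usage example on toy two-cluster data
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
    for center, widths, rotation in dendrogram_init(X, n_nodes=2):
        print(center.round(2), widths.round(2))
```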
