The constraint based decomposition (CBD) training architecture

Author: Sorin Drǎghici

DOI: 10.1016/S0893-6080(01)00040-5

Keywords: Algorithm; Decision tree; Benchmark (computing); Learning vector quantization; Redundancy (engineering); k-nearest neighbors algorithm; Artificial neural network; Backpropagation; Mathematics; Vector quantization

Abstract: Constraint Based Decomposition (CBD) is a constructive neural network technique that builds three- or four-layer networks, has guaranteed convergence, and can deal with binary, n-ary, and class-labeled real-valued problems. CBD is shown to be able to solve complicated problems in a simple, fast and reliable manner. The technique is further enhanced by two modifications (locking detection and redundancy elimination) which address the training speed and the efficiency of the internal representation built by the network. Redundancy elimination aims at building more compact architectures, while locking detection aims at improving the training speed. The computational cost of redundancy elimination is negligible and this enhancement can be used for any problem. However, locking detection is exponential in the number of dimensions and should only be used in low-dimensional spaces. The experimental results show the performance of the algorithm on a series of classical benchmark problems including the 2-spiral problem and the Iris, Wine, Glass, Lenses, Ionosphere, Lung cancer, Pima Indians, Bupa, TicTacToe, Balance and Zoo data sets from the UCI machine learning repository. CBD's generalization accuracy is compared with that of C4.5, C4.5 with rules, incremental decision trees, oblique classifiers, linear machine decision trees, CN2, learning vector quantization (LVQ), backpropagation, nearest neighbor, Q* and radial basis functions (RBFs). CBD provides the second best average accuracy on the problems tested as well as the best reliability (the lowest standard deviation).
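The abstract describes CBD only at a high level. As a toy illustration of the general constructive idea behind such techniques — decomposing a problem that is not linearly separable into linearly separable sub-problems, each solved by one hidden unit, with a fixed OR output layer — the sketch below trains one perceptron per not-yet-covered positive example against all negatives. This is a simplified stand-in, not Drǎghici's exact algorithm; the function names and the per-positive decomposition strategy are illustrative assumptions.

```python
import numpy as np

def train_perceptron(X, y, epochs=200):
    # Plain perceptron on a linearly separable subset; y uses labels +1/-1.
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append a constant bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        mistakes = 0
        for xi, yi in zip(Xb, y):
            pred = 1 if xi @ w > 0 else -1
            if pred != yi:
                w += yi * xi                     # standard perceptron update
                mistakes += 1
        if mistakes == 0:                        # converged on this sub-problem
            break
    return w

def fires(w, x):
    # True if hidden unit w outputs +1 on input x.
    return np.append(x, 1.0) @ w > 0

def constructive_decomposition(X, y):
    # Add one hidden unit per positive example not yet covered by an
    # existing unit; each unit separates that positive from ALL negatives
    # (a linearly separable sub-problem for points in general position).
    negatives = X[y == -1]
    units = []
    for x_pos in X[y == 1]:
        if any(fires(w, x_pos) for w in units):
            continue                             # already covered, skip
        sub_X = np.vstack([x_pos[None, :], negatives])
        sub_y = np.array([1] + [-1] * len(negatives))
        units.append(train_perceptron(sub_X, sub_y))
    return units

def predict(units, x):
    # The output layer is a fixed OR over the hidden units.
    return 1 if any(fires(w, x) for w in units) else -1

# XOR: the classic non-linearly-separable problem the decomposition handles.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1])
net = constructive_decomposition(X, y)
print([predict(net, x) for x in X])              # reproduces the XOR labels
```

Because each sub-problem is linearly separable by construction, each perceptron converges, which mirrors the guaranteed-convergence property the abstract claims for CBD; the paper's locking detection and redundancy elimination refinements are not modeled here.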
