Automatic Selection of Split Criterion during Tree Growing Based on Node Location

Author: Carla E. Brodley

DOI: 10.1016/B978-1-55860-377-6.50018-9

Keywords: Fractal tree index; Algorithm; Search tree; Artificial intelligence; Tree (data structure); Interval tree; Vantage-point tree; Pattern recognition; Tree traversal; Mathematics; Incremental decision tree; Segment tree

Abstract: Typically, decision tree construction algorithms apply a single "goodness of split" criterion to form each test node of the tree. It is the hypothesis of this research that better results can be obtained if, during tree construction, one applies a split criterion suited to the node's "location" in the tree. Specifically, given the objective of maximizing predictive accuracy, nodes near the root should be chosen using a measure based on information theory, whereas nodes closer to the leaves of the pruned tree should be chosen to maximize classification accuracy on the training set. The results of an empirical evaluation illustrate that adapting the split criterion to node location can improve performance.
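The idea in the abstract can be illustrated with a short sketch. This is not the paper's implementation; the depth threshold and helper names are assumptions for illustration. It switches the split-scoring function by node depth: an entropy-based information gain near the root, and training-set classification accuracy (each child predicting its majority class) deeper in the tree.

```python
# Hedged sketch (not the paper's code): choose a split criterion by node
# depth -- information gain near the root, training accuracy near the leaves.
import math
from collections import Counter


def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())


def info_gain(labels, left, right):
    """Entropy reduction achieved by splitting `labels` into two children."""
    n = len(labels)
    return (entropy(labels)
            - (len(left) / n) * entropy(left)
            - (len(right) / n) * entropy(right))


def accuracy_gain(labels, left, right):
    """Fraction of training examples classified correctly if each child
    predicts its majority class."""
    def hits(part):
        return Counter(part).most_common(1)[0][1] if part else 0
    return (hits(left) + hits(right)) / len(labels)


def score_split(labels, left, right, depth, depth_threshold=2):
    # Near the root (shallow depth) favour the information-theoretic
    # measure; closer to the leaves favour raw training accuracy.
    # `depth_threshold` is an illustrative hyperparameter, not from the paper.
    if depth < depth_threshold:
        return info_gain(labels, left, right)
    return accuracy_gain(labels, left, right)
```

For example, a perfectly class-separating split of `[0, 0, 1, 1]` into `[0, 0]` and `[1, 1]` yields an information gain of 1.0 bit at the root, while the same split evaluated at a deep node scores 1.0 under the accuracy criterion (all four examples classified correctly by the children's majority classes).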

References (18)
Keki B. Irani, Usama M. Fayyad, "The attribute selection problem in decision tree generation," National Conference on Artificial Intelligence, pp. 104–110 (1992)
David Tcheng, Bruce Lambert, Larry Rendell, Stephen C-Y. Lu, "Building robust learning systems by combining induction and optimization," International Joint Conference on Artificial Intelligence, pp. 806–812 (1989)
Michael Pazzani, Christopher Merz, Patrick Murphy, Kamal Ali, Timothy Hume, Clifford Brunk, "Reducing misclassification costs," Machine Learning Proceedings 1994, pp. 217–225 (1994), 10.1016/B978-1-55860-335-6.50034-9
Wray Buntine, Tim Niblett, "A further comparison of splitting rules for decision-tree induction," Machine Learning, vol. 8, pp. 75–85 (1992), 10.1023/A:1022686419106
Laurent Hyafil, Ronald L. Rivest, "Constructing optimal binary decision trees is NP-complete," Information Processing Letters, vol. 5, pp. 15–17 (1976), 10.1016/0020-0190(76)90095-8
Robert Detrano, Andras Janosi, Walter Steinbrunn, Matthias Pfisterer, Johann-Jakob Schmid, Sarbjit Sandhu, Kern H. Guppy, Stella Lee, Victor Froelicher, "International application of a new probability algorithm for the diagnosis of coronary artery disease," American Journal of Cardiology, vol. 64, pp. 304–310 (1989), 10.1016/0002-9149(89)90524-9
Cullen Schaffer, "Overfitting avoidance as bias," Machine Learning, vol. 10, pp. 153–178 (1993), 10.1023/A:1022653209073
Carla E. Brodley, Paul E. Utgoff, "Multivariate decision trees," Machine Learning, vol. 19, pp. 45–77 (1995), 10.1023/A:1022607123649
W.Z. Liu, A.P. White, "The importance of attribute selection measures in decision tree induction," Machine Learning, vol. 15, pp. 25–41 (1994), 10.1023/A:1022609119415
Usama M. Fayyad, Keki B. Irani, "On the handling of continuous-valued attributes in decision tree generation," Machine Learning, vol. 8, pp. 87–102 (1992), 10.1023/A:1022638503176