A Bayesian Random Split to Build Ensembles of Classification Trees

Authors: Andrés Cano, Andrés R. Masegosa, Serafín Moral

DOI: 10.1007/978-3-642-02906-6_41

Keywords:

Abstract: Random forest models [1] consist of an ensemble of randomized decision trees and are among the best performing classification models. With this idea in mind, in this paper we introduce a random split operator based on a Bayesian approach for building a random forest. The suitability of this method for constructing ensembles of trees is justified with an error bias-variance decomposition analysis. This new operator does not clearly depend on a parameter K, as its random forest counterpart does, and performs better with a lower number of trees.
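The abstract describes a split operator that randomizes attribute choice via a Bayesian score rather than random forest's draw of K candidate attributes. As a minimal sketch of that idea (not the authors' exact operator), one can score each candidate binary attribute with a Dirichlet-multinomial (BDeu-style) marginal likelihood and then sample the split attribute with probability proportional to the exponentiated score; the function names, the binary-attribute assumption, and the softmax weighting here are illustrative assumptions:

```python
import math
import random
from collections import Counter

def branch_log_score(labels, num_classes, alpha=1.0):
    # Log marginal likelihood of one branch's class labels under a
    # Dirichlet-multinomial model with a symmetric prior of
    # alpha/num_classes pseudo-counts per class (BDeu-style score).
    n = len(labels)
    score = math.lgamma(alpha) - math.lgamma(alpha + n)
    for count in Counter(labels).values():
        score += (math.lgamma(alpha / num_classes + count)
                  - math.lgamma(alpha / num_classes))
    return score

def bayesian_random_split(X, y, num_classes, alpha=1.0):
    # Sample a split attribute with probability proportional to
    # exp(Bayesian score of the split). Assumes binary 0/1 attributes.
    # This is an illustrative sketch, not the paper's exact operator.
    scores = []
    for a in range(len(X[0])):
        left = [y[i] for i, row in enumerate(X) if row[a] == 0]
        right = [y[i] for i, row in enumerate(X) if row[a] == 1]
        scores.append(branch_log_score(left, num_classes, alpha)
                      + branch_log_score(right, num_classes, alpha))
    m = max(scores)  # subtract max before exponentiating, for stability
    weights = [math.exp(s - m) for s in scores]
    return random.choices(range(len(scores)), weights=weights, k=1)[0]
```

Unlike random forest's operator, this draw has no K to tune: every attribute participates in every split decision, weighted by how well it separates the classes, which injects randomness while still favoring informative splits.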

References (17)
Ron Kohavi, David Wolpert: Bias plus variance decomposition for zero-one loss functions. International Conference on Machine Learning, pp. 275-283 (1996)
Janez Demšar: Statistical comparisons of classifiers over multiple data sets. Journal of Machine Learning Research, vol. 7, pp. 1-30 (2006)
Geoffrey I. Webb, Paul Conilione: Estimating bias and variance from data (2003)
Mark A. Hall, Ian H. Witten, Eibe Frank: Data Mining: Practical Machine Learning Tools and Techniques (1999)
Keki B. Irani, Usama M. Fayyad: Multi-interval discretization of continuous-valued attributes for classification learning. International Joint Conference on Artificial Intelligence, vol. 2, pp. 1022-1027 (1993)
Richard A. Olshen, Charles J. Stone, Leo Breiman, Jerome H. Friedman: Classification and Regression Trees (1983)
Andrés R. Masegosa, Joaquín Abellán: Combining decision trees based on imprecise probabilities and uncertainty measures. European Conference on Symbolic and Quantitative Approaches to Reasoning and Uncertainty, pp. 512-523 (2007), doi:10.1007/978-3-540-75256-1_46
John Mingers: An empirical comparison of selection measures for decision-tree induction. Machine Learning, vol. 3, pp. 319-342 (1989), doi:10.1023/A:1022645801436
Pierre Geurts, Damien Ernst, Louis Wehenkel: Extremely randomized trees. Machine Learning, vol. 63, pp. 3-42 (2006), doi:10.1007/S10994-006-6226-1