SampleBoost: Improving boosting performance by destabilizing weak learners based on weighted error analysis

Authors: Xiaohui Yuan, Mohamed Abouelenien

DOI:

Keywords:

Abstract: Learning from large, multi-class data sets poses great challenges to ensemble methods. The weak-learner condition makes the conventional boosting method inappropriate for handling multi-class classification, which leads to early termination of the training process. In addition, the elongated training time makes learning from a large data set infeasible. To circumvent these issues, we present a novel method that integrates a sampling strategy and an error parameter that alters the weighted error. Experiments were conducted with ten real-world applications. It is evident that our proposed method achieves greater classification performance and avoids early termination. In addition, it significantly improves training efficiency and accommodates large data sets.
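The abstract names two ingredients, a sampling strategy and an error parameter that alters the weighted error so that the weak-learner condition (weighted error below 0.5) is not violated and training does not terminate early. The paper's exact algorithm is not given here, so the following is only a minimal sketch of how such a round could look in an AdaBoost.M1-style loop with decision stumps; the names `sampleboost`, `sample_frac`, and `error_scale` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_stump(X, y):
    """Exhaustively fit a one-feature threshold stump (labels in {-1, +1})."""
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for pol in (1, -1):  # both polarities, so subsample error <= 0.5
                pred = np.where(X[:, f] >= t, pol, -pol)
                e = np.mean(pred != y)
                if e < best_err:
                    best_err, best = e, (f, t, pol)
    return best

def predict_stump(stump, X):
    f, t, pol = stump
    return np.where(X[:, f] >= t, pol, -pol)

def sampleboost(X, y, n_rounds=10, sample_frac=0.7, error_scale=0.5, seed=0):
    """Hedged SampleBoost-style sketch: each round trains a weak learner on a
    weighted subsample (sampling strategy), and the weighted error is scaled
    by `error_scale` before computing alpha, which keeps the error below the
    0.5 threshold that would otherwise terminate classic AdaBoost early."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform initial sample weights
    learners = []
    for _ in range(n_rounds):
        # Sampling strategy (assumption): weighted resampling with replacement.
        idx = rng.choice(n, size=int(sample_frac * n), replace=True, p=w)
        stump = fit_stump(X[idx], y[idx])
        pred = predict_stump(stump, X)
        err = np.sum(w * (pred != y))
        # Error parameter (assumption): scale the weighted error to
        # destabilize the weak learner and avoid the err >= 0.5 stop.
        err = np.clip(err * error_scale, 1e-10, 0.999)
        if err >= 0.5:               # classic early-termination condition
            break
        alpha = 0.5 * np.log((1 - err) / err)
        learners.append((alpha, stump))
        # Reweight: boost misclassified samples, then renormalize.
        w *= np.exp(alpha * np.where(pred != y, 1.0, -1.0))
        w /= w.sum()
    return learners

def predict(learners, X):
    """Weighted-vote prediction of the stump ensemble."""
    score = sum(a * predict_stump(s, X) for a, s in learners)
    return np.where(score >= 0, 1, -1)
```

Because both stump polarities are searched, the full-set weighted error is always strictly below 1, so with `error_scale=0.5` the scaled error stays below 0.5 and the loop runs all rounds instead of stopping early, which is the behavior the abstract attributes to the error parameter.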

References (7)
Robert E. Schapire, Theoretical Views of Boosting and Applications. Algorithmic Learning Theory, pp. 13-25 (1999). DOI: 10.1007/3-540-46769-6_2
Mojtaba Seyedhosseini, Antonio R. C. Paiva, Tolga Tasdizen, Fast AdaBoost Training Using Weighted Novelty Selection. International Joint Conference on Neural Networks, pp. 1245-1250 (2011). DOI: 10.1109/IJCNN.2011.6033366
Chris Seiffert, Taghi M. Khoshgoftaar, Jason Van Hulse, Amri Napolitano, RUSBoost: A Hybrid Approach to Alleviating Class Imbalance. IEEE Transactions on Systems, Man, and Cybernetics, Part A, vol. 40, pp. 185-197 (2010). DOI: 10.1109/TSMCA.2009.2029559
A. S. Georghiades, P. N. Belhumeur, D. J. Kriegman, From Few to Many: Illumination Cone Models for Face Recognition under Variable Lighting and Pose. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, pp. 643-660 (2001). DOI: 10.1109/34.927464
Charles Dubout, Francois Fleuret, Boosting with Maximum Adaptive Sampling. Advances in Neural Information Processing Systems, vol. 24, pp. 1332-1340 (2011)
Kevin Bache, Moshe Lichman, UCI Machine Learning Repository. University of California, School of Information and Computer Science (2007)