Parameterized AdaBoost: Introducing a Parameter to Speed Up the Training of Real AdaBoost

Authors: Shuqiong Wu, Hiroshi Nagahashi

DOI: 10.1109/LSP.2014.2313570

Keywords: Parameterized complexity; Object detection; Machine learning; AdaBoost; LogitBoost; Computer science; LPBoost; Data classification; BrownBoost; Generalization error; Artificial intelligence; Pattern recognition; Boosting (machine learning)

Abstract: As a machine learning algorithm, AdaBoost has obtained considerable success in data classification and object detection. Later, its generalized version called Real AdaBoost was proposed by Schapire and Singer. Real AdaBoost increases the weights of misclassified samples and decreases those of correctly classified samples in every iteration. This kind of weight adjustment focuses on the samples with large weights and tries to make them correctly classified in future runs. However, it may lead to the misclassification of some other samples that have been correctly classified in previous runs. If we can curb this phenomenon during the boosting process, faster training can be achieved. Based on this assumption, we propose Parameterized AdaBoost, in which a parameter is devised to penalize the misclassification of samples that have already been correctly classified. We then analyze how its positive margins are increased more rapidly than in Real AdaBoost. Experimental results show that our approach achieves faster convergence of the training error and also improves generalization to some degree when compared with Real AdaBoost.
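The weight adjustment described in the abstract can be sketched in code. The paper's exact update rule is not reproduced here, so the penalty parameter `lam` and the way it enters the exponent are illustrative assumptions: the idea, as the abstract describes it, is to apply an extra penalty to samples that the current ensemble already classifies correctly, and setting `lam = 0` recovers the plain Real AdaBoost update.

```python
import numpy as np

def parameterized_boost_weights(w, y, h, F, lam=0.5):
    """One hypothetical weight-update step.

    w   -- current sample weights (sums to 1)
    y   -- labels in {-1, +1}
    h   -- real-valued outputs of the current weak hypothesis
    F   -- ensemble scores so far, so y * F > 0 means "already correct"
    lam -- assumed penalty parameter; lam = 0 gives the standard
           Real AdaBoost update w_i * exp(-y_i * h(x_i)).
    """
    w = np.asarray(w, dtype=float)
    update = np.exp(-y * h)                      # Real AdaBoost update
    already_correct = (y * F) > 0                # correctly classified so far
    # Assumed parameterization: scale the exponent for already-correct
    # samples, so a newly misclassified one is penalized (upweighted) more.
    update = np.where(already_correct, np.exp(-y * h * (1.0 + lam)), update)
    w_new = w * update
    return w_new / w_new.sum()                   # renormalize to sum to 1
```

With `lam > 0`, a sample that the ensemble had classified correctly but the new weak hypothesis gets wrong receives a larger relative weight than under the plain update, which is the behavior the abstract attributes to the devised parameter.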

References (16)
G. Rätsch, T. Onoda, K.-R. Müller, "Soft Margins for AdaBoost," Machine Learning, vol. 42, pp. 287-320 (2001). DOI: 10.1023/A:1007618119488
Yijun Sun, Jian Li, W. Hager, "Two New Regularized AdaBoost Algorithms," International Conference on Machine Learning and Applications, pp. 41-48 (2004). DOI: 10.1109/ICMLA.2004.1383492
Robert E. Schapire, Yoav Freund, Peter Bartlett, Wee Sun Lee, "Boosting the Margin: A New Explanation for the Effectiveness of Voting Methods," Annals of Statistics, vol. 26, pp. 1651-1686 (1998). DOI: 10.1214/AOS/1024691352
Jerome Friedman, Trevor Hastie, Robert Tibshirani, "Additive Logistic Regression: A Statistical View of Boosting (with discussion and a rejoinder by the authors)," Annals of Statistics, vol. 28, pp. 337-407 (2000). DOI: 10.1214/AOS/1016218223
Robert E. Schapire, Yoram Singer, "Improved Boosting Algorithms Using Confidence-Rated Predictions," Conference on Learning Theory, vol. 37, pp. 80-91 (1998). DOI: 10.1145/279943.279960
Jingsong Xu, Qiang Wu, Jian Zhang, Zhenmin Tang, "Fast and Accurate Human Detection Using a Cascade of Boosted MS-LBP Features," IEEE Signal Processing Letters, vol. 19, pp. 676-679 (2012). DOI: 10.1109/LSP.2012.2210870
Yanfeng Zhang, Peikun He, "A Revised AdaBoost Algorithm: FM-AdaBoost," International Conference on Computer Application and System Modeling, vol. 11 (2010). DOI: 10.1109/ICCASM.2010.5623209
Mojtaba Seyedhosseini, Antonio R. C. Paiva, Tolga Tasdizen, "Fast AdaBoost Training Using Weighted Novelty Selection," International Joint Conference on Neural Networks, pp. 1245-1250 (2011). DOI: 10.1109/IJCNN.2011.6033366
Biswajit Paul, G. Athithan, M. Narasimha Murty, "Speeding up AdaBoost Classifier with Random Projection," International Conference on Advances in Pattern Recognition, pp. 251-254 (2009). DOI: 10.1109/ICAPR.2009.67
Jing-Ming Guo, Chen-Chi Lin, Min-Feng Wu, Che-Hao Chang, Hua Lee, "Complexity Reduced Face Detection Using Probability-Based Face Mask Prefiltering and Pixel-Based Hierarchical-Feature Adaboosting," IEEE Signal Processing Letters, vol. 18, pp. 447-450 (2011). DOI: 10.1109/LSP.2011.2146772