Fast AdaBoost training using weighted novelty selection

Authors: Mojtaba Seyedhosseini, Antonio R. C. Paiva, Tolga Tasdizen

DOI: 10.1109/IJCNN.2011.6033366

Keywords: Probabilistic logic, BrownBoost, Machine learning, AdaBoost, Image segmentation, Artificial intelligence, Training set, Novelty, Boosting (machine learning), Discriminative model, Pattern recognition, Computer science

Abstract: In this paper, a new AdaBoost learning framework, called WNS-AdaBoost, is proposed for training discriminative models. The approach significantly speeds up adaptive boosting (AdaBoost) by reducing the number of data points used for training. For this purpose, we introduce the weighted novelty selection (WNS) sampling strategy and combine it with AdaBoost to obtain an efficient and fast learning algorithm. WNS selects a representative subset of the data, thereby reducing the number of points onto which AdaBoost is applied. In addition, WNS associates a weight with each selected point such that the weighted subset approximates the distribution of all the data. This ensures that AdaBoost can be trained efficiently and with minimal loss of accuracy. The performance of WNS-AdaBoost is first demonstrated in a classification task. Then, WNS is employed in a probabilistic boosting-tree (PBT) structure for image segmentation. Results in these two applications show that the training time using WNS-AdaBoost is greatly reduced at the cost of only a few percent in accuracy.
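The abstract describes WNS as a one-pass sampling step that keeps only "novel" points as representatives and accumulates a weight on each representative so the subset approximates the full data distribution. The paper's exact novelty criterion is not given here, so the sketch below is an illustrative assumption: a point is treated as novel when its distance to every existing representative exceeds a threshold (in the spirit of Platt's resource-allocating network, reference list below); otherwise it increments the weight of its nearest representative. The function name, the Euclidean metric, and the `threshold` parameter are all hypothetical choices, not the paper's definitions.

```python
import numpy as np

def weighted_novelty_selection(X, threshold):
    """Illustrative sketch of a WNS-style pass (not the paper's exact rule).

    A point becomes a new representative when it is farther than
    `threshold` from all current representatives; otherwise the weight
    of its nearest representative is incremented, so the weighted
    subset roughly tracks the density of the full data set.
    """
    reps = [X[0]]       # first point always starts the representative set
    weights = [1.0]
    for x in X[1:]:
        dists = np.linalg.norm(np.asarray(reps) - x, axis=1)
        j = int(np.argmin(dists))
        if dists[j] > threshold:    # novel: keep as a new representative
            reps.append(x)
            weights.append(1.0)
        else:                       # redundant: fold into nearest representative
            weights[j] += 1.0
    return np.asarray(reps), np.asarray(weights)

# Toy usage: two tight clusters collapse to two weighted representatives,
# which could then seed AdaBoost's initial sample-weight distribution.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
reps, w = weighted_novelty_selection(X, threshold=1.0)
```

In a WNS-AdaBoost-style pipeline, the normalized weights `w / w.sum()` would replace the uniform initial distribution over the reduced set `reps`, which is how the subset can stand in for all the data with minimal accuracy loss.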

References (19)
Héctor Allende-Cid, Rodrigo Salas, Héctor Allende, Ricardo Ñanculef, "Robust alternating AdaBoost," Iberoamerican Congress on Pattern Recognition, pp. 427–436, 2007. DOI: 10.1007/978-3-540-76725-1_45
Ayhan Demiriz, Kristin P. Bennett, John Shawe-Taylor, "Linear Programming Boosting via Column Generation," Machine Learning, vol. 46, pp. 225–254, 2002. DOI: 10.1023/A:1012470815092
Allen Gersho, Robert M. Gray, Vector Quantization and Signal Compression, 1991.
Yoav Freund, Robert E. Schapire, "A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting," Journal of Computer and System Sciences, vol. 55, pp. 119–139, 1997. DOI: 10.1006/JCSS.1997.1504
Jerome Friedman, Trevor Hastie, Robert Tibshirani, "Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors)," Annals of Statistics, vol. 28, pp. 337–407, 2000. DOI: 10.1214/AOS/1016218223
Robert E. Schapire, Yoram Singer, "Improved boosting algorithms using confidence-rated predictions," Conference on Learning Theory, vol. 37, pp. 80–91, 1998. DOI: 10.1145/279943.279960
D. Comaniciu, P. Meer, "Mean shift: a robust approach toward feature space analysis," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, pp. 603–619, 2002. DOI: 10.1109/34.1000236
Robert E. Schapire, "The Strength of Weak Learnability," Machine Learning, vol. 5, pp. 197–227, 1990. DOI: 10.1023/A:1022648800760
John Platt, "A resource-allocating network for function interpolation," Neural Computation, vol. 3, pp. 213–225, 1991. DOI: 10.1162/NECO.1991.3.2.213