A compressed sensing approach for efficient ensemble learning

Authors: Lin Li, Rustam Stolkin, Licheng Jiao, Fang Liu, Shuang Wang

DOI: 10.1016/J.PATCOG.2014.04.015

Keywords: Ensemble learning, Machine learning, Training set, Linear system, Artificial intelligence, Pattern recognition, Compressed sensing, Classifier, Random subspace method, Cascading classifiers, Fitness proportionate selection, Computer science

Abstract: This paper presents a method for improved ensemble learning, by treating the optimization of an ensemble of classifiers as a compressed sensing problem. Ensemble learning methods improve the performance of a learned predictor by integrating a weighted combination of multiple predictive models. Ideally, the number of models needed in the ensemble should be minimized, while optimizing the weights associated with each included model. We solve this problem by treating it as an example of the compressed sensing problem, in which a sparse solution must be reconstructed from an under-determined linear system. Compressed sensing techniques are then employed to find an ensemble which is both small and effective. As an additional contribution of this paper, we present a new evaluation method (a pairwise diversity measurement) called the roulette-wheel kappa-error. It takes into account the different weightings of the classifiers, and also reduces the total number of pairs shown in the kappa-error diagram, selecting pairs through roulette-wheel selection according to the weights of the classifiers. This approach can greatly improve the clarity and informativeness of the kappa-error diagram, especially when the ensemble is large. We use 25 public data sets to evaluate and compare ensembles built using four sparse reconstruction algorithms, combined with two classifier learning algorithms and two training-set manipulation techniques. We also give comparison experiments of our approach against another five state-of-the-art ensemble pruning methods. These experiments show that our approach produces comparable or better accuracy, while being significantly faster than the compared methods.
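The central idea above, recovering a small set of nonzero ensemble weights from an under-determined linear system, can be sketched with greedy orthogonal matching pursuit, one standard sparse reconstruction algorithm. This is a minimal illustration, not the paper's implementation; the function name and the toy data are assumptions for the demo.

```python
import numpy as np

def omp_ensemble_weights(A, y, k):
    """Greedy orthogonal matching pursuit (OMP): pick at most k columns
    of A (base-model predictions) and least-squares weights so that
    A @ x approximates the target y -- a sparse solution to an
    under-determined linear system when k is small."""
    n_models = A.shape[1]
    support = []
    residual = y.astype(float)
    x = np.zeros(n_models)
    for _ in range(k):
        corr = np.abs(A.T @ residual)   # correlation with the residual
        corr[support] = -np.inf         # never re-pick a chosen model
        support.append(int(np.argmax(corr)))
        # least-squares refit of the weights on the current support
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

# Toy demo: 8 candidate models, only 2 of which actually matter.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 8))        # column j = predictions of model j
true_w = np.zeros(8)
true_w[1], true_w[5] = 0.7, 0.3
y = A @ true_w
w = omp_ensemble_weights(A, y, k=2)
print(sorted(np.nonzero(w)[0].tolist()))  # indices of the selected models
```

The nonzero entries of `w` identify the models kept in the pruned ensemble, and their values are the combination weights; the other models can be discarded.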
