Authors: Qiang Li Zhao, Yan Huang Jiang, Ming Xu
DOI: 10.1007/978-3-642-17313-4_1
Keywords: AdaBoost, Ensemble learning, Machine learning, Classifier (UML), Artificial intelligence, Incremental learning, Computer science
Abstract: Classifier ensembles are a main direction of incremental learning research, and many ensemble-based methods have been presented. Among them, Learn++, which is derived from the well-known boosting algorithm AdaBoost, is special: it can work with any type of base classifier, whether or not the classifier is specially designed for incremental learning, so it potentially supports heterogeneous base classifiers. Based on extensive experiments, we analyze the advantages and disadvantages of Learn++. We then present a new method, Bagging++, which is based on another ensemble method, Bagging. The experimental results show that Bagging is a promising approach for incremental learning, and that Bagging++ achieves better generalization and learning speed than other compared methods such as NCL.
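To make the ensemble-based incremental learning idea concrete, the following is a minimal sketch of a Bagging-style incremental ensemble: each new batch of data trains several base classifiers on bootstrap samples, which are appended to the ensemble, and prediction is by majority vote. The class name, parameters, and voting scheme are illustrative assumptions for this sketch and are not the paper's Bagging++ algorithm.

```python
# Minimal sketch of a Bagging-style incremental ensemble (illustrative only;
# not the Bagging++ algorithm from the paper).
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.base import clone

class IncrementalBaggingEnsemble:
    def __init__(self, base_estimator=None, n_per_batch=5, random_state=0):
        self.base_estimator = base_estimator or DecisionTreeClassifier()
        self.n_per_batch = n_per_batch        # classifiers added per data batch
        self.rng = np.random.default_rng(random_state)
        self.estimators_ = []

    def partial_fit(self, X, y):
        """Train n_per_batch base classifiers on bootstrap samples of the new batch."""
        n = len(X)
        for _ in range(self.n_per_batch):
            idx = self.rng.integers(0, n, size=n)   # bootstrap resample
            clf = clone(self.base_estimator)
            clf.fit(X[idx], y[idx])
            self.estimators_.append(clf)
        return self

    def predict(self, X):
        """Majority vote over all accumulated base classifiers."""
        votes = np.stack([clf.predict(X) for clf in self.estimators_])
        # Most common predicted label per sample (assumes integer class labels).
        return np.apply_along_axis(
            lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)
```

In this sketch, previously trained classifiers are never revisited, which mirrors the general ensemble-based strategy for incremental learning described in the abstract: new knowledge is added by growing the ensemble rather than retraining on old data.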