Authors: Héctor Allende-Cid, Rodrigo Salas, Héctor Allende, Ricardo Ñanculef
DOI: 10.1007/978-3-540-76725-1_45
Keywords: Boosting (machine learning), LogitBoost, Artificial intelligence, AdaBoost, Pattern recognition, Empirical distribution function, Ensemble learning, Empirical probability, Computer science, BrownBoost, Outlier
Abstract: Ensemble methods are general techniques to improve the accuracy of any given learning algorithm. Boosting is an algorithm that builds classifier ensembles incrementally. In this work we propose an improvement of the classical and inverse AdaBoost algorithms to deal with the problem of the presence of outliers in the data. We propose the Robust Alternating AdaBoost (RADA) algorithm, which alternates between the classic and inverse AdaBoost to create a more stable algorithm. The RADA algorithm bounds the influence of the outliers on the empirical distribution, detects and diminishes the empirical probability of "bad" samples, and performs a more accurate classification under contaminated data. We report performance results using synthetic and real datasets, the latter obtained from a benchmark site.
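To make the alternating idea concrete, below is a minimal illustrative sketch in Python of an AdaBoost variant that alternates between the classic weight update (up-weighting misclassified samples) and an inverse update (down-weighting them), while clipping sample weights to bound the influence of outliers on the empirical distribution. The switching rule, the weight bound, and all parameter names here are assumptions chosen for illustration; they are not the published RADA specification, which should be consulted in the paper itself.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def alternating_adaboost(X, y, n_rounds=50):
    """Illustrative sketch of an alternating AdaBoost scheme.

    Alternates between the classic update (misclassified points gain
    weight) and an inverse update (misclassified points lose weight),
    so persistently misclassified points -- likely outliers -- cannot
    dominate the empirical distribution. The even/odd switching rule
    and the clipping bound are hypothetical choices, not the RADA rule.
    Labels y are assumed to be in {-1, +1}.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)          # empirical distribution over samples
    learners, alphas = [], []
    for t in range(n_rounds):
        # Fit a weak learner (decision stump) on the weighted sample.
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(np.sum(w[pred != y]), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        # Classic step on even rounds, inverse step on odd rounds:
        # flipping the sign of the exponent down-weights the points
        # the classic step would have up-weighted.
        sign = 1.0 if t % 2 == 0 else -1.0
        w = w * np.exp(-sign * alpha * y * pred)
        # Bound each sample's influence on the distribution, then
        # renormalize so the weights remain a probability distribution.
        w = np.clip(w, 1e-8, 10.0 / n)
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)

    def predict(X_new):
        # Weighted majority vote of the weak learners.
        scores = sum(a * h.predict(X_new) for a, h in zip(alphas, learners))
        return np.sign(scores)

    return predict
```

The clipping step is what bounds an outlier's weight: under the classic update alone, a mislabeled point is up-weighted every round and its weight grows exponentially, whereas here the alternation and the cap together keep any single sample's probability mass limited.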