Authors: Robert E. Schapire, Yoram Singer
Keywords: Artificial intelligence, Machine learning, LogitBoost, Algorithm, Decision tree, AdaBoost, Alternating decision tree, Boosting (machine learning), Mathematics, BrownBoost, LPBoost, Multiclass classification
Abstract: We describe several improvements to Freund and Schapire's AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each of their predictions. We give a simplified analysis of AdaBoost in this setting, and we show how this analysis can be used to find improved parameter settings as well as a refined criterion for training weak hypotheses. We give a specific method for assigning confidences to the predictions of decision trees, a method closely related to one used by Quinlan. This method also suggests a technique for growing decision trees which turns out to be identical to one proposed by Kearns and Mansour. We focus next on how to apply the new boosting algorithms to multiclass classification problems, particularly to the multi-label case in which each example may belong to more than one class. We give two methods for attacking this problem, plus a third method based on output coding. One of these leads to a new method for handling the single-label case which is simpler but as effective as techniques suggested earlier by Schapire. Finally, we give some experimental results comparing a few of the algorithms discussed in this paper.
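To make the abstract's central idea concrete, here is a minimal sketch of binary AdaBoost with decision stumps, where each round's weak hypothesis receives a confidence weight alpha derived from its weighted error. This is an illustrative implementation, not the paper's code; all function names and the toy data are assumptions.

```python
import numpy as np

def train_stump(X, y, w):
    """Find the threshold stump minimizing weighted error on labels in {-1, +1}."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    return best

def adaboost(X, y, T=10):
    """Run T rounds of boosting, returning weighted stumps."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # initial distribution over examples
    stumps = []
    for _ in range(T):
        err, j, thr, sign = train_stump(X, y, w)
        eps = max(err, 1e-10)        # guard against zero error
        alpha = 0.5 * np.log((1 - eps) / eps)   # confidence weight for this round
        pred = sign * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)          # upweight misclassified examples
        w /= w.sum()                             # renormalize (divide by Z_t)
        stumps.append((alpha, j, thr, sign))
    return stumps

def predict(stumps, X):
    """Combined classifier: sign of the alpha-weighted vote."""
    score = sum(a * s * np.where(X[:, j] <= t, 1, -1) for a, j, t, s in stumps)
    return np.sign(score)
```

The paper's generalization lets the weak hypotheses themselves output real-valued (confidence-rated) predictions, with the same exponential weight update and normalization by Z_t.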