Authors: Lev V. Utkin, Andrea Wiencierz
DOI: 10.1016/J.INS.2015.04.037
Keywords:
Abstract: In this paper, generalized versions of two ensemble methods for regression based on variants of the original AdaBoost algorithm are proposed. The generalization consists in restricting the unit simplex of instance weights to a smaller set of weighting probabilities. Various imprecise statistical models can be used to obtain the restricted sets of probabilities, whose sizes each depend on a single parameter. For particular choices of this parameter, the proposed algorithms reduce to the standard AdaBoost-based methods for regression. The main advantage compared with the basic algorithms is that the proposed methods have less tendency to over-fitting, because the weights of hard instances are restricted. Several simulations and applications furthermore indicate a better performance in comparison with the corresponding standard methods.
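The abstract's core idea of restricting the unit simplex of instance weights can be sketched as follows. This is a hypothetical illustration only, not the authors' algorithm: it assumes a linear-vacuous (epsilon-contamination) mixture centered at the uniform distribution as the imprecise model, so the feasible weight set is {eps * p + (1 - eps) * uniform : p in the simplex}. Mapping raw boosting weights into this set bounds how much weight any hard instance can accumulate; eps = 1 recovers the unrestricted (standard AdaBoost) weights, consistent with the abstract's remark that particular parameter choices reduce to the standard methods.

```python
import numpy as np

def restrict_weights(weights, eps):
    """Map a boosting weight vector into a shrunken simplex.

    Hypothetical sketch of the restriction idea: the unit simplex is
    replaced by the smaller set {eps * p + (1 - eps) * u}, where u is
    the uniform distribution and 0 <= eps <= 1. Extreme weights on
    hard instances are thereby pulled toward 1/n, which is the
    over-fitting safeguard the abstract describes.
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalize onto the simplex
    n = weights.size
    uniform = np.full(n, 1.0 / n)
    # eps = 1: unchanged (standard AdaBoost weights);
    # eps = 0: fully uniform weights (maximal restriction).
    return eps * weights + (1.0 - eps) * uniform

w = np.array([0.7, 0.2, 0.1])
print(restrict_weights(w, 1.0))  # -> [0.7 0.2 0.1]
print(restrict_weights(w, 0.0))  # -> uniform weights
```

In a boosting loop, this mapping would be applied to the instance weights after each reweighting step; the single parameter eps then controls the size of the restricted probability set, matching the abstract's description.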