Ensemble Construction via Designed Output Distortion

Author: Stefan W. Christensen

DOI: 10.1007/3-540-44938-8_29

Keywords:

Abstract: A new technique for generating regression ensembles is introduced in the present paper. The technique is based on earlier work promoting model diversity through the injection of noise into the outputs; it differs from those methods in its rigorous requirement that the mean of the displacements applied to any data point's output value be exactly zero. It is illustrated how even the introduction of extremely large displacements may lead to prediction accuracy superior to that achieved by bagging. It is further demonstrated that ensembles of models with very high bias may perform much better than a single model of the same bias, defying the conventional belief that ensembling such models is not purposeful. Finally, an extension of the technique to classification is outlined.
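A minimal sketch of the core idea, for illustration only: each ensemble member is trained on targets perturbed by designed displacements that are centered so their mean across the members is exactly zero for every data point. The function names, the Gaussian draw, and the decision-tree base learner are assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor  # illustrative base learner


def build_distorted_ensemble(X, y, n_members=25, noise_scale=1.0, seed=0):
    """Train an ensemble on copies of y distorted by zero-mean displacements.

    For every training point, the displacements applied across the ensemble
    members are explicitly centered, so their mean is exactly zero (the
    requirement that distinguishes this from plain output-noise injection).
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    # Draw raw displacements for each (member, data point) pair ...
    d = rng.normal(scale=noise_scale, size=(n_members, n))
    # ... then enforce a zero mean per data point across the members.
    d -= d.mean(axis=0, keepdims=True)

    members = []
    for m in range(n_members):
        model = DecisionTreeRegressor()
        model.fit(X, y + d[m])  # each member sees a differently distorted target
        members.append(model)
    return members


def predict_ensemble(members, X):
    """Average the member predictions (standard regression ensembling)."""
    return np.mean([m.predict(X) for m in members], axis=0)
```

Because the displacements cancel exactly at every training point, the ensemble mean is fitted to undistorted targets while the individual members remain diverse; `noise_scale` controls how aggressive the distortion is.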

References (12)
Amanda J. C. Sharkey, Variance Reduction via Noise and Bias Constraints, Perspectives in Neural Computing, pp. 163-178 (1999), doi:10.1007/978-1-4471-0793-4_7
William H. Press, Brian P. Flannery, Saul A. Teukolsky, William T. Vetterling, Numerical Recipes in Pascal (1989)
Leo Breiman, Randomizing Outputs to Increase Prediction Accuracy, Machine Learning, vol. 40, pp. 229-242 (2000), doi:10.1023/A:1007682208299
Amanda J. C. Sharkey, On Combining Artificial Neural Nets, Connection Science, vol. 8, pp. 299-314 (1996), doi:10.1080/095400996116785
Yuval Raviv, Nathan Intrator, Bootstrapping with Noise: An Effective Regularization Technique, Connection Science, vol. 8, pp. 355-372 (1996), doi:10.1080/095400996116811
Anders Krogh, Jesper Vedelsby, Neural Network Ensembles, Cross Validation, and Active Learning, Advances in Neural Information Processing Systems, vol. 7, pp. 231-238 (1994)
Peter Sollich, Anders Krogh, Learning with Ensembles: How Overfitting Can Be Useful, Advances in Neural Information Processing Systems, vol. 8, pp. 190-196 (1995)
J. A. Nelder, R. Mead, A Simplex Method for Function Minimization, The Computer Journal, vol. 7, pp. 308-313 (1965), doi:10.1093/COMJNL/7.4.308
Leo Breiman, Bagging Predictors, Machine Learning, vol. 24, pp. 123-140 (1996), doi:10.1023/A:1018054314350