Bias-variance, regularization, instability and stabilization

Author: L. Breiman

Abstract: This chapter is concerned with some of the fundamentals of predicting numerical outcomes, known in statistics as regression. Here is a road map: using the test-set definition of mean-squared prediction error, we will see that this error can be decomposed into two major components, bias and variance. The idea of regularization is to construct a sequence of predictors that begins at high variance-low bias and goes to low variance-high bias. The problem then becomes selecting the member of this sequence having the lowest bias-variance sum. How well we are able to do this turns out to depend on the stability of the method for constructing predictors. Many methods, including decision trees and neural nets, are unstable. But unstable procedures can be stabilized, leading to significant improvements in accuracy.
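The bias-variance decomposition sketched in the abstract can be illustrated with a small simulation (a sketch, not taken from the chapter; the sine target, the polynomial predictor family, and the sample sizes are illustrative choices). Repeatedly drawing training sets and fitting polynomials of a fixed degree, we estimate the squared bias and the variance of the predictor at fixed test points; low degrees play the role of the low variance-high bias end of the regularization sequence, high degrees the high variance-low bias end:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # True regression function (an arbitrary choice for illustration)
    return np.sin(2 * np.pi * x)

def simulate_decomposition(degree, n_train=30, n_reps=200, noise_sd=0.3):
    """Estimate squared bias and variance of a degree-d polynomial
    predictor, averaged over fixed test points, by Monte Carlo over
    independently drawn training sets."""
    x_test = np.linspace(0.0, 1.0, 50)
    preds = np.empty((n_reps, x_test.size))
    for r in range(n_reps):
        x = rng.uniform(0.0, 1.0, n_train)
        y = f(x) + rng.normal(0.0, noise_sd, n_train)
        coef = np.polyfit(x, y, degree)
        preds[r] = np.polyval(coef, x_test)
    mean_pred = preds.mean(axis=0)          # average predictor over training sets
    bias2 = np.mean((mean_pred - f(x_test)) ** 2)
    variance = np.mean(preds.var(axis=0))   # spread of predictors around their mean
    return bias2, variance

for d in (1, 3, 9):
    b2, var = simulate_decomposition(d)
    print(f"degree {d}: bias^2 = {b2:.3f}, variance = {var:.3f}")
```

Running the loop shows the trade-off directly: the degree-1 fit has large squared bias and small variance, while the degree-9 fit has small bias and large variance, with an intermediate degree minimizing their sum.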
