Author: John Lafferty
Keywords: Pattern recognition, Exponential function, Applied mathematics, Entropy (information theory), Inference, AdaBoost, Quadratic equation, Artificial intelligence, Additive model, Bregman divergence, Legendre transformation, Mathematics
Abstract: We present a framework for designing incremental learning algorithms derived from generalized entropy functionals. Our approach is based on the use of Bregman divergences together with the associated class of additive models constructed using the Legendre transform. A particular one-parameter family is shown to yield loss functions that include the log-likelihood criterion of logistic regression as a special case, and that closely approximate the exponential loss used in AdaBoost (Schapire et al.) as the natural parameter varies. We also show how a quadratic approximation to the gain in divergence results in a weighted least-squares criterion. This leads to an algorithm that builds upon and extends the recent interpretation of boosting in terms of additive models proposed by Friedman, Hastie, and Tibshirani.
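The abstract contrasts two losses at the endpoints of the family it describes: the log-likelihood criterion of logistic regression and the exponential loss of AdaBoost. A minimal sketch of that comparison, written as margin-based losses (the paper's exact parametric family is not reproduced here; only the two well-known endpoint losses are shown):

```python
import math

def logistic_loss(margin):
    # Negative log-likelihood of logistic regression, written as a
    # function of the margin y * f(x).
    return math.log(1.0 + math.exp(-margin))

def exponential_loss(margin):
    # Exponential loss minimized by AdaBoost.
    return math.exp(-margin)

# Both losses upper-bound the 0/1 misclassification error and decay
# similarly for large positive margins, but the exponential loss
# penalizes confidently wrong predictions (negative margins) much
# more aggressively.
for m in (-2.0, 0.0, 2.0):
    print(f"margin={m:+.1f}  logistic={logistic_loss(m):.4f}  "
          f"exponential={exponential_loss(m):.4f}")
```

At zero margin the logistic loss equals log 2 while the exponential loss equals 1; at margin -2 the exponential loss is already more than three times the logistic loss, which illustrates why the two criteria can behave differently on noisy examples even though they coincide asymptotically.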