Abstract: We present a common probabilistic framework for kernel or spline smoothing methods, including popular architectures such as Gaussian processes and Support Vector machines. We identify the problem of unnormalized loss functions and suggest a general technique to overcome this problem, at least approximately. We give an intuitive interpretation of the effect a loss function can induce, by comparing Support Vector classification (SVC) with Gaussian process classification (GPC), a nonparametric generalization of logistic regression. This also relates SVC to boosting techniques. We propose a variational Bayesian model selection algorithm for normalized loss functions. This algorithm has wider applicability than other previously suggested techniques and exhibits comparable performance in cases where both are applicable. We discuss the results of a substantial number of experiments in which we applied the variational algorithm to real-world tasks and compared it with a range of known methods. The scope of this thesis is to provide a bridge between the fields of Statistical Learning Theory and Bayesian statistics; some of the material is of a tutorial nature and will, we hope, be useful to researchers in both fields.