Authors: Junbin Gao, Paul W. Kwan, Daming Shi
DOI: 10.1016/J.NEUNET.2009.07.001
Keywords:
Abstract: Kernelized LASSO (Least Absolute Selection and Shrinkage Operator) has been investigated in two separate recent papers [Gao, J., Antolovich, M., & Kwan, P. H. (2008). L1 LASSO and its Bayesian inference. In W. Wobcke, & M. Zhang (Eds.), Lecture notes in computer science: Vol. 5360 (pp. 318-324); Wang, G., Yeung, D. Y., & Lochovsky, F. (2007). The kernel path in kernelized LASSO. In International conference on artificial intelligence and statistics (pp. 580-587). San Juan, Puerto Rico: MIT Press]. This paper is concerned with learning kernels under the LASSO formulation by adopting a generative Bayesian inference approach. A new robust learning algorithm is proposed that produces a sparse kernel model with the capability of learning regularized parameters and kernel hyperparameters. A comparison with state-of-the-art methods for constructing sparse regression models, such as the relevance vector machine (RVM) and the local regularization assisted orthogonal least squares (LROLS), is given. The proposed algorithm is also demonstrated to possess considerable computational advantages.
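The abstract does not give the algorithm itself, but the basic idea of kernelized LASSO can be illustrated with a minimal sketch: the regression function is expanded over kernel evaluations at the training points, f(x) = Σᵢ αᵢ K(x, xᵢ), and an L1 penalty on the coefficients α drives most of them to zero, yielding a sparse kernel model. The sketch below is an illustrative assumption, not the paper's Bayesian inference procedure; the RBF kernel width, the penalty strength, and the toy `sinc` data are all choices made here for demonstration.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Toy 1-D regression data (hypothetical example, not from the paper)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(60)

def rbf_kernel(A, B, gamma=0.5):
    # Gram matrix of the Gaussian (RBF) kernel; gamma is an assumed width
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Kernelized LASSO: L1-penalized least squares over kernel features.
# The design matrix is the Gram matrix K, so each coefficient alpha_i
# weights the kernel centred on training point x_i.
K = rbf_kernel(X, X)
model = Lasso(alpha=0.01, max_iter=10000).fit(K, y)

# Sparsity: only a few alpha_i survive the L1 shrinkage
n_active = int(np.sum(model.coef_ != 0))
print(f"active kernel terms: {n_active} of {len(X)}")
```

In this coordinate-descent formulation the sparsity level is controlled by the single penalty `alpha`; the point of the Bayesian treatment discussed in the abstract is precisely to infer such regularization parameters and kernel hyperparameters from the data rather than fix them by hand.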