Authors: Neil D. Lawrence, Lorenzo Rosasco, Mauricio A. Álvarez
DOI:
Keywords:
Abstract: Kernel methods are among the most popular techniques in machine learning. From a regularization perspective they play a central role in learning theory, as they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a probabilistic perspective they are key in the context of Gaussian processes, where the kernel function is known as the covariance function. Traditionally, kernel methods have been used in supervised learning problems with scalar outputs, and indeed there has been a considerable amount of work devoted to designing and learning kernels. More recently there is an increasing interest in methods that deal with multiple outputs, motivated partially by frameworks like multitask learning. In this monograph, we review different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and functional methods.