Authors: Daniela Hofmann, Andrej Gisbrecht, Barbara Hammer
DOI: 10.1007/978-3-642-35230-0_19
Keywords: Kernel (image processing), Gramian matrix, Data structure, Probabilistic logic, Mixture model, Limit (mathematics), Algorithm, Support vector machine, Learning vector quantization, Computer science
Abstract: Robust soft learning vector quantization (RSLVQ) constitutes a probabilistic extension of learning vector quantization (LVQ) based on a labeled Gaussian mixture model of the data. Training optimizes a likelihood ratio and recovers a variant similar to LVQ2.1 in the limit of small bandwidth. Recently, RSLVQ has been extended to a kernel version, thus opening the way towards more general data structures characterized in terms of a Gram matrix only. While leading to state-of-the-art results, this extension has the drawback that the models are no longer sparse and that quadratic training complexity is encountered. In this contribution, we investigate two approximation schemes which lead to sparse models: k-approximations of the prototypes and the Nyström approximation of the Gram matrix. We test the behavior of these approximations in a couple of benchmarks.
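To illustrate one of the two approximation schemes mentioned in the abstract, the following is a minimal sketch of the Nyström approximation of a Gram matrix: a subset of m landmark points is sampled, and the full n-by-n kernel matrix is approximated from the n-by-m and m-by-m blocks, reducing the quadratic cost in n. The function name, kernel signature, and sampling strategy are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nystroem_gram(X, kernel, m, seed=0):
    """Low-rank Nystrom approximation of the Gram matrix K.

    X      : list/array of n data points
    kernel : function (x, y) -> float (a positive semi-definite kernel)
    m      : number of landmark points (m <= n)

    Returns an n-by-n approximation K_nm @ pinv(K_mm) @ K_nm.T,
    which is exact when m = n and K is invertible.
    """
    rng = np.random.default_rng(seed)
    n = len(X)
    # Sample m landmark indices without replacement.
    idx = rng.choice(n, size=m, replace=False)
    # K_nm: kernel values between all n points and the m landmarks.
    K_nm = np.array([[kernel(X[i], X[j]) for j in idx] for i in range(n)])
    # K_mm: kernel values among the landmarks themselves.
    K_mm = K_nm[idx, :]
    # Pseudo-inverse for numerical stability when K_mm is near-singular.
    return K_nm @ np.linalg.pinv(K_mm) @ K_nm.T
```

In kernel RSLVQ, such a low-rank factorization allows prototype updates to be carried out against the m landmark columns instead of the full Gram matrix, which is the source of the sparsity and cost reduction discussed in the abstract.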