Authors: Junbin Gao, Paul W. Kwan, Yi Guo
DOI: 10.1016/J.NEUCOM.2008.01.027
Keywords: Pattern recognition, Algorithm, Laplace distribution, Dimensionality reduction, Multivariate statistics, Sparse PCA, Mathematics, Principal component analysis, Expectation–maximization algorithm, Heavy-tailed distribution, Artificial intelligence, Bayesian inference
Abstract: Further to our recent work on the robust L1 PCA, we introduce a new version of the model based on the so-called multivariate Laplace distribution proposed in Eltoft et al. [2006. On the multivariate Laplace distribution. IEEE Signal Process. Lett. 13(5), 300-303]. Owing to the heavy tails and high component dependency of this distribution, the model is expected to be more robust against data outliers while fitting component dependencies. Additionally, we demonstrate how a variational approximation scheme enables effective inference of the key parameters in the probabilistic L1-PCA model. By doing so, tractable Bayesian inference can be achieved via an EM-type algorithm.
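For intuition, the multivariate Laplace distribution cited above admits a Gaussian scale-mixture representation (a Gaussian vector scaled by the square root of an exponential mixing variable), which is the source of its heavy tails. A minimal NumPy sketch of sampling under that representation follows; the function name and parameterization are illustrative, not taken from the paper:

```python
import numpy as np

def sample_multivariate_laplace(mu, gamma, n, seed=None):
    """Draw n samples via the Gaussian scale-mixture form:
    x = mu + sqrt(z) * L @ w,  z ~ Exponential(1),  w ~ N(0, I),
    where L is the Cholesky factor of the scale matrix gamma."""
    rng = np.random.default_rng(seed)
    d = len(mu)
    L = np.linalg.cholesky(gamma)
    z = rng.exponential(1.0, size=n)      # exponential mixing variable
    w = rng.standard_normal((n, d))       # independent standard Gaussians
    return mu + np.sqrt(z)[:, None] * (w @ L.T)

# Each marginal has positive excess kurtosis, i.e. heavier tails
# than a Gaussian, which underlies the robustness-to-outliers claim.
X = sample_multivariate_laplace(np.zeros(2), np.eye(2), 200_000, seed=0)
```

The shared mixing variable `z` also couples the components, giving the dependency structure the abstract refers to even when `gamma` is diagonal.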