DOI: 10.1109/TELFOR.2014.7034441
Abstract: This paper proposes a model which approximates full covariance matrices in Gaussian mixture models (GMM) with a reduced number of parameters and computations required for likelihood evaluations. In the proposed model, inverse covariance (precision) matrices are approximated using sparsely represented eigenvectors, i.e. each eigenvector of a covariance/precision matrix is represented as a linear combination of a small number of vectors from an overcomplete dictionary. A maximum likelihood algorithm for parameter estimation and its practical implementation are presented. Experimental results on a speech recognition task show that, while keeping the word error rate close to the one obtained by GMMs with full covariance matrices, the proposed model can reduce the number of parameters by 45%.
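
As a rough illustration of the structure the abstract describes, the sketch below evaluates a Gaussian log-likelihood when each eigenvector of the precision matrix is a sparse linear combination of atoms from a shared overcomplete dictionary. All names (sparse_precision_loglik, D, W_sparse, etc.) and the exact parameterisation are assumptions for illustration only; the paper's maximum likelihood estimation algorithm is not reproduced here.

```python
import numpy as np
from scipy import sparse

def sparse_precision_loglik(x, mu, eigvals, W_sparse, D, logdet_P):
    """Log-likelihood of x under one Gaussian whose precision matrix is
    P ~= (D @ W) @ diag(eigvals) @ (D @ W).T, i.e. each eigenvector of P
    is a sparse combination (a column of W) of dictionary atoms in D.
    Names and parameterisation are illustrative, not the paper's."""
    d = x.shape[0]
    # Projection onto the dictionary atoms; with a dictionary shared by all
    # mixture components this product is computed once per feature vector.
    z = D.T @ (x - mu)                      # shape (M,)
    # W_sparse stores only a few nonzero weights per eigenvector, so this
    # product is cheap compared to a dense d x d matrix-vector product.
    y = W_sparse.T @ z                      # shape (d,)
    quad = np.sum(eigvals * y ** 2)         # (x - mu)^T P (x - mu)
    return 0.5 * (logdet_P - d * np.log(2.0 * np.pi) - quad)

# Toy usage on random data, for shape-checking only.
rng = np.random.default_rng(0)
d, M = 4, 10                                # feature dim, dictionary size (M > d)
D = rng.standard_normal((d, M))             # overcomplete dictionary
W = sparse.csc_matrix(                      # 2 nonzero weights per eigenvector
    (rng.standard_normal(2 * d),
     (rng.choice(M, size=2 * d, replace=False), np.repeat(np.arange(d), 2))),
    shape=(M, d))
eigvals = rng.uniform(0.5, 2.0, size=d)     # eigenvalues of the precision
U = D @ W.toarray()                         # approximated eigenvectors
P = U @ np.diag(eigvals) @ U.T              # reconstructed precision matrix
x, mu = rng.standard_normal(d), np.zeros(d)
print(sparse_precision_loglik(x, mu, eigvals, W, D, np.linalg.slogdet(P)[1]))
```

In this sketch the dictionary projection D.T @ (x - mu) can be shared across all mixture components, and each component then only pays for its few nonzero combination weights and eigenvalues; this is the kind of parameter and computation saving the abstract refers to.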