Authors: Y. Normandin, S.D. Morgera
DOI: 10.1109/ICASSP.1991.150395
Keywords: Estimation theory, Word error rate, Markov process, NIST, Mutual information, Algorithm, Gaussian, Information theory, Rate of convergence, Computer science, Speech recognition, Hidden Markov model, Vocabulary
Abstract: Recently, Gopalakrishnan et al. (1989) introduced a reestimation formula for discrete HMMs (hidden Markov models) which applies to rational objective functions such as the MMIE (maximum mutual information estimation) criterion. The authors analyze this formula and show how its convergence rate can be substantially improved. They introduce a corrective training algorithm which, when applied to the TI/NIST connected digit database, has made it possible to reduce the string error rate by close to 50%. Gopalakrishnan's result is extended to the continuous case by proposing a new method for estimating the mean and variance parameters of diagonal Gaussian densities.