Authors: Shikui Tu, Lei Xu
DOI: 10.1007/S11460-011-0150-2
Keywords:
Abstract: How parameterizations affect model selection performance is an issue that has been ignored or seldom studied, since traditional criteria, such as Akaike's information criterion (AIC), Schwarz's Bayesian information criterion (BIC), the difference of negative log-likelihood (DNLL), etc., perform equivalently on different parameterizations that have equivalent likelihood functions. For factor analysis (FA), in addition to one traditional parameterization (shortly denoted by FA-a), it was previously found that there is another parameterization (shortly denoted by FA-b), and that Bayesian Ying-Yang (BYY) harmony learning gets different performances on FA-a and FA-b. This paper investigates a family of FA parameterizations, where each member (shortly denoted by FA-r) is featured by an integer r, with one end r = 0 corresponding to FA-b and the other end, where r reaches its upper bound, corresponding to FA-a. In comparing BYY with AIC, BIC, and DNLL, we also implement variational Bayes (VB). Several empirical findings are obtained via extensive experiments. First, both BYY and VB perform obviously better on FA-b than on FA-a, and this superiority is reliable and robust. Second, both BYY and VB outperform AIC, BIC, and DNLL, while BYY further outperforms VB considerably, especially on FA-b. Moreover, when FA-a is replaced by FA-b, the gain is higher for VB, while there is no gain for DNLL. Third, this paper demonstrates how each part of the priors incrementally and jointly improves the performances, and shows that using VB to optimize the hyperparameters deteriorates the performances, while using BYY for this purpose can improve the performances.
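For readers unfamiliar with the classical criteria the abstract compares against, the sketch below (not from the paper) illustrates AIC/BIC-based selection of the number of factors for an FA model, using scikit-learn's FactorAnalysis. It is a minimal illustration under stated assumptions: the free-parameter count k follows the traditional FA-a-style parameterization (d·m loadings plus d unique variances, minus m(m−1)/2 for rotational indeterminacy), and the function name select_num_factors and the toy data are hypothetical.

```python
# Minimal sketch (assumption: not the paper's algorithm) of choosing the
# number of factors m by AIC/BIC, two of the criteria named in the abstract.
import numpy as np
from sklearn.decomposition import FactorAnalysis

def select_num_factors(X, max_m):
    """Return the factor counts picked by AIC and by BIC, respectively."""
    n, d = X.shape
    results = []
    for m in range(1, max_m + 1):
        fa = FactorAnalysis(n_components=m).fit(X)
        loglik = fa.score(X) * n           # score() is mean log-likelihood per sample
        k = d * m + d - m * (m - 1) // 2   # free parameters, FA-a-style count
        aic = -2.0 * loglik + 2.0 * k
        bic = -2.0 * loglik + k * np.log(n)
        results.append((m, aic, bic))
    best_aic = min(results, key=lambda t: t[1])[0]
    best_bic = min(results, key=lambda t: t[2])[0]
    return best_aic, best_bic

# Toy usage: data generated from a 3-factor model plus isotropic noise.
rng = np.random.default_rng(0)
n, d, m_true = 500, 10, 3
A = rng.normal(size=(d, m_true))
X = rng.normal(size=(n, m_true)) @ A.T + 0.3 * rng.normal(size=(n, d))
print(select_num_factors(X, max_m=6))
```

The point of the abstract is that such likelihood-plus-penalty criteria are invariant to reparameterizations with equivalent likelihoods, whereas BYY and VB are not; the sketch only shows the invariant baseline, not the FA-r family studied in the paper.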