Authors: Augustine Kong, Jun S. Liu, Wing Hung Wong
DOI: 10.1080/01621459.1994.10476469
Keywords: Missing data, Model selection, Bayesian inference, Gibbs sampling, Statistics, Algorithm, Bayesian probability, Mathematics, Expectation–maximization algorithm, Posterior probability, Importance sampling
Abstract: For missing data problems, Tanner and Wong have described a data augmentation procedure that approximates the actual posterior distribution of the parameter vector by a mixture of complete data posteriors. Their method of constructing the complete data sets is closely related to the Gibbs sampler. Both require iterations, and, similar to the EM algorithm, convergence can be slow. We introduce in this article an alternative procedure that involves imputing the missing data sequentially and computing appropriate importance sampling weights. In many applications this new procedure works very well without the need for iterations. Sensitivity analysis, influence analysis, and updating with new data can be performed cheaply. Bayesian prediction and model selection can also be incorporated. Examples taken from a wide range of applications are used for illustration.
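The abstract describes the core mechanism: impute missing values one observation at a time from their predictive distribution given everything imputed so far, while accumulating an importance weight equal to the predictive density of each observed piece of data; the posterior is then approximated by the weighted mixture of complete data posteriors. The sketch below illustrates this on a hypothetical toy model not taken from the paper (a Beta–Bernoulli indicator x_t, sometimes missing, with an always-observed Gaussian response y_t); all names such as `sequential_imputation` and `mu` are illustrative assumptions, and the sketch is a minimal demonstration of the idea rather than the authors' implementation.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Assumed toy model (not from the paper): theta ~ Beta(a, b),
# x_t ~ Bernoulli(theta) is sometimes missing, and y_t | x_t ~ N(mu[x_t], 1)
# is always observed.  Sequential imputation draws each missing x_t from its
# predictive distribution given the data and imputations so far, and the
# importance weight accumulates the predictive density of what was observed.

mu = np.array([-1.0, 1.0])  # known component means for x = 0 and x = 1

def sequential_imputation(x_obs, y, a=1.0, b=1.0):
    """One imputation stream: returns (imputed success count s, weight w)."""
    s, t, w = 0.0, 0, 1.0
    for x, yt in zip(x_obs, y):
        p1 = (a + s) / (a + b + t)      # predictive P(x_t = 1 | past)
        lik = norm.pdf(yt, loc=mu)      # [p(y_t | x=0), p(y_t | x=1)]
        if np.isnan(x):                 # x_t missing: weight, then impute
            w *= p1 * lik[1] + (1 - p1) * lik[0]        # p(y_t | past)
            post1 = p1 * lik[1] / (p1 * lik[1] + (1 - p1) * lik[0])
            x = float(rng.random() < post1)             # x_t | y_t, past
        else:                           # x_t observed: weight only
            w *= (p1 if x == 1 else 1 - p1) * lik[int(x)]
        s += x
        t += 1
    return s, w

# Simulate data with roughly 40% of the x's missing.
n, theta_true = 200, 0.7
x_true = rng.random(n) < theta_true
y = rng.normal(mu[x_true.astype(int)], 1.0)
x_obs = np.where(rng.random(n) < 0.4, np.nan, x_true.astype(float))

# Run m independent streams (no iterations needed); the posterior of theta
# is approximated by the weighted mixture of Beta(a + s_i, b + n - s_i).
a, b, m = 1.0, 1.0, 500
draws = [sequential_imputation(x_obs, y, a, b) for _ in range(m)]
s_arr, w_arr = map(np.array, zip(*draws))
w_arr /= w_arr.sum()
post_mean = np.sum(w_arr * (a + s_arr) / (a + b + n))
print(f"weighted posterior mean of theta: {post_mean:.3f}")
```

Because each stream is a single forward pass over the data, re-running with new observations or a perturbed prior only requires updating the weights, which is the sense in which the abstract says sensitivity analysis and updating can be performed cheaply.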