Authors: Holger Dette, Roman Guchenko, Viatcheslav B. Melas
DOI: 10.1080/10618600.2016.1195272
Keywords: Kullback–Leibler divergence; Prior probability; Mathematics; Bayesian experimental design; Artificial intelligence; Focus (optics); Computation; Algorithm; Design of experiments; Machine learning; Bayesian probability; Homoscedasticity
Abstract: An efficient algorithm for the determination of Bayesian optimal discriminating designs for competing regression models is developed, where the main focus is on models with general distributional assumptions beyond the "classical" case of normally distributed homoscedastic errors. For this purpose, we consider a Bayesian version of the Kullback–Leibler (KL) optimality criterion. Discretizing the prior distribution leads to local KL-optimal discriminating design problems for a large number of competing models. All currently available methods either require a large amount of computation time or fail to calculate the design, because they can only deal efficiently with a few model comparisons. In this article, we develop a new method with respect to this criterion. It is demonstrated that it is able to find designs with reasonable accuracy and computational effort in situations ...
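The Bayesian KL-criterion referred to in the abstract can be sketched as follows; the notation below (densities \(f_1, f_2\), design \(\xi\), prior \(\pi\)) is our own illustrative choice and is not taken from the record itself. For two competing models with densities \(f_1(y, x, \theta_1)\) and \(f_2(y, x, \theta_2)\), a design \(\xi\) is assessed via the KL divergence between the models, minimized over the parameters of the model being discriminated against and averaged over the prior:

```latex
% Pointwise KL divergence between the two candidate models at design point x
\mathrm{KL}\bigl(f_2, f_1, x, \theta_1, \theta_2\bigr)
  = \int f_2(y, x, \theta_2)\,
      \log \frac{f_2(y, x, \theta_2)}{f_1(y, x, \theta_1)} \, dy .

% Bayesian KL-optimality criterion: average the local (fixed-theta_2)
% criterion over a prior pi on theta_2; a KL-optimal design maximizes this.
K^{\mathrm{B}}(\xi)
  = \int_{\Theta_2}
      \inf_{\theta_1 \in \Theta_1}
      \int_{\mathcal{X}}
        \mathrm{KL}\bigl(f_2, f_1, x, \theta_1, \theta_2\bigr)\,
      \xi(dx)\, \pi(d\theta_2) .

% Discretizing the prior at support points theta_2^{(1)}, ..., theta_2^{(m)}
% with weights w_1, ..., w_m turns this into a weighted sum of m local
% KL-optimal design problems, one per prior support point:
K^{\mathrm{B}}(\xi)
  \approx \sum_{j=1}^{m} w_j
      \inf_{\theta_1 \in \Theta_1}
      \int_{\mathcal{X}}
        \mathrm{KL}\bigl(f_2, f_1, x, \theta_1, \theta_2^{(j)}\bigr)\,
      \xi(dx) .
```

This makes concrete the abstract's remark that discretizing the prior yields local KL-optimal design problems for a large number of models: each prior support point contributes one model comparison, which is why methods that handle only a few comparisons efficiently become impractical as \(m\) grows.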