Criteria for generalized linear model selection based on Kullback's symmetric divergence

Author: Cristina Laura Acion

DOI: 10.17077/ETD.XIBXPNC3

Keywords: Akaike information criterion; Divergence (statistics); Generalized linear model; Statistical model; Applied mathematics; Bias of an estimator; Statistics; Mathematics; Generalized linear mixed model; Model selection; Estimator

Abstract: Model selection criteria frequently arise from constructing estimators of discrepancy measures used to assess the disparity between the data generating model and a fitted approximating model. The widely known Akaike information criterion (AIC) results from utilizing Kullback's directed divergence (KDD) as the targeted discrepancy. Under appropriate conditions, AIC serves as an asymptotically unbiased estimator of KDD. KDD is an asymmetric measure of the separation between two statistical models, meaning that an alternate measure may be obtained by reversing the roles of the models in the definition of the measure. The sum of the two directed divergences is Kullback's symmetric divergence (KSD). A comparison of the measures indicates an important distinction between them. When used to evaluate models that are improperly specified, the directed divergence that serves as the basis for AIC is more sensitive towards detecting overfitted models, whereas its counterpart is more sensitive towards detecting underfitted models. Since KSD combines the information in both measures, it functions as a gauge of model disparity that is arguably more balanced than either of its individual components. With this motivation, we propose three KSD-based criteria for use in the setting of generalized linear models: KICo, KICu, and QKIC. These statistics function under different assumptions and frameworks. As with AIC, KICo and KICu are justified in large-sample maximum likelihood settings; however, the conditions ensuring their asymptotic unbiasedness differ from those for AIC. In settings where the distribution of the response …
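In standard notation (a sketch, not drawn from the dissertation itself): with f the generating model and g a fitted approximating model, the directed and symmetric divergences discussed above can be written as

```latex
% Kullback's directed divergence (KDD), an asymmetric measure:
\[
  I(f, g) \;=\; \mathrm{E}_f\!\left[\log \frac{f(y)}{g(y)}\right],
\]
% Reversing the roles of the models yields the alternate measure I(g, f).
% Kullback's symmetric divergence (KSD) is the sum of the two:
\[
  J(f, g) \;=\; I(f, g) + I(g, f).
\]
% For reference, AIC estimates a variant of I(f, g); for a candidate model
% with k parameters and maximized likelihood L(\hat{\theta}),
%   \mathrm{AIC} = -2 \log L(\hat{\theta}) + 2k,
% while Cavanaugh's (1999) large-sample KSD-based criterion takes the form
%   \mathrm{KIC} = -2 \log L(\hat{\theta}) + 3k.
```

The heavier penalty term in the KSD-based criterion reflects the combined contribution of both directed divergences.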
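To make the asymmetry of the directed divergence concrete, here is a minimal numerical sketch for two discrete distributions; the distributions `p` and `q` are invented for illustration and do not come from the dissertation.

```python
import math

def kdd(p, q):
    """Kullback's directed divergence I(p, q) for discrete distributions.

    Terms with p_i = 0 contribute nothing, following the convention
    0 * log(0/q) = 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two illustrative probability distributions over three outcomes.
p = [0.7, 0.2, 0.1]
q = [0.4, 0.3, 0.3]

i_pq = kdd(p, q)      # directed divergence in one orientation
i_qp = kdd(q, p)      # roles of the two models reversed
j = i_pq + i_qp       # Kullback's symmetric divergence (KSD)

# I(p, q) and I(q, p) are generally unequal, which is the asymmetry
# the abstract refers to; their sum J is symmetric by construction.
print(f"I(p,q) = {i_pq:.4f}, I(q,p) = {i_qp:.4f}, J = {j:.4f}")
```

Swapping `p` and `q` changes each directed divergence but leaves `j` unchanged, which is exactly why the symmetric measure treats over- and underfitting more even-handedly.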

References (68)
Calyampudi Radhakrishna Rao, Yuehua Wu, Sadanori Konishi, Rahul Mukerjee. On model selection. Institute of Mathematical Statistics, pp. 1–57 (2001). doi:10.1214/LNMS/1215540960
Ritei Shibata. Statistical aspects of model selection. In: From Data to Model, pp. 215–240 (1989). doi:10.1007/978-3-642-75007-6_5
Genshiro Kitagawa, Sadanori Konishi. Information Criteria and Statistical Modeling (2007)
Hirotugu Akaike. Prediction and Entropy. Springer, New York, NY, pp. 1–24 (1985). doi:10.1007/978-1-4613-8560-8_1
Inbal Yahav, Galit Shmueli, Robert H. Smith. An Elegant Method for Generating Multivariate Poisson Random Variables. arXiv: Computation (2008)
Nariaki Sugiura. Further analysts of the data by Akaike's information criterion and the finite corrections. Communications in Statistics - Theory and Methods, vol. 7, pp. 13–26 (1978). doi:10.1080/03610927808827599
Joseph M. Hilbe, James W. Hardin. Generalized Estimating Equations (2002)
S. Kullback, R. A. Leibler. On Information and Sufficiency. Annals of Mathematical Statistics, vol. 22, pp. 79–86 (1951). doi:10.1214/AOMS/1177729694
Joseph E. Cavanaugh. A large-sample model selection criterion based on Kullback's symmetric divergence. Statistics & Probability Letters, vol. 42, pp. 333–343 (1999). doi:10.1016/S0167-7152(98)00200-4
Clifford M. Hurvich, Chih-Ling Tsai. Regression and time series model selection in small samples. Biometrika, vol. 76, pp. 297–307 (1989). doi:10.1093/BIOMET/76.2.297