Author: Cristina Laura Acion
Keywords: Akaike information criterion, Divergence (statistics), Generalized linear model, Statistical model, Applied mathematics, Bias of an estimator, Statistics, Mathematics, Generalized linear mixed model, Model selection, Estimator
Abstract: Model selection criteria frequently arise from constructing estimators of discrepancy measures used to assess the disparity between the data generating model and a fitted approximating model. The widely known Akaike information criterion (AIC) results from utilizing Kullback's directed divergence (KDD) as the targeted discrepancy. Under appropriate conditions, AIC serves as an asymptotically unbiased estimator of KDD. KDD is an asymmetric measure of separation between two statistical models, meaning that an alternate measure may be obtained by reversing the roles of the models in the definition of the measure. The sum of the two directed divergences is Kullback's symmetric divergence (KSD). A comparison of the two directed divergences indicates an important distinction between the measures. When the models under evaluation are improperly specified, the divergence which serves as the basis for AIC is more sensitive towards detecting overfitted models, whereas its counterpart is more sensitive towards detecting underfitted models. Since KSD combines both measures, it functions as a gauge of model disparity that is arguably more balanced than either of its individual components. With this motivation, we propose three criteria based on KSD for use in the setting of generalized linear models: KICo, KICu, and QKIC. These statistics function under different assumptions and frameworks. As with AIC, KICo and KICu are justified in large-sample maximum likelihood settings; however, their asymptotic unbiasedness holds under conditions more general than those of AIC. In settings where the distribution of the response …
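The asymmetry of KDD and the symmetry of KSD described in the abstract can be illustrated numerically. The sketch below (an illustration, not part of the author's work) uses the closed-form Kullback directed divergence between two univariate Gaussians, `kl_gauss`, a hypothetical helper name; KSD is then simply the sum of the two directed divergences, which is the same in either order.

```python
import numpy as np

def kl_gauss(m1, s1, m2, s2):
    # Closed-form Kullback directed divergence KDD(p || q)
    # between p = N(m1, s1^2) and q = N(m2, s2^2).
    return np.log(s2 / s1) + (s1**2 + (m1 - m2) ** 2) / (2 * s2**2) - 0.5

def ksd_gauss(m1, s1, m2, s2):
    # Kullback's symmetric divergence: the sum of the two
    # directed divergences, with the model roles reversed.
    return kl_gauss(m1, s1, m2, s2) + kl_gauss(m2, s2, m1, s1)

# The two directed divergences between N(0,1) and N(1,4) differ:
d12 = kl_gauss(0.0, 1.0, 1.0, 2.0)
d21 = kl_gauss(1.0, 2.0, 0.0, 1.0)
print(d12, d21)  # asymmetric: d12 != d21

# KSD is symmetric in the two models:
print(ksd_gauss(0.0, 1.0, 1.0, 2.0))
print(ksd_gauss(1.0, 2.0, 0.0, 1.0))  # same value
```

Because KSD pools the overfitting-sensitive and underfitting-sensitive directions into one number, criteria built on it (such as KICo, KICu, and QKIC above) inherit that balance.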