Author: Leandro Pardo
DOI:
Keywords:
Abstract (table of contents):
1. Divergence Measures: Definition and Properties (Introduction; Phi-divergence Measures between Two Probability Distributions: Definition and Properties; Other Divergence Measures between Two Probability Distributions; Divergence among k Populations; Phi-disparities; Exercises; Answers to Exercises)
2. Entropy as a Measure of Diversity: Sampling Distributions (Phi-entropies: Asymptotic Distribution; Testing and Confidence Intervals for Phi-entropies; Multinomial Populations; Maximum Entropy Principle and Statistical Inference on Condensed Ordered Data)
3. Goodness-of-Fit: Simple Null Hypothesis (Phi-divergences and Goodness-of-Fit with Fixed Number of Classes; Phi-divergence Test Statistics under Sparseness Assumptions; Nonstandard Problems: Tests Based on Phi-divergences)
4. Optimality of Phi-divergence Test Statistics in Goodness-of-Fit (Asymptotic Efficiency; Exact and Asymptotic Moments: Comparison; Second Order Approximation; Exact Powers Based on Exact Critical Regions; Small Sample Comparisons)
5. Minimum Phi-divergence Estimators (Maximum Likelihood and Minimum Phi-divergence Estimators; Properties of the Minimum Phi-divergence Estimator; Normal Mixtures; Minimum Phi-divergence Estimator with Constraints)
6. Goodness-of-Fit: Composite Null Hypothesis
7. Testing Loglinear Models Using Phi-divergence Test Statistics (Loglinear Models; Asymptotic Results; Simulation Study)
8. Phi-divergence Measures in Contingency Tables (Independence; Symmetry; Marginal Homogeneity; Quasi-symmetry)
9. Testing in General Populations (Simple Null Hypotheses: Wald, Rao, Wilks and Phi-divergence Test Statistics; Composite Null Hypothesis; Multi-sample Problem; Some Topics in Multivariate Analysis)
References. Index.