Modeling Agreement among Raters

Authors: Martin A. Tanner, Michael A. Young

DOI: 10.1080/01621459.1985.10477157

Keywords: Agreement, Contingency table, Categorical variable, Association, Kappa, Hierarchy, Inter-rater reliability

Abstract: An approach to the modeling of agreement among raters is proposed. By examining a hierarchy of log-linear models, it is shown how one can analyze agreement in a manner analogous to the analysis of association in a contingency table. Specific attention is given to the problems of K-rater agreement and agreement between several observers and a standard. Examples are used to illustrate how this approach provides a general framework for a variety of problem situations.
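The abstract contrasts model-based analysis of agreement with summary indices such as kappa computed from a rater-by-rater contingency table. As a minimal sketch of those ingredients (not the paper's own method), the snippet below computes Cohen's kappa for two raters and the likelihood-ratio statistic G² against the log-linear independence model; the 3×3 table of counts is hypothetical.

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa for a square rater-by-rater contingency table of counts."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_obs = np.trace(table) / n                 # observed proportion of agreement
    row = table.sum(axis=1) / n                 # marginal distribution, rater 1
    col = table.sum(axis=0) / n                 # marginal distribution, rater 2
    p_exp = np.dot(row, col)                    # chance agreement under independence
    return (p_obs - p_exp) / (1.0 - p_exp)

def g2_independence(table):
    """Likelihood-ratio statistic G^2 for the log-linear model of independence."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
    mask = table > 0                            # 0 * log(0) treated as 0
    return 2.0 * np.sum(table[mask] * np.log(table[mask] / expected[mask]))

# Hypothetical cross-classification of two raters over three categories
t = np.array([[20, 5, 1],
              [4, 15, 6],
              [2, 3, 24]])
print(round(cohens_kappa(t), 3))
print(round(g2_independence(t), 2))
```

A large G² indicates the independence model fits poorly, which in the paper's framework motivates adding agreement terms (e.g., for the diagonal cells) to the log-linear hierarchy rather than reducing the table to a single kappa value.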
