Authors: Eric Steinberg, Ethan Cowan, Michelle P. Lin, Anthony Sielicki, Steven Warrington
DOI: 10.5811/WESTJEM.2020.3.46035
Keywords:
Abstract: Introduction: A primary aim of residency training is to develop competence in clinical reasoning. However, there are few instruments that can accurately, reliably, and efficiently assess residents' decision-making ability. This study aimed to externally validate the script concordance test in emergency medicine (SCT-EM), an assessment tool designed for this purpose. Methods: Using established SCT-EM methodology, we compared EM residents' performance on the SCT-EM with that of an expert panel of attending physicians at three urban academic centers. We performed adjusted pairwise t-tests to compare differences between all residents and attending physicians, as well as among resident postgraduate year (PGY) levels. We tested correlation with Accreditation Council for Graduate Medical Education Milestone scores using Pearson's coefficients. Inter-item covariances for SCT items were calculated using the Cronbach's alpha statistic. Results: The SCT-EM was administered to 68 residents and 13 attendings. There was a significant difference in mean scores among groups (mean ± standard deviation: PGY-1, 59 ± 7; PGY-2, 62 ± 6; PGY-3, 60 ± 8; PGY-4, 61; attendings, 73 ± 8; p < 0.01). Post hoc comparisons demonstrated that significant differences occurred only between each PGY level and the attendings (p < 0.01 vs each group). Performance on the SCT-EM and the Milestones was not significantly correlated (r = 0.12, p = 0.35). Internal reliability of the exam was determined using Cronbach's alpha, which was 0.67 for all examinees and 0.89 for the expert-only group. Conclusion: The SCT-EM has limited utility for reliably assessing clinical reasoning in residents. Although it was able to differentiate reasoning ability between residents and faculty, it did not differentiate among PGY levels or correlate with Milestone scores. Furthermore, several limitations threaten its validity, suggesting further study is needed in more diverse settings.
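As a rough illustration of the analyses named in the Methods (adjusted pairwise t-tests, Pearson correlation, Cronbach's alpha), the Python sketch below runs each step on invented data. The group names, sample sizes, scores, and Bonferroni adjustment are assumptions for demonstration only and are not the study's actual data or necessarily its exact procedures.

```python
# Minimal sketch, assuming hypothetical score data; not the study's dataset.
from itertools import combinations

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical SCT-EM scores by group (values invented for illustration).
groups = {
    "PGY-1": rng.normal(59, 7, 20),
    "PGY-2": rng.normal(62, 6, 20),
    "PGY-3": rng.normal(60, 8, 15),
    "PGY-4": rng.normal(61, 7, 13),
    "Attending": rng.normal(73, 8, 13),
}

# Pairwise Welch t-tests with a Bonferroni adjustment (one plausible way to
# "adjust" pairwise comparisons; the paper may have used a different method).
pairs = list(combinations(groups, 2))
for a, b in pairs:
    t, p = stats.ttest_ind(groups[a], groups[b], equal_var=False)
    print(f"{a} vs {b}: t={t:.2f}, adjusted p={min(p * len(pairs), 1.0):.3f}")

# Pearson correlation between resident SCT scores and hypothetical Milestone ratings.
sct = np.concatenate([groups[g] for g in ("PGY-1", "PGY-2", "PGY-3", "PGY-4")])
milestones = rng.normal(3.0, 0.5, sct.size)  # invented Milestone ratings
r, p = stats.pearsonr(sct, milestones)
print(f"Pearson r={r:.2f}, p={p:.2f}")

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (examinees x items) score matrix."""
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    k = items.shape[1]
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical per-item scores for 68 examinees on 30 SCT items.
item_scores = rng.normal(0.6, 0.2, size=(68, 30))
print(f"Cronbach's alpha = {cronbach_alpha(item_scores):.2f}")
```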