If you teach them, they will learn: why medical education needs comparative effectiveness research

Author: David A. Cook

DOI: 10.1007/s10459-012-9381-0

Abstract: If you teach a medical student, can they learn? The answer may seem self-evident. After all, undergraduates don't make it into medical school without demonstrating a remarkable capacity to learn and perform well on tests. So asking whether medical students (or other health professions students) are capable of learning should be a superfluous question. Yet education researchers feel compelled to ask this question repeatedly. And, surprisingly (or not), they keep coming up with the same answer. Figure 1 shows the results of over 750 studies, summarized from 4 separate meta-analyses (Cook et al. 2010a, 2008b, 2011a; McGaghie et al. 2011), comparing various forms of training with no intervention. For example, a meta-analysis of Internet-based instruction found 126 studies comparing the intervention with no intervention (either a single-group pretest–posttest study or a no-intervention comparison group; Cook et al. 2008b). Only 2 failed to favor the intervention group for knowledge outcomes, and the average effect size was 1.0, which according to Cohen (1988) would be considered a large effect. Results were similar for skills and behaviors. Another meta-analysis found similar results for computer-based virtual patients (Cook et al. 2010a). Most recently, two meta-analyses of simulation-based education confirmed strong benefits, with effect sizes ranging from 0.8 to 2.0 (Cook et al. 2011a; McGaghie et al. 2011). Moreover, these findings held true across learner subgroups (medical students, postgraduate physician trainees, physicians, nurses and nursing students, and others), across study designs (there were 150 randomized trials), and in multiple subgroup analyses. Even after adjusting for possible publication bias, where a lower actual impact was noted, effect sizes were still moderately large at 0.50.

References (28)
David A. Cook, Anthony J. Levinson, Sarah Garside. Method and reporting quality in health professions education research: a systematic review. Medical Education, vol. 45, pp. 227–238 (2011). doi:10.1111/j.1365-2923.2010.03890.x
William C. McGaghie, S. Barry Issenberg, Elaine R. Cohen, Jeffrey H. Barsuk, Diane B. Wayne. Does Simulation-Based Medical Education With Deliberate Practice Yield Better Results Than Traditional Clinical Education? A Meta-Analytic Comparative Review of the Evidence. Academic Medicine, vol. 86, pp. 706–711 (2011). doi:10.1097/acm.0b013e318217e119
Benjamin Zendejas, Amy T. Wang, Ryan Brydges, Stanley J. Hamstra, David A. Cook. Cost: the missing outcome in simulation-based medical education research: a systematic review. Surgery, vol. 153, pp. 160–176 (2013). doi:10.1016/j.surg.2012.06.025
C. P. Friedman. The research we should be doing. Academic Medicine, vol. 69, pp. 455–457 (1994). doi:10.1097/00001888-199406000-00005
Michael Hochman, Danny McCormick. Characteristics of Published Comparative Effectiveness Studies of Medications. JAMA, vol. 303, pp. 951–958 (2010). doi:10.1001/jama.2010.240
David A. Cook. Avoiding confounded comparisons in education research. Medical Education, vol. 43, pp. 102–104 (2009). doi:10.1111/j.1365-2923.2008.03263.x
David A. Cook, Patricia J. Erwin, Marc M. Triola. Computerized virtual patients in health professions education: a systematic review and meta-analysis. Academic Medicine, vol. 85, pp. 1589–1602 (2010). doi:10.1097/acm.0b013e3181edfe13