Authors: Eisha Wali, Jayant M. Pinto, Melissa Cappaert, Marcie Lambrix, Angela D. Blood
DOI: 10.1016/J.SURG.2016.03.026
Keywords:
Abstract:

Background: We systematically reviewed the literature concerning simulation-based teaching and assessment of Accreditation Council for Graduate Medical Education professionalism competencies to elucidate best practices and facilitate further research.

Methods: A systematic review of the English-language literature using the search terms "professionalism" and "simulation(s)" yielded 697 abstracts. Two independent raters chose abstracts that (1) focused on graduate medical education, (2) described a simulation method, and (3) used simulation to train or assess professionalism. Fifty articles met these criteria; seven were excluded for lack of relevant information. The raters, 6 professionals with simulation and clinical experience, discussed 5 of these articles as a group; they calibrated their coding and applied refinements, resulting in a final, iteratively developed evaluation form. They then divided into 2 teams to read the remaining articles. Overall, 15 were eliminated and 28 underwent final analysis.

Results: Papers addressed a heterogeneous range of content via multiple methods. The most common specialties represented were surgery (46.4%), pediatrics (17.9%), and emergency medicine (14.3%). Sixteen (57%) referenced a professionalism framework; 14 (50%) incorporated an assessment tool; 17 (60.7%) reported debriefing participants, though with limited detail. Twenty-three (82.1%) evaluated their programs, mostly using subjective trainee reports.

Conclusion: Despite early innovation, reporting of professionalism training is nonstandardized in methods and terminology and lacks the details required for replication. We offer minimum standards for future professionalism-focused reports as well as a basic framework for better mapping of simulation to the proper targeted domain.