Development and Pilot Testing of a Programmatic System for Competency Assessment in US Anesthesiology Residency Training.
Journal
Anesthesia and Analgesia
ISSN: 1526-7598
Abbreviated title: Anesth Analg
Country: United States
NLM ID: 1310650
Publication information
Publication date: 06 Oct 2023
History:
medline: 06 Oct 2023
pubmed: 06 Oct 2023
entrez: 06 Oct 2023
Status: aheadofprint
Abstract
BACKGROUND
In 2018, a set of entrustable professional activities (EPAs) and procedural skills assessments was developed for anesthesiology training, but they did not assess all the Accreditation Council for Graduate Medical Education (ACGME) milestones. The aims of this study were to (1) remap the 2018 EPA and procedural skills assessments to the revised ACGME Anesthesiology Milestones 2.0, (2) develop new assessments that, combined with the original assessments, create a system of assessment addressing all level 1 to 4 milestones, and (3) provide evidence for the validity of the assessments.
METHODS
Using a modified Delphi process, a panel of anesthesiology education experts remapped the original assessments developed in 2018 to the Anesthesiology Milestones 2.0 and developed new assessments to create a system that assessed all level 1 through 4 milestones. Following a 24-month pilot at 7 institutions, the number of EPA and procedural skill assessments and mean scores were computed at the end of the academic year. Milestone achievement and subcompetency data for assessments from a single institution were compared to scores assigned by the institution's clinical competency committee (CCC).
RESULTS
New assessment development, 2 months of testing and feedback, and revisions resulted in 5 new EPAs, 11 nontechnical skills assessments (NTSAs), and 6 objective structured clinical examinations (OSCEs). Combined with the original 20 EPAs and procedural skills assessments, the new system of assessment addresses 99% of level 1 to 4 Anesthesiology Milestones 2.0. During the 24-month pilot, aggregate mean EPA and procedural skill scores significantly increased with year in training. System subcompetency scores correlated significantly with 15 of 23 (65.2%) corresponding CCC scores at a single institution, but 8 correlations (36.4%) were <0.30, indicating poor correlation.
CONCLUSIONS
A panel of experts developed a set of EPAs, procedural skills assessments, NTSAs, and OSCEs to form a programmatic system of assessment for anesthesiology residency training in the United States. The method used to develop and pilot test the assessments, the progression of assessment scores with time in training, and the correlation of assessment scores with CCC scoring of milestone achievement provide evidence for the validity of the assessments.
Identifiers
pubmed: 37801598
doi: 10.1213/ANE.0000000000006667
pii: 00000539-990000000-00639
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Copyright information
Copyright © 2023 International Anesthesia Research Society.
Conflict of interest statement
The authors declare no conflicts of interest.