K. Y. Hu1, J. P. Dux1, P. N. Redlich2, R. W. Treat3, T. B. Krausert2, M. J. Malinowski2. 1Medical College of Wisconsin, Department of Surgery, Milwaukee, WI, USA; 2Medical College of Wisconsin, Division of Education, Department of Surgery, Milwaukee, WI, USA; 3Medical College of Wisconsin, Department of Academic Affairs, Milwaukee, WI, USA
Introduction:
Traditional surgical resident education is based on didactic curricula with performance gains reliably assessed by time-honored multiple-choice question (MCQ) exams. MCQ-based assessment is common for junior residents; instruction then transitions to clinical scenario-based teaching with oral competency examinations (OCE) for senior residents. Standardized oral examinations during residency have been reported to significantly improve certifying examination pass rates; however, limited information exists on the impact of oral examinations at the junior resident level. We hypothesized that junior residents would report improved confidence in their clinical performance and increased satisfaction with the inclusion of OCE compared to traditional written post-test evaluations following didactic lectures.
Methods:
We modified our PGY-1 protected block curriculum in June 2016 to include OCE while maintaining the traditional post-test MCQ exams. In each curriculum block, residents were assessed with OCE consisting of two clinical scenarios over 16 minutes in front of an audience of five to seven peers, covering topics addressed in the curriculum’s didactic sessions. At the end of each academic year (2016-2017 and 2017-2018), participants were asked to complete a survey rating the perceived impact of OCE using a combination of 5-point Likert scales (1=poor, 5=excellent) and dichotomous (yes/no) responses. Analyses were performed with IBM® SPSS® 24.0.
Results:
Of the 24 PGY-1 residents (12 in each academic year) who completed the voluntary survey, 91% thought the oral examination experience led to improved clinical performance in complex patient scenarios. Residents perceived that OCE improved their understanding of surgical indications (71%), preoperative work-up (88%), postoperative care (83%), and surgical complications (88%). The majority of residents (88%) rated the quality of teaching during OCE as good or excellent, a rating that correlated with the 92% who found benefit in observing their colleagues being examined (Spearman rho=0.6, p=0.002). Overall, 87% of residents thought OCE served as a better review of didactic materials than a written exam, strongly correlating with those who rated OCE as good or excellent in value as an educational activity (88%, r=0.7, p<0.001). The quality of teaching during OCE also correlated significantly with its value as an educational activity (r=0.6, p=0.001) and with improved review of materials (r=0.6, p=0.001).
Conclusion:
Assessment using OCE during PGY-1 curriculum sessions, coupled with peer observation, adds educational value and enhances confidence in clinical performance compared with traditional MCQ testing. Further study is warranted on the impact of OCE on in-training exam scores and on senior resident mock oral board examination performance.