Published online by Cambridge University Press: 02 May 2019
Introduction: Competency-based medical education (CBME) relies on pragmatic assessment to inform trainee progression decisions. It is unclear whether face-to-face workplace-based assessment (WBA) scoring by faculty reflects their true perception of trainee competence, as many factors influence individual assessments. To better defend competence committee decisions, it is critical to understand how accurately WBAs reflect faculty's honest perception of resident competence and entrustment.

Methods: To best capture faculty perception of trainee competence, we created a periodic performance assessment (PPA) tool for anonymous faculty assessment of residents after repeated clinical interactions. PPA surveys were distributed to full-time emergency medicine (EM) faculty at a single Canadian FRCPC-EM training site. Faculty were asked to score residents on entrustable professional activities (EPAs) based on encounters over the previous 6 months and were advised that all data would be anonymized. All WBA scores for FRCPC-EM residents (N = 21) were collected from the 6 months preceding PPA completion. Analysis compared paired WBA and PPA entrustment scores for a given resident, faculty member, and EPA using Wilcoxon signed-rank tests and Spearman correlations. Data were analyzed across faculty, across EPAs, and across both faculty and EPAs.

Results: About half (17/33) of invited full-time EM faculty participated. Overall, anonymous PPAs had significantly lower mean scores than face-to-face WBAs (3.61-3.69 vs. 3.92-4.06; p < 0.001 for all groupings). Individual WBAs had a low-to-moderate correlation with individual PPAs (rho = 0.44). When scores were averaged across (1) faculty or (2) EPAs, the correlation increased but remained moderate (rho = 0.53 and 0.54, respectively). When scores for an individual resident were averaged across (3) both faculty and EPAs, there was a strong correlation between WBA and PPA (rho = 0.86).
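The paired analysis described above can be sketched in Python using scipy. This is a minimal illustration of the statistical approach only (a Wilcoxon signed-rank test on paired scores plus a Spearman rank correlation); the arrays below are synthetic stand-ins on a 1-5 entrustment scale, not the study's data.

```python
# Sketch of the abstract's paired analysis: Wilcoxon signed-rank test and
# Spearman correlation on matched WBA/PPA entrustment scores.
# Data are SYNTHETIC illustrations, not the study's actual scores.
import numpy as np
from scipy.stats import wilcoxon, spearmanr

rng = np.random.default_rng(0)

# Simulated paired scores (1-5 scale) for resident/faculty/EPA pairings:
wba = rng.integers(3, 6, size=100).astype(float)          # face-to-face WBA
ppa = np.clip(wba - rng.integers(0, 2, size=100), 1, 5)   # anonymous PPA, skewed lower

# Non-parametric test for a systematic difference between paired scores
stat, p = wilcoxon(wba, ppa)

# Rank correlation between the two measures of the same construct
rho, p_rho = spearmanr(wba, ppa)

print(f"mean WBA = {wba.mean():.2f}, mean PPA = {ppa.mean():.2f}")
print(f"Wilcoxon p = {p:.4g}, Spearman rho = {rho:.2f}")
```

In the study, the same comparison was repeated on scores averaged across faculty, across EPAs, and across both, which is where the reported rise in correlation (0.44 up to 0.86) comes from.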
Conclusion: There is only moderate correlation between an individual faculty member's WBAs and their anonymous longitudinal entrustment ratings for a given resident on a specific EPA. These results suggest caution when interpreting individual WBA scores in the context of high-stakes decisions. Aggregating scores across multiple faculty and/or multiple EPAs substantially increased the correlation between WBA and PPA. These findings highlight the importance of using WBA scores aggregated across multiple assessors and EPAs for high-stakes resident progression decisions, to minimize noise and bias in individual assessments.