Preliminary Report Assessing Resident and Faculty Evaluations Across Radiation Oncology Programs

G. Rajeev-Kumar, R. Manjunath, M. M. Harkenrider, P. Das, R. D. Tendulkar, K. S. Corbin, R. Jagsi, G. Marwaha, K. L. Johung, S. B. Evans, M. S. Jawad, E. T. Shinohara, L. T. Komarnicky, A. R. Yeung, M. Buckstein, D. W. Golden, Y. Hasan

Research output: Contribution to journal › Article › peer-review


PURPOSE/OBJECTIVE(S): Across training programs, resident evaluations are based on the Accreditation Council for Graduate Medical Education (ACGME) Six Core Competencies: practice-based learning and improvement (PBLI), patient care (PC), systems-based practice (SBP), medical knowledge (MK), interpersonal and communication skills (ICS), and professionalism (PR). Because few studies have examined resident assessment within radiation oncology (RO), we analyze the characteristics of existing assessment methods, as well as the factors involved in residents' evaluations of faculty. We hypothesize that both faculty and resident evaluations vary significantly in length and in the criteria used. MATERIALS/METHODS: Twelve academic RO residency programs provided evaluation forms: faculty evaluations of residents (FRE; n = 12) and resident evaluations of faculty (RFE; n = 11). Data on the frequency and types of questions were collected. RFE questions were coded into 9 categories: teaching skills, patient care, personal qualities (approachability, responsiveness, demeanor, and attitude), result of rotation (the degree to which faculty increased knowledge, desire to learn, and independent inquiry by the end of the rotation), knowledge, mentoring skills, learning climate, research, and communication. Analysis of variance (ANOVA) was used to test for differences between institutions and between categories. RESULTS: Across all institutions, the FRE was based on the Six Core Competencies, with an average of 19 questions in total (standard deviation [SD] 11, range 5-47). PR had the most questions (mean 3.7, SD 2.9), followed by PC (mean 3.3, SD 2.8); SBP and PBLI had the fewest. ANOVA showed no significant variation in the number of questions between categories (F = 0.78, P = 0.6). However, the mean number of questions used to assess the competencies differed significantly across institutions (F = 6.6, P < 0.01).
The RFE varied in length, formatting, and content across institutions (F = 7.8, P < 0.01). Teaching and personal qualities were evaluated most often, with 9/11 institutions posing ≥1 question about these factors. The mean ± SD number of RFE questions per category was: personal qualities 4.3 ± 3.1, teaching skills 3.9 ± 2.0, communication 3.7 ± 3.0, result of rotation 3.7 ± 2.1, knowledge 3.5 ± 2.6, patient care 2.7 ± 1.5, mentoring skills 2.7 ± 1.5, learning climate 2.0 ± 0.6, and research 2.0 ± 1.0. CONCLUSION: The FRE is based primarily on the ACGME core competencies for RO, whereas the RFE varies by institution. Standardized RFE criteria should be developed to obtain feedback that can be used to improve residency programs. This study is part of a larger project collecting resident and faculty perspectives, with the ultimate goal of developing consensus FRE and RFE recommendations.
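For readers unfamiliar with the test used above, a one-way ANOVA compares group means by taking the ratio of between-group to within-group variance. The following is a minimal sketch using only the Python standard library; the question counts are hypothetical placeholders for illustration only, not data from this study.

```python
from statistics import mean

def one_way_anova(groups):
    """Return the F statistic for a one-way ANOVA across groups."""
    grand = mean(x for g in groups for x in g)
    n_total = sum(len(g) for g in groups)
    k = len(groups)
    # Between-group sum of squares: spread of group means around the grand mean
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: spread of observations around their group mean
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_between = k - 1
    df_within = n_total - k
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical question counts per competency at three example programs
f_stat = one_way_anova([[5, 3, 2, 4], [2, 6, 1, 3], [1, 2, 1, 2]])
print(f"F = {f_stat:.2f}")  # → F = 1.95
```

The resulting F statistic is compared against an F distribution with (k − 1, n − k) degrees of freedom to obtain the P values reported above; a dedicated routine such as `scipy.stats.f_oneway` would return both in practice.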

Original language: English (US)
Pages (from-to): e191
Journal: International Journal of Radiation Oncology, Biology, Physics
Issue number: 3
State: Published - Nov 1 2021

ASJC Scopus subject areas

  • Radiation
  • Oncology
  • Radiology, Nuclear Medicine and Imaging
  • Cancer Research


