Improving participant feedback to continuing medical education presenters in internal medicine

A mixed-methods study

Christopher M. Wittich, Karen F. Mauck, Jayawant Mandrekar, Karol A. Gluth, Colin Patrick West, Scott C. Litin, Thomas J. Beckman

Research output: Contribution to journal › Article

7 Citations (Scopus)

Abstract

BACKGROUND: Feedback is essential for improving the skills of continuing medical education (CME) presenters. However, there has been little research on improving the quality of feedback to CME presenters.

OBJECTIVES: To validate an instrument for generating balanced and behavior-specific feedback from a national cross-section of participants to presenters at a large internal medicine CME course.

DESIGN, SETTING, AND PARTICIPANTS: A prospective, randomized validation study with qualitative data analysis that included all 317 participants at a Mayo Clinic internal medicine CME course in 2009.

MEASUREMENTS: An 8-item (5-point Likert scale) CME faculty assessment enhanced study form (ESF) was designed based on literature and expert review. Course participants were randomized to a standard form, a generic study form (GSF), or the ESF. The dimensionality of instrument scores was determined using factor analysis that accounted for clustered data. Internal consistency and interrater reliabilities were calculated. Associations between overall feedback scores and presenter and presentation variables were identified using generalized estimating equations to account for multiple observations within talk and speaker combinations. Two raters reached consensus on qualitative themes and independently analyzed narrative entries for evidence of balanced and behavior-specific comments.

RESULTS: Factor analysis of 5,241 evaluations revealed a unidimensional model for measuring CME presenter feedback. Overall internal consistency (Cronbach's α = 0.94) and interrater reliability (ICC range 0.88-0.95) were excellent. Feedback scores were associated with presenters' academic rank (mean score): Instructor, 4.12; Assistant Professor, 4.38; Associate Professor, 4.56; Professor, 4.70 (p = 0.046). Qualitative analysis revealed that the ESF generated the highest numbers of balanced comments (GSF = 11, ESF = 26; p = 0.01) and behavior-specific comments (GSF = 64, ESF = 104; p = 0.001).

CONCLUSIONS: We describe a practical and validated method for generating balanced and behavior-specific feedback for CME presenters in internal medicine. Our simple method for prompting course participants to give balanced and behavior-specific comments may ultimately provide CME presenters with feedback for improving their presentations.
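
The MEASUREMENTS paragraph names several quantitative analyses: internal consistency (Cronbach's α), interrater reliability (ICC), and generalized estimating equations (GEE) that treat repeated evaluations of the same speaker as clustered. As a rough illustrative sketch only (this is not the authors' code; the column names item_1..item_8, overall, speaker_id, and academic_rank, and the synthetic data, are hypothetical), the Python example below shows how Cronbach's α for an 8-item form and a GEE of overall score on academic rank with within-speaker clustering could be set up using pandas and statsmodels.

```python
# Illustrative sketch, not the authors' analysis: Cronbach's alpha for an
# 8-item rating form and a GEE that accounts for repeated evaluations of the
# same presenter. All column names and the synthetic data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf


def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a respondents x items matrix of Likert ratings."""
    items = items.dropna()
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)


def rank_association(df: pd.DataFrame):
    """GEE of overall feedback score on academic rank, with an exchangeable
    correlation structure within speaker to handle multiple evaluations of
    the same presenter."""
    model = smf.gee(
        "overall ~ C(academic_rank)",
        groups="speaker_id",
        data=df,
        cov_struct=sm.cov_struct.Exchangeable(),
        family=sm.families.Gaussian(),
    )
    return model.fit()


if __name__ == "__main__":
    # Tiny synthetic example purely to show the expected data layout.
    rng = np.random.default_rng(0)
    n = 120
    df = pd.DataFrame(rng.integers(3, 6, size=(n, 8)),
                      columns=[f"item_{i}" for i in range(1, 9)])
    df["overall"] = df.mean(axis=1)
    df["speaker_id"] = rng.integers(0, 10, size=n)
    df["academic_rank"] = rng.choice(
        ["Instructor", "Assistant", "Associate", "Professor"], size=n)

    print("Cronbach's alpha:", round(cronbach_alpha(df.filter(like="item_")), 2))
    print(rank_association(df).summary())
```

The exchangeable correlation structure is one reasonable default when the same presenter is rated many times; the factor analysis and ICC calculations reported in the abstract are not reproduced in this sketch.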

Original language: English (US)
Pages (from-to): 425-431
Number of pages: 7
Journal: Journal of General Internal Medicine
Volume: 27
Issue number: 4
DOI: 10.1007/s11606-011-1894-3
State: Published - Apr 2012

Fingerprint

  • Continuing Medical Education
  • Internal Medicine
  • Statistical Factor Analysis
  • Medical Faculties
  • Validation Studies

Keywords

  • Continuing medical education
  • Feedback
  • Internal medicine
  • Participants

ASJC Scopus subject areas

  • Internal Medicine

Cite this

Improving participant feedback to continuing medical education presenters in internal medicine: A mixed-methods study. / Wittich, Christopher M.; Mauck, Karen F.; Mandrekar, Jayawant; Gluth, Karol A.; West, Colin Patrick; Litin, Scott C.; Beckman, Thomas J.

In: Journal of General Internal Medicine, Vol. 27, No. 4, 04.2012, p. 425-431.

@article{420f674555264cf7a2c7a20564c6a1c0,
title = "Improving participant feedback to continuing medical education presenters in internal medicine: A mixed-methods study",
abstract = "BACKGROUND: Feedback is essential for improving the skills of continuing medical education (CME) presenters. However, there has been little research on improving the quality of feedback to CME presenters. OBJECTIVES: To validate an instrument for generating balanced and behavior-specific feedback from a national cross-section of participants to presenters at a large internal medicine CME course. DESIGN, SETTING, AND PARTICIPANTS: A prospective, randomized validation study with qualitative data analysis that included all 317 participants at a Mayo Clinic internal medicine CME course in 2009. MEASUREMENTS: An 8-item (5-point Likert scales) CME faculty assessment enhanced study form (ESF) was designed based on literature and expert review. Course participants were randomized to a standard form, a generic study form (GSF), or the ESF. The dimensionality of instrument scores was determined using factor analysis to account for clustered data. Internal consistency and interrater reliabilities were calculated. Associations between overall feedback scores and presenter and presentation variables were identified using generalized estimating equations to account for multiple observations within talk and speaker combinations. Two raters reached consensus on qualitative themes and independently analyzed narrative entries for evidence of balanced and behavior- specific comments. RESULTS: Factor analysis of 5,241 evaluations revealed a uni-dimensional model for measuring CME presenter feedback. Overall internal consistency (Cronbach alpha=0.94) and internal consistency reliability (ICC range 0.88-0.95) were excellent. Feedback scores were associated with presenters' academic ranks (mean score): Instructor (4.12), Assistant Professor (4.38), Associate Professor (4.56), Professor (4.70) (p=0.046). Qualitative analysis revealed that the ESF generated the highest numbers of balanced comments (GSF=11, ESF=26; p=0.01) and behavior-specific comments (GSF=64, ESF=104; p=0.001). CONCLUSIONS: We describe a practical and validated method for generating balanced and behavior-specific feedback for CME presenters in internal medicine. Our simple method for prompting course participants to give balanced and behavior-specific comments may ultimately provide CME presenters with feedback for improving their presentations.",
keywords = "Continuing medical education, Feedback, Internal medicine, Participants",
author = "Wittich, {Christopher M.} and Mauck, {Karen F.} and Jayawant Mandrekar and Gluth, {Karol A.} and West, {Colin Patrick} and Litin, {Scott C.} and Beckman, {Thomas J.}",
year = "2012",
month = "4",
doi = "10.1007/s11606-011-1894-3",
language = "English (US)",
volume = "27",
pages = "425--431",
journal = "Journal of General Internal Medicine",
issn = "0884-8734",
publisher = "Springer New York",
number = "4",

}
