TY - JOUR
T1 - Patient outcomes in simulation-based medical education
T2 - A systematic review
AU - Zendejas, Benjamin
AU - Brydges, Ryan
AU - Wang, Amy T.
AU - Cook, David A.
N1 - Funding Information:
Acknowledgements: Contributors: The authors thank Rose Hatala, MD, MSc, Stanley J. Hamstra, PhD, Jason H. Szostek, MD, and Patricia J. Erwin, MLS, for their assistance in the literature search and initial data acquisition. Funders: This work was supported by intramural funds, including an award from the Division of General Internal Medicine, Mayo Clinic. The funding sources for this study played no role in the design and conduct of the study; in the collection, management, analysis, and interpretation of the data; or in the preparation of the manuscript. The funding sources did not review the manuscript.
PY - 2013/8
Y1 - 2013/8
N2 - OBJECTIVES: Evaluating the patient impact of health professions education is a societal priority with many challenges. Researchers would benefit from a summary of topics studied and potential methodological problems. We sought to summarize key information on patient outcomes identified in a comprehensive systematic review of simulation-based instruction. DATA SOURCES: Systematic search of MEDLINE, EMBASE, CINAHL, PsycINFO, Scopus, key journals, and bibliographies of previous reviews through May 2011. STUDY ELIGIBILITY: Original research in any language measuring the direct effects on patients of simulation-based instruction for health professionals, in comparison with no intervention or other instruction. APPRAISAL AND SYNTHESIS: Two reviewers independently abstracted information on learners, topics, study quality including unit of analysis, and validity evidence. We pooled outcomes using random effects. RESULTS: From 10,903 articles screened, we identified 50 studies reporting patient outcomes for at least 3,221 trainees and 16,742 patients. Clinical topics included airway management (14 studies), gastrointestinal endoscopy (12), and central venous catheter insertion (8). There were 31 studies involving postgraduate physicians and seven studies each involving practicing physicians, nurses, and emergency medicine technicians. Fourteen studies (28%) used an appropriate unit of analysis. Measurement validity was supported in seven studies reporting content evidence, three reporting internal structure, and three reporting relations with other variables. The pooled Hedges' g effect size for 33 comparisons with no intervention was 0.47 (95% confidence interval [CI], 0.31-0.63); and for nine comparisons with non-simulation instruction, it was 0.36 (95% CI, -0.06 to 0.78). LIMITATIONS: Focused field in education; high inconsistency (I² > 50% in most analyses). CONCLUSIONS: Simulation-based education was associated with small to moderate patient benefits in comparison with no intervention and non-simulation instruction, although the latter comparison did not reach statistical significance. Unit of analysis errors were common, and validity evidence was infrequently reported.
AB - OBJECTIVES: Evaluating the patient impact of health professions education is a societal priority with many challenges. Researchers would benefit from a summary of topics studied and potential methodological problems. We sought to summarize key information on patient outcomes identified in a comprehensive systematic review of simulation-based instruction. DATA SOURCES: Systematic search of MEDLINE, EMBASE, CINAHL, PsycINFO, Scopus, key journals, and bibliographies of previous reviews through May 2011. STUDY ELIGIBILITY: Original research in any language measuring the direct effects on patients of simulation-based instruction for health professionals, in comparison with no intervention or other instruction. APPRAISAL AND SYNTHESIS: Two reviewers independently abstracted information on learners, topics, study quality including unit of analysis, and validity evidence. We pooled outcomes using random effects. RESULTS: From 10,903 articles screened, we identified 50 studies reporting patient outcomes for at least 3,221 trainees and 16,742 patients. Clinical topics included airway management (14 studies), gastrointestinal endoscopy (12), and central venous catheter insertion (8). There were 31 studies involving postgraduate physicians and seven studies each involving practicing physicians, nurses, and emergency medicine technicians. Fourteen studies (28%) used an appropriate unit of analysis. Measurement validity was supported in seven studies reporting content evidence, three reporting internal structure, and three reporting relations with other variables. The pooled Hedges' g effect size for 33 comparisons with no intervention was 0.47 (95% confidence interval [CI], 0.31-0.63); and for nine comparisons with non-simulation instruction, it was 0.36 (95% CI, -0.06 to 0.78). LIMITATIONS: Focused field in education; high inconsistency (I² > 50% in most analyses). CONCLUSIONS: Simulation-based education was associated with small to moderate patient benefits in comparison with no intervention and non-simulation instruction, although the latter comparison did not reach statistical significance. Unit of analysis errors were common, and validity evidence was infrequently reported.
KW - educational technology
KW - medical education
KW - outcomes research
KW - program evaluation
KW - quantitative research methods
KW - simulation
UR - http://www.scopus.com/inward/record.url?scp=84880509496&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84880509496&partnerID=8YFLogxK
U2 - 10.1007/s11606-012-2264-5
DO - 10.1007/s11606-012-2264-5
M3 - Review article
C2 - 23595919
AN - SCOPUS:84880509496
SN - 0884-8734
VL - 28
SP - 1078
EP - 1089
JO - Journal of General Internal Medicine
JF - Journal of General Internal Medicine
IS - 8
ER -