How much evidence does it take? A cumulative meta-analysis of outcomes of simulation-based education

Research output: Contribution to journal › Article › peer-review

73 Scopus citations

Abstract

Context: Studies that investigate research questions that have already been resolved represent a waste of resources. However, the failure to collect sufficient evidence to resolve a given question results in ambiguity.

Objectives: The present study was conducted to reanalyse the results of a meta-analysis of simulation-based education (SBE) to determine: (i) whether researchers continue to replicate research studies after the answer to a research question has become known, and (ii) whether researchers perform enough replications to definitively answer important questions.

Methods: A systematic search of multiple databases to May 2011 was conducted to identify original research evaluating SBE for health professionals in comparison with no intervention or any active intervention, using skill outcomes. Data were extracted by reviewers working in duplicate. Data synthesis involved a cumulative meta-analysis to illuminate patterns of evidence by sequentially adding studies according to a variable of interest (e.g. publication year) and recalculating the pooled effect size with each addition. Cumulative meta-analysis by publication year was applied to 592 comparative studies using several thresholds of 'sufficiency', including: statistical significance; stable effect size classification and magnitude (Hedges' g ± 0.1); and precise estimates (confidence intervals of less than ± 0.2).

Results: Among studies that compared the outcomes of SBE with those of no intervention, evidence supporting a favourable effect of SBE on skills existed as early as 1973 (one publication), and further evidence confirmed a quantitatively large effect of SBE by 1997 (28 studies). Since then, a further 404 studies have been published. Among studies comparing SBE with non-simulation instruction, the effect initially favoured non-simulation training, but the addition of a third study in 1997 brought the pooled effect to slightly favour simulation, and by 2004 (14 studies) this effect was statistically significant (p < 0.05) and its magnitude had stabilised (small effect). A further 37 studies were published after 2004. By contrast, evidence from studies evaluating repetition continued to show borderline statistical significance and wide confidence intervals in 2011.

Conclusions: Some replication is necessary to obtain stable estimates of effect and to explore different contexts, but the number of studies of SBE often exceeds the minimum number of replications required.
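The cumulative procedure described in the Methods section is straightforward to sketch in code. The following is a minimal illustration, not the authors' actual analysis: it assumes a DerSimonian-Laird random-effects model and entirely hypothetical study-level data (year, Hedges' g, variance of g), and it checks two of the abstract's 'sufficiency' thresholds (a 95% confidence interval excluding zero, and a CI half-width below 0.2) as each study is added in publication-year order.

```python
# Minimal cumulative meta-analysis sketch (hypothetical data, not from the paper).
import math

# (year, hedges_g, variance_of_g) -- hypothetical study-level inputs
studies = [
    (1995, 0.35, 0.040),
    (1997, 0.62, 0.055),
    (1999, 0.48, 0.030),
    (2002, 0.55, 0.025),
    (2005, 0.50, 0.020),
]
studies.sort(key=lambda s: s[0])  # cumulate in publication-year order

def pooled_random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate and its variance."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    # Between-study variance tau^2 (zero when only one study is available)
    if df > 0:
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - df) / c)
    else:
        tau2 = 0.0
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    return pooled, 1.0 / sum(w_star)

# Re-pool after each added study and test the sufficiency thresholds.
for k in range(1, len(studies) + 1):
    years, gs, vs = zip(*studies[:k])
    g, var = pooled_random_effects(gs, vs)
    half_width = 1.96 * math.sqrt(var)
    flags = []
    if g - half_width > 0:
        flags.append("significant (CI excludes 0)")
    if half_width < 0.2:
        flags.append("precise (CI half-width < 0.2)")
    print(f"through {years[-1]} (k={k}): g = {g:.2f} "
          f"[{g - half_width:.2f}, {g + half_width:.2f}] "
          + ("; ".join(flags)))
```

In this sketch, once both flags appear and the pooled g stops moving by more than ±0.1 with new additions, further studies asking the same question would add little, which is the paper's notion of evidence beyond sufficiency.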

Original language: English (US)
Pages (from-to): 750-760
Number of pages: 11
Journal: Medical Education
Volume: 48
Issue number: 8
State: Published - Aug 2014

ASJC Scopus subject areas

  • Education
