Got power? A systematic review of sample size adequacy in health professions education research

David A. Cook, Rose Hatala

Research output: Contribution to journal › Article › peer-review

23 Scopus citations

Abstract

Many education research studies employ small samples, which in turn lowers statistical power. We re-analyzed the results of a meta-analysis of simulation-based education to determine study power across a range of effect sizes, and the smallest effect that each study could plausibly exclude. We systematically searched multiple databases through May 2011 and included all studies evaluating simulation-based education for health professionals in comparison with no intervention or with another simulation intervention. Reviewers working in duplicate abstracted information to calculate standardized mean differences (SMDs). We included 897 original research studies. Among the 627 no-intervention-comparison studies, the median sample size was 25. Only two studies (0.3 %) had ≥80 % power to detect a small difference (SMD > 0.2 standard deviations), and 136 (22 %) had power to detect a large difference (SMD > 0.8). Of the 110 no-intervention-comparison studies that failed to find a statistically significant difference, none excluded a small difference and only 47 (43 %) excluded a large difference. Among the 297 studies comparing alternate simulation approaches, the median sample size was 30. Only one study (0.3 %) had ≥80 % power to detect a small difference, and 79 (27 %) had power to detect a large difference. Of the 128 studies that did not detect a statistically significant effect, 4 (3 %) excluded a small difference and 91 (71 %) excluded a large difference. In conclusion, most education research studies are powered only to detect effects of large magnitude. For most studies that do not reach statistical significance, the possibility of a large and important difference remains.
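To illustrate the kind of power calculation the review describes, the sketch below estimates the power of a two-sided, two-sample comparison for a given standardized mean difference at the median sample size reported for no-intervention-comparison studies (25 participants, split here as roughly 12 and 13 per arm — an assumption, since the abstract does not give arm sizes). It uses a normal approximation rather than the exact noncentral-t distribution, and the function name is illustrative, not from the paper.

```python
from math import sqrt
from statistics import NormalDist

def power_two_sample(d, n1, n2, alpha=0.05):
    """Approximate power of a two-sided, two-sample test of a
    standardized mean difference d (normal approximation)."""
    z = NormalDist()
    # Noncentrality: expected z-statistic under the alternative
    delta = d * sqrt(n1 * n2 / (n1 + n2))
    # Critical value for a two-sided test at level alpha
    z_crit = z.inv_cdf(1 - alpha / 2)
    # Probability the observed statistic exceeds the critical value
    return z.cdf(delta - z_crit)

# Median study (~25 participants): power for small vs. large SMD
print(power_two_sample(0.2, 12, 13))  # small effect: well under 10 %
print(power_two_sample(0.8, 12, 13))  # large effect: only about 50 %
```

Consistent with the abstract's findings, a study of this size has almost no chance of detecting a small effect (SMD = 0.2) and roughly even odds for a large one (SMD = 0.8); detecting a small effect with 80 % power would require several hundred participants.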

Original language: English (US)
Pages (from-to): 73-83
Number of pages: 11
Journal: Advances in Health Sciences Education
Volume: 20
Issue number: 1
State: Published - Mar 2014

Keywords

  • Cohen’s d
  • Comparative effectiveness research
  • Data interpretation, statistical
  • Medical education
  • Noninferiority trials
  • Research design

ASJC Scopus subject areas

  • Education
