What counts as validity evidence? Examples and prevalence in a systematic review of simulation-based assessment

David A. Cook, Benjamin Zendejas, Stanley J. Hamstra, Rose Hatala, Ryan Brydges

Research output: Contribution to journal › Review article

121 Scopus citations

Abstract

Ongoing transformations in health professions education underscore the need for valid and reliable assessment. The current standard for assessment validation requires evidence from five sources: content, response process, internal structure, relations with other variables, and consequences. However, researchers remain uncertain regarding the types of data that contribute to each evidence source. We sought to enumerate the validity evidence sources and supporting data elements for assessments using technology-enhanced simulation. We conducted a systematic literature search, including MEDLINE, ERIC, and Scopus, through May 2011. We included original research that evaluated the validity of simulation-based assessment scores using two or more evidence sources. Working in duplicate, we abstracted information on the prevalence of each evidence source and the underlying data elements. Among 217 eligible studies, only six (3%) referenced the five-source framework, and 51 (24%) made no reference to any validity framework. The most common evidence sources and data elements were: relations with other variables (94% of studies; reported most often as variation in simulator scores across training levels), internal structure (76%; supported by reliability data or item analysis), and content (63%; reported as expert panels or modification of existing instruments). Evidence of response process and of consequences was each present in <10% of studies. We conclude that relations with training level appear to be overrepresented in this field, while evidence of consequences and response process is infrequently reported. Validation science will be improved as educators use established frameworks to collect and interpret evidence from the full spectrum of possible sources and elements.

Original language: English (US)
Pages (from-to): 233-250
Number of pages: 18
Journal: Advances in Health Sciences Education
Volume: 19
Issue number: 2
DOIs
State: Published - May 2014

Keywords

  • Assessment
  • Educational technology
  • Evaluation
  • Medical education
  • Methods quantitative
  • Reliability
  • Reporting quality
  • Simulation
  • Validity

ASJC Scopus subject areas

  • Education

