TY - JOUR
T1 - Technology-enhanced simulation to assess health professionals
T2 - A systematic review of validity evidence, research methods, and reporting quality
AU - Cook, David A.
AU - Brydges, Ryan
AU - Zendejas, Benjamin
AU - Hamstra, Stanley J.
AU - Hatala, Rose
N1 - Copyright:
Copyright 2017 Elsevier B.V., All rights reserved.
PY - 2013/6
Y1 - 2013/6
AB - Purpose: To summarize the tool characteristics, sources of validity evidence, methodological quality, and reporting quality for studies of technology-enhanced simulation-based assessments for health professions learners. Method: The authors conducted a systematic review, searching MEDLINE, EMBASE, CINAHL, ERIC, PsycINFO, Scopus, key journals, and previous reviews through May 2011. They selected original research in any language evaluating simulation-based assessment of practicing and student physicians, nurses, and other health professionals. Reviewers working in duplicate evaluated validity evidence using Messick's five-source framework; methodological quality using the Medical Education Research Study Quality Instrument and the revised Quality Assessment of Diagnostic Accuracy Studies; and reporting quality using the Standards for Reporting Diagnostic Accuracy and Guidelines for Reporting Reliability and Agreement Studies. Results: Of 417 studies, 350 (84%) involved physicians at some stage in training. Most focused on procedural skills, including minimally invasive surgery (N = 142), open surgery (81), and endoscopy (67). Common elements of validity evidence included relations with trainee experience (N = 306), content (142), relations with other measures (128), and interrater reliability (124). Of the 217 studies reporting more than one element of evidence, most were judged as having high or unclear risk of bias due to selective sampling (N = 192) or test procedures (132). Only 64% proposed a plan for interpreting the evidence to be presented (validity argument). Conclusions: Validity evidence for simulation-based assessments is sparse and is concentrated within specific specialties, tools, and sources of validity evidence. The methodological and reporting quality of assessment studies leaves much room for improvement.
UR - http://www.scopus.com/inward/record.url?scp=84878950792&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84878950792&partnerID=8YFLogxK
U2 - 10.1097/ACM.0b013e31828ffdcf
DO - 10.1097/ACM.0b013e31828ffdcf
M3 - Review article
C2 - 23619073
AN - SCOPUS:84878950792
SN - 1040-2446
VL - 88
SP - 872
EP - 883
JO - Academic Medicine
JF - Academic Medicine
IS - 6
ER -