A randomized trial provided new evidence on the accuracy and efficiency of traditional vs. electronically annotated abstraction approaches in systematic reviews

Tianjing Li, Ian J. Saldanha, Jens Jap, Bryant T. Smith, Joseph Canner, Susan M. Hutfless, Vernal Branch, Simona Carini, Wiley Chan, Berry de Bruijn, Byron C. Wallace, Sandra A. Walsh, Elizabeth J. Whamond, Mohammad H. Murad, Ida Sim, Jesse A. Berlin, Joseph Lau, Kay Dickersin, Christopher H. Schmid

Research output: Contribution to journal › Article

Abstract

Objectives: Data Abstraction Assistant (DAA) is a software application for linking items abstracted into a data collection form for a systematic review to their locations in a study report. We conducted a randomized cross-over trial that compared DAA-facilitated single data abstraction plus verification (“DAA verification”), single data abstraction plus verification (“regular verification”), and independent dual data abstraction plus adjudication (“independent abstraction”). Study Design and Setting: This study was an online randomized cross-over trial with 26 pairs of data abstractors. Each pair abstracted data from six articles, two per approach. Outcomes were the proportion of errors and the time taken. Results: The overall proportion of errors was 17% for DAA verification, 16% for regular verification, and 15% for independent abstraction. DAA verification was associated with higher odds of errors than regular verification (adjusted odds ratio [OR] = 1.08; 95% confidence interval [CI]: 0.99–1.17) or independent abstraction (adjusted OR = 1.12; 95% CI: 1.03–1.22). For each article, DAA verification took 20 minutes (95% CI: 1–40) longer than regular verification but 46 minutes (95% CI: 26–66) shorter than independent abstraction. Conclusion: Independent abstraction may be necessary only for complex data items. DAA provides an audit trail that is crucial for reproducible research.
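For readers unfamiliar with the odds-ratio arithmetic the abstract reports, the textbook unadjusted calculation from a 2×2 table can be sketched as below. Note the counts used here are hypothetical illustrations, not the trial's data, and the paper itself reports adjusted odds ratios from a regression model; this is only the simple Wald-interval version.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
    a = errors in group 1,  b = correct items in group 1,
    c = errors in group 2,  d = correct items in group 2.
    """
    or_ = (a * d) / (b * c)                       # cross-product ratio
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts: ~17% vs ~16% error proportions
or_, lower, upper = odds_ratio_ci(170, 830, 160, 840)
```

With these made-up counts the point estimate is close to 1 and the interval crosses 1, mirroring how the abstract's OR of 1.08 (95% CI: 0.99–1.17) is read as a small, borderline difference.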

Original language: English (US)
Pages (from-to): 77-89
Number of pages: 13
Journal: Journal of Clinical Epidemiology
Volume: 115
DOI: 10.1016/j.jclinepi.2019.07.005
State: Published - Nov 1 2019


Keywords

  • Accuracy
  • Data abstraction
  • Efficiency
  • Randomized cross-over trial
  • Software application
  • Systematic review

ASJC Scopus subject areas

  • Epidemiology

Cite this

Li T, Saldanha IJ, Jap J, Smith BT, Canner J, Hutfless SM, Branch V, Carini S, Chan W, de Bruijn B, Wallace BC, Walsh SA, Whamond EJ, Murad MH, Sim I, Berlin JA, Lau J, Dickersin K, Schmid CH. A randomized trial provided new evidence on the accuracy and efficiency of traditional vs. electronically annotated abstraction approaches in systematic reviews. Journal of Clinical Epidemiology. 2019;115:77-89. https://doi.org/10.1016/j.jclinepi.2019.07.005

