A randomized trial provided new evidence on the accuracy and efficiency of traditional vs. electronically annotated abstraction approaches in systematic reviews

Tianjing Li, Ian J. Saldanha, Jens Jap, Bryant T. Smith, Joseph Canner, Susan M. Hutfless, Vernal Branch, Simona Carini, Wiley Chan, Berry de Bruijn, Byron C. Wallace, Sandra A. Walsh, Elizabeth J. Whamond, Mohammad H. Murad, Ida Sim, Jesse A. Berlin, Joseph Lau, Kay Dickersin, Christopher H. Schmid

Research output: Contribution to journal › Article


Abstract

Objectives: The Data Abstraction Assistant (DAA) is software that links items abstracted into a data collection form for a systematic review to their locations in a study report. We conducted a randomized cross-over trial comparing DAA-facilitated single data abstraction plus verification ("DAA verification"), single data abstraction plus verification ("regular verification"), and independent dual data abstraction plus adjudication ("independent abstraction").

Study Design and Setting: This study was an online randomized cross-over trial with 26 pairs of data abstractors. Each pair abstracted data from six articles, two per approach. Outcomes were the proportion of errors and the time taken.

Results: The overall proportion of errors was 17% for DAA verification, 16% for regular verification, and 15% for independent abstraction. DAA verification was associated with higher odds of errors than regular verification (adjusted odds ratio [OR] = 1.08; 95% confidence interval [CI]: 0.99–1.17) or independent abstraction (adjusted OR = 1.12; 95% CI: 1.03–1.22). For each article, DAA verification took 20 minutes (95% CI: 1–40) longer than regular verification but 46 minutes (95% CI: 26–66) less than independent abstraction.

Conclusion: Independent abstraction may be necessary only for complex data items. DAA provides an audit trail that is crucial for reproducible research.

Original language: English (US)
Pages (from-to): 77-89
Number of pages: 13
Journal: Journal of Clinical Epidemiology
Volume: 115
DOI: 10.1016/j.jclinepi.2019.07.005
State: Published - Nov 1 2019


Keywords

  • Accuracy
  • Data abstraction
  • Efficiency
  • Randomized cross-over trial
  • Software application
  • Systematic review

ASJC Scopus subject areas

  • Epidemiology

Cite this

Li, T., Saldanha, I. J., Jap, J., Smith, B. T., Canner, J., Hutfless, S. M., Branch, V., Carini, S., Chan, W., de Bruijn, B., Wallace, B. C., Walsh, S. A., Whamond, E. J., Murad, M. H., Sim, I., Berlin, J. A., Lau, J., Dickersin, K., & Schmid, C. H. (2019). A randomized trial provided new evidence on the accuracy and efficiency of traditional vs. electronically annotated abstraction approaches in systematic reviews. Journal of Clinical Epidemiology, 115, 77-89. https://doi.org/10.1016/j.jclinepi.2019.07.005