Software for administering the National Cancer Institute’s patient-reported outcomes version of the common terminology criteria for adverse events: Usability study

Martin W. Schoen, Ethan Basch, Lori L. Hudson, Arlene E. Chung, Tito R. Mendoza, Sandra A. Mitchell, Diane St. Germain, Paul Baumgartner, Laura Sit, Lauren J. Rogak, Marwan Shouery, Eve Shalley, Bryce B. Reeve, Maria R. Fawzy, Nrupen A. Bhavsar, Charles Cleeland, Deborah Schrag, Amylou C. Dueck, Amy P. Abernethy

Research output: Contribution to journal › Article › peer-review


Abstract

Background: The US National Cancer Institute (NCI) developed software to gather symptomatic adverse events directly from patients participating in clinical trials. The software administers surveys to patients using items from the Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events (PRO-CTCAE) through Web-based or automated telephone interfaces, and it facilitates the management of survey administration and the resultant data by professionals (clinicians and research associates).

Objective: The purpose of this study was to iteratively evaluate and improve the usability of the PRO-CTCAE software.

Methods: Heuristic evaluation of the software functionality was followed by semiscripted, think-aloud protocols in two consecutive rounds of usability testing among patients with cancer, clinicians, and research associates at three cancer centers. We conducted testing with patients both in clinics and at home (remotely) for both the Web-based and telephone interfaces. We refined the software between rounds and then retested it.

Results: Heuristic evaluation identified deviations from best practices across 10 standardized categories, which informed initial software improvement. Subsequently, we conducted user-based testing among 169 patients and 47 professionals. Software modifications between rounds addressed the issues identified, including difficulty using radio buttons, absence of survey progress indicators, and login problems (for patients), as well as scheduling of patient surveys (for professionals). For the patient Web-based interface, the System Usability Scale (SUS) score was 86 before and 82 after modifications (P=.22), and the mean task completion score improved from 4.47 to 4.58 (P=.39). For professional users, the SUS score improved from 71 to 75 (P=.47) following modifications, and mean task performance improved significantly (4.40 vs 4.02; P=.001).

Conclusions: Software modifications, informed by rigorous assessment, rendered a usable system that is currently used in multiple NCI-sponsored multicenter cancer clinical trials.
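For context on the 0–100 scores reported above, the standard SUS scoring formula (Brooke's published method, not code from the study's software) can be sketched as follows:

```python
def sus_score(responses):
    """Convert ten 1-5 Likert responses into a 0-100 SUS score.

    Odd-numbered items (1st, 3rd, ...) are positively worded, so each
    contributes (response - 1); even-numbered items are negatively
    worded, so each contributes (5 - response). The summed contributions
    (range 0-40) are multiplied by 2.5 to yield a 0-100 score.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# A uniformly favorable respondent (5 on positive items, 1 on negative)
# scores the maximum of 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # → 100.0
```

Group means such as the 86 and 82 reported for the patient interface are simply averages of these per-respondent scores.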

Original language: English (US)
Article number: e10070
Journal: JMIR Human Factors
Volume: 5
Issue number: 3
State: Published - Jul 1 2018

Keywords

  • Adverse events
  • Cancer clinical trials
  • PRO-CTCAE
  • Patient-reported outcomes
  • Symptoms
  • Usability

ASJC Scopus subject areas

  • Human Factors and Ergonomics
  • Health Informatics
