Software for administering the National Cancer Institute's Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events: Usability study

Martin W. Schoen, Ethan Basch, Lori L. Hudson, Arlene E. Chung, Tito R. Mendoza, Sandra A. Mitchell, Diane St Germain, Paul Baumgartner, Laura Sit, Lauren J. Rogak, Marwan Shouery, Eve Shalley, Bryce B. Reeve, Maria R. Fawzy, Nrupen A. Bhavsar, Charles Cleeland, Deborah Schrag, Amylou Dueck, Amy P. Abernethy

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

Background: The US National Cancer Institute (NCI) developed software to gather symptomatic adverse events directly from patients participating in clinical trials. The software administers surveys to patients using items from the Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events (PRO-CTCAE) through Web-based or automated telephone interfaces and facilitates the management of survey administration and the resultant data by professionals (clinicians and research associates). Objective: The purpose of this study was to iteratively evaluate and improve the usability of the PRO-CTCAE software. Methods: Heuristic evaluation of the software functionality was followed by semiscripted, think-aloud protocols in two consecutive rounds of usability testing among patients with cancer, clinicians, and research associates at 3 cancer centers. We conducted testing with patients both in clinics and at home (remotely) for both the Web-based and telephone interfaces, and we refined the software between rounds and retested. Results: Heuristic evaluation identified deviations from best practices across 10 standardized categories, which informed initial software improvements. Subsequently, we conducted user-based testing among 169 patients and 47 professionals. Software modifications between rounds addressed the identified issues, including difficulty using radio buttons, the absence of survey progress indicators, and login problems (for patients), as well as the scheduling of patient surveys (for professionals). For the patient Web-based interface, the System Usability Scale (SUS) score was 86 before and 82 after modifications (P=.22), while the mean task completion score improved from 4.47 to 4.58 (P=.39). For professional users, the SUS score improved from 71 to 75 (P=.47) after modifications, and the mean task performance score improved significantly, from 4.02 to 4.40 (P=.001).
Conclusions: Software modifications, informed by rigorous assessment, rendered a usable system that is currently used in multiple NCI-sponsored multicenter cancer clinical trials.
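The abstract reports System Usability Scale (SUS) scores on a 0–100 scale. For context, the SUS is a standard 10-item instrument; the sketch below illustrates the conventional scoring formula (a generic illustration, not code from this study — the function name and structure are the editor's own):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Standard SUS scoring: odd-numbered items (positively worded) contribute
    (response - 1); even-numbered items (negatively worded) contribute
    (5 - response). The summed contributions are scaled by 2.5 to map the
    0-40 raw range onto a 0-100 score.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses in the range 1-5")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))  # i=0 is item 1 (odd-numbered)
    return total * 2.5

# Maximally positive responses (5 on odd items, 1 on even items) score 100:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # → 100.0
```

A score of 68 is conventionally treated as average usability, which is why the patient-interface scores of 82–86 and professional-interface scores of 71–75 reported above indicate a usable system.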

Original language: English (US)
Article number: e10070
Journal: Journal of Medical Internet Research
Volume: 20
Issue number: 7
DOI: 10.2196/10070
State: Published - Jul 1 2018


Keywords

  • Adverse events
  • Cancer clinical trials
  • Patient-reported outcomes
  • PRO-CTCAE
  • Symptoms
  • Usability

ASJC Scopus subject areas

  • Health Informatics

Cite this

Software for administering the National Cancer Institute's Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events: Usability study. / Schoen, Martin W.; Basch, Ethan; Hudson, Lori L.; Chung, Arlene E.; Mendoza, Tito R.; Mitchell, Sandra A.; Germain, Diane St; Baumgartner, Paul; Sit, Laura; Rogak, Lauren J.; Shouery, Marwan; Shalley, Eve; Reeve, Bryce B.; Fawzy, Maria R.; Bhavsar, Nrupen A.; Cleeland, Charles; Schrag, Deborah; Dueck, Amylou; Abernethy, Amy P.

In: Journal of Medical Internet Research, Vol. 20, No. 7, e10070, 01.07.2018.


Schoen, MW, Basch, E, Hudson, LL, Chung, AE, Mendoza, TR, Mitchell, SA, Germain, DS, Baumgartner, P, Sit, L, Rogak, LJ, Shouery, M, Shalley, E, Reeve, BB, Fawzy, MR, Bhavsar, NA, Cleeland, C, Schrag, D, Dueck, A & Abernethy, AP 2018, 'Software for administering the National Cancer Institute's Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events: Usability study', Journal of Medical Internet Research, vol. 20, no. 7, e10070. https://doi.org/10.2196/10070
Schoen, Martin W. ; Basch, Ethan ; Hudson, Lori L. ; Chung, Arlene E. ; Mendoza, Tito R. ; Mitchell, Sandra A. ; Germain, Diane St ; Baumgartner, Paul ; Sit, Laura ; Rogak, Lauren J. ; Shouery, Marwan ; Shalley, Eve ; Reeve, Bryce B. ; Fawzy, Maria R. ; Bhavsar, Nrupen A. ; Cleeland, Charles ; Schrag, Deborah ; Dueck, Amylou ; Abernethy, Amy P. / Software for administering the National Cancer Institute's Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events: Usability study. In: Journal of Medical Internet Research. 2018 ; Vol. 20, No. 7.
@article{04b0191d7f454b45a2f698cfde188590,
title = "Software for administering the National Cancer Institute's Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events: Usability study",
keywords = "Adverse events, Cancer clinical trials, Patient-reported outcomes, PRO-CTCAE, Symptoms, Usability",
author = "Schoen, {Martin W.} and Ethan Basch and Hudson, {Lori L.} and Chung, {Arlene E.} and Mendoza, {Tito R.} and Mitchell, {Sandra A.} and Germain, {Diane St} and Paul Baumgartner and Laura Sit and Rogak, {Lauren J.} and Marwan Shouery and Eve Shalley and Reeve, {Bryce B.} and Fawzy, {Maria R.} and Bhavsar, {Nrupen A.} and Charles Cleeland and Deborah Schrag and Amylou Dueck and Abernethy, {Amy P.}",
year = "2018",
month = "7",
day = "1",
doi = "10.2196/10070",
language = "English (US)",
volume = "20",
journal = "Journal of Medical Internet Research",
issn = "1439-4456",
publisher = "Journal of Medical Internet Research",
number = "7",

}

TY - JOUR

T1 - Software for administering the National Cancer Institute's Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events

T2 - Usability study

AU - Schoen, Martin W.

AU - Basch, Ethan

AU - Hudson, Lori L.

AU - Chung, Arlene E.

AU - Mendoza, Tito R.

AU - Mitchell, Sandra A.

AU - Germain, Diane St

AU - Baumgartner, Paul

AU - Sit, Laura

AU - Rogak, Lauren J.

AU - Shouery, Marwan

AU - Shalley, Eve

AU - Reeve, Bryce B.

AU - Fawzy, Maria R.

AU - Bhavsar, Nrupen A.

AU - Cleeland, Charles

AU - Schrag, Deborah

AU - Dueck, Amylou

AU - Abernethy, Amy P.

PY - 2018/7/1

Y1 - 2018/7/1

N2 - Background: The US National Cancer Institute (NCI) developed software to gather symptomatic adverse events directly from patients participating in clinical trials. The software administers surveys to patients using items from the Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events (PRO-CTCAE) through Web-based or automated telephone interfaces and facilitates the management of survey administration and the resultant data by professionals (clinicians and research associates). Objective: The purpose of this study was to iteratively evaluate and improve the usability of the PRO-CTCAE software. Methods: Heuristic evaluation of the software functionality was followed by semiscripted, think-aloud protocols in two consecutive rounds of usability testing among patients with cancer, clinicians, and research associates at 3 cancer centers. We conducted testing with patients both in clinics and at home (remotely) for both the Web-based and telephone interfaces, and we refined the software between rounds and retested. Results: Heuristic evaluation identified deviations from best practices across 10 standardized categories, which informed initial software improvements. Subsequently, we conducted user-based testing among 169 patients and 47 professionals. Software modifications between rounds addressed the identified issues, including difficulty using radio buttons, the absence of survey progress indicators, and login problems (for patients), as well as the scheduling of patient surveys (for professionals). For the patient Web-based interface, the System Usability Scale (SUS) score was 86 before and 82 after modifications (P=.22), while the mean task completion score improved from 4.47 to 4.58 (P=.39). For professional users, the SUS score improved from 71 to 75 (P=.47) after modifications, and the mean task performance score improved significantly, from 4.02 to 4.40 (P=.001). Conclusions: Software modifications, informed by rigorous assessment, rendered a usable system that is currently used in multiple NCI-sponsored multicenter cancer clinical trials.


KW - Adverse events

KW - Cancer clinical trials

KW - Patient-reported outcomes

KW - PRO-CTCAE

KW - Symptoms

KW - Usability

UR - http://www.scopus.com/inward/record.url?scp=85052022525&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85052022525&partnerID=8YFLogxK

U2 - 10.2196/10070

DO - 10.2196/10070

M3 - Article

AN - SCOPUS:85052022525

VL - 20

JO - Journal of Medical Internet Research

JF - Journal of Medical Internet Research

SN - 1439-4456

IS - 7

M1 - e10070

ER -