
      Software for Administering the National Cancer Institute’s Patient-Reported Outcomes Version of the Common Terminology Criteria for Adverse Events: Usability Study

      Research article
      Martin W Schoen, Ethan Basch, Lori L Hudson, Arlene E Chung, Tito R Mendoza, Sandra A Mitchell, Diane St. Germain, Paul Baumgartner, Laura Sit, Lauren J Rogak, Marwan Shouery, Eve Shalley, Bryce B Reeve, Maria R Fawzy, Nrupen A Bhavsar, Charles Cleeland, Deborah Schrag, Amylou C Dueck, Amy P Abernethy
      JMIR Human Factors
      JMIR Publications
      usability, patient-reported outcomes, symptoms, adverse events, PRO-CTCAE, cancer clinical trials


          Abstract

          Background

          The US National Cancer Institute (NCI) developed software to gather symptomatic adverse events directly from patients participating in clinical trials. The software administers surveys to patients using items from the Patient-Reported Outcomes version of the Common Terminology Criteria for Adverse Events (PRO-CTCAE) through Web-based or automated telephone interfaces and facilitates the management of survey administration and the resultant data by professionals (clinicians and research associates).
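          For illustration only, the sketch below models one PRO-CTCAE-style severity item and presents it on a console. The names (SurveyItem, administer, SEVERITY_LABELS) are hypothetical and are not the NCI software's actual API; the real system delivers equivalent content through Web-based and automated telephone interfaces. The 0-4 scoring and 7-day recall reflect the published PRO-CTCAE item design.

```python
# Hypothetical sketch: representing and administering one PRO-CTCAE-style item.
# Class and function names are illustrative, not taken from the NCI software.

from dataclasses import dataclass

# Published PRO-CTCAE items use 5-point verbal descriptor scales scored 0-4
# with a 7-day recall period; severity labels are shown here.
SEVERITY_LABELS = ["None", "Mild", "Moderate", "Severe", "Very severe"]

@dataclass
class SurveyItem:
    symptom: str     # e.g., "Nausea"
    attribute: str   # e.g., "severity", "frequency", or "interference"
    prompt: str      # text shown on the Web page or read by the telephone system

def administer(item: SurveyItem, labels: list[str]) -> int:
    """Present one item on a text console and return its 0-4 score.
    The real system renders the same content via Web and automated
    telephone (IVR) interfaces rather than a console."""
    print(item.prompt)
    for score, label in enumerate(labels):
        print(f"  {score}: {label}")
    while True:
        raw = input("Enter the number of your answer: ").strip()
        if raw.isdigit() and 0 <= int(raw) < len(labels):
            return int(raw)
        print("Please enter one of the numbers shown above.")

if __name__ == "__main__":
    nausea = SurveyItem(
        symptom="Nausea",
        attribute="severity",
        prompt="In the last 7 days, what was the SEVERITY of your NAUSEA at its WORST?",
    )
    print(f"Recorded score: {administer(nausea, SEVERITY_LABELS)}")
```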

          Objective

          The purpose of this study was to iteratively evaluate and improve the usability of the PRO-CTCAE software.

          Methods

          Heuristic evaluation of the software functionality was followed by semiscripted, think-aloud protocols in two consecutive rounds of usability testing among patients with cancer, clinicians, and research associates at 3 cancer centers. We conducted testing with patients both in clinics and at home (remotely) for both Web-based and telephone interfaces. The software was refined between rounds and retested.

          Results

          Heuristic evaluation identified deviations from best practices across 10 standardized categories, which informed initial software improvements. Subsequently, we conducted user-based testing with 169 patients and 47 professionals. Software modifications between rounds addressed identified issues, including difficulty using radio buttons, absence of survey progress indicators, and login problems (for patients), as well as scheduling of patient surveys (for professionals). For the patient Web-based interface, the System Usability Scale (SUS) score was 86 before and 82 after modifications (P=.22), and the mean task completion score improved from 4.47 to 4.58 (P=.39). For professional users, the SUS score improved from 71 to 75 (P=.47) after modifications, and mean task performance improved significantly (4.40 vs 4.02; P=.001).
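          For readers unfamiliar with the metric above, the following sketch shows how a standard System Usability Scale score is computed from the 10 SUS items and how a before/after comparison might be run. The respondent data and the choice of a two-sample t-test are illustrative assumptions, not the study's data or its statistical analysis.

```python
# Standard SUS scoring: 10 items rated 1-5; odd-numbered items contribute
# (response - 1), even-numbered items contribute (5 - response); the sum is
# multiplied by 2.5 to yield a 0-100 score.
from statistics import mean
from scipy import stats  # used only for the illustrative significance test

def sus_score(responses):
    """responses: 10 integers (1-5), in item order 1..10."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # 0-based even index = odd-numbered item
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical respondents before and after software modifications
# (made-up numbers, not data from the study).
before = [sus_score(r) for r in ([5, 2, 4, 1, 5, 2, 5, 1, 4, 2],
                                 [4, 2, 4, 2, 4, 1, 5, 2, 4, 1])]
after = [sus_score(r) for r in ([5, 1, 5, 1, 5, 1, 5, 2, 5, 1],
                                [4, 2, 5, 1, 4, 2, 5, 1, 4, 2])]

print(f"Mean SUS before: {mean(before):.1f}, after: {mean(after):.1f}")
t, p = stats.ttest_ind(before, after)  # example comparison only
print(f"Two-sample t-test: t={t:.2f}, P={p:.3f}")
```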

          Conclusions

          Software modifications, informed by rigorous assessment, rendered a usable system, which is currently used in multiple NCI-sponsored multicenter cancer clinical trials.

          Trial Registration

          ClinicalTrials.gov NCT01031641; https://clinicaltrials.gov/ct2/show/NCT01031641 (Archived by WebCite at http://www.webcitation.org/708hTjlTl)


                Author and article information

                Contributors
                Journal
                JMIR Hum Factors
                JMIR Human Factors
                JMIR Publications (Toronto, Canada )
                2292-9495
                Jul-Sep 2018
                16 July 2018
                Volume 5, Issue 3: e10070
                Affiliations
                1 Division of Hematology and Medical Oncology, Department of Internal Medicine, Saint Louis University School of Medicine, Saint Louis, MO, United States
                2 Division of Hematology/Oncology, Department of Medicine, University of North Carolina School of Medicine, Chapel Hill, NC, United States
                3 Department of Epidemiology and Biostatistics, Memorial Sloan Kettering Cancer Center, New York, NY, United States
                4 Lineberger Comprehensive Cancer Center, University of North Carolina, Chapel Hill, NC, United States
                5 Department of Health Policy and Management, Gillings School of Public Health, University of North Carolina, Chapel Hill, NC, United States
                6 Duke Clinical Research Institute, Duke University, Durham, NC, United States
                7 Division of General Medicine and Clinical Epidemiology, Department of Medicine, University of North Carolina School of Medicine, Chapel Hill, NC, United States
                8 Division of General Pediatrics & Adolescent Medicine, Department of Pediatrics, Program on Health & Clinical Informatics, University of North Carolina School of Medicine, Chapel Hill, NC, United States
                9 Department of Symptom Research, The University of Texas MD Anderson Cancer Center, Houston, TX, United States
                10 Division of Cancer Control and Population Sciences, National Cancer Institute, Rockville, MD, United States
                11 Division of Cancer Prevention, National Cancer Institute, Rockville, MD, United States
                12 SemanticBits, Herndon, VA, United States
                13 Center for Biomedical Informatics and Information Technology, National Cancer Institute, Rockville, MD, United States
                14 Duke Cancer Institute, Durham, NC, United States
                15 FHI 360, Durham, NC, United States
                16 Division of General Internal Medicine, Duke University School of Medicine, Durham, NC, United States
                17 Division of Population Sciences, Dana-Farber Cancer Institute, Boston, MA, United States
                18 Alliance Statistics and Data Center, Mayo Clinic, Scottsdale, AZ, United States
                19 Flatiron Health, New York, NY, United States
                Author notes
                Corresponding Author: Ethan Basch, ebasch@med.unc.edu
                Author information
                http://orcid.org/0000-0001-6388-5553
                http://orcid.org/0000-0003-3813-9318
                http://orcid.org/0000-0002-2439-7351
                http://orcid.org/0000-0002-4821-0256
                http://orcid.org/0000-0001-8122-5233
                http://orcid.org/0000-0002-4153-9972
                http://orcid.org/0000-0003-0399-5579
                http://orcid.org/0000-0002-1188-9648
                http://orcid.org/0000-0002-4950-1434
                http://orcid.org/0000-0002-6415-1575
                http://orcid.org/0000-0003-0166-651X
                http://orcid.org/0000-0002-1456-3600
                http://orcid.org/0000-0002-6709-8714
                http://orcid.org/0000-0003-4387-4800
                http://orcid.org/0000-0002-9937-5560
                http://orcid.org/0000-0002-1460-6527
                http://orcid.org/0000-0002-4334-5717
                http://orcid.org/0000-0002-9912-1085
                http://orcid.org/0000-0001-6930-8722
                Article
                v5i3e10070
                DOI: 10.2196/10070
                PMCID: PMC6066634
                PMID: 30012546
                ©Martin W Schoen, Ethan Basch, Lori L Hudson, Arlene E Chung, Tito R Mendoza, Sandra A Mitchell, Diane St. Germain, Paul Baumgartner, Laura Sit, Lauren J Rogak, Marwan Shouery, Eve Shalley, Bryce B Reeve, Maria R Fawzy, Nrupen A Bhavsar, Charles Cleeland, Deborah Schrag, Amylou C Dueck, Amy P Abernethy. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 16.07.2018.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Human Factors, is properly cited. The complete bibliographic information, a link to the original publication on http://humanfactors.jmir.org, as well as this copyright and license information, must be included.

                History
                12 February 2018
                29 March 2018
                26 April 2018
                8 May 2018
                Categories
                Original Paper

