
      The Cognitive Walkthrough for Implementation Strategies (CWIS): a pragmatic method for assessing implementation strategy usability

      research-article


          Abstract

          Background

          Implementation strategies have flourished in an effort to increase integration of research evidence into clinical practice. Most strategies are complex, socially mediated processes. Many are complicated, expensive, and ultimately impractical to deliver in real-world settings. The field lacks methods to assess the extent to which strategies are usable and aligned with the needs and constraints of the individuals who will deliver or receive them and the contexts in which they will be applied. Drawn from the field of human-centered design, cognitive walkthroughs are an efficient assessment method with the potential to identify aspects of strategies that may inhibit their usability and, ultimately, their effectiveness. This article presents a novel walkthrough methodology for evaluating strategy usability, as well as an example application to a post-training consultation strategy supporting school mental health clinicians in adopting measurement-based care.

          Method

          The Cognitive Walkthrough for Implementation Strategies (CWIS) is a pragmatic, mixed-methods approach for evaluating complex, socially mediated implementation strategies. CWIS includes six steps: (1) determine preconditions; (2) hierarchical task analysis; (3) task prioritization; (4) convert tasks to scenarios; (5) pragmatic group testing; and (6) usability issue identification, classification, and prioritization. A facilitator conducted two group testing sessions with clinician users (N = 10), guiding participants through 6 scenarios and 11 associated subtasks. Clinicians reported their anticipated likelihood of completing each subtask and provided qualitative justifications during group discussion. Following the walkthrough sessions, users completed an adapted quantitative assessment of strategy usability.
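
The six-step procedure and the rating aggregation described above can be sketched in code. This is an illustrative sketch only: the step names come from the abstract, while every function name, field name, and sample value below is a hypothetical assumption, not part of the published CWIS protocol.

```python
# Illustrative sketch of the six CWIS steps and per-subtask rating aggregation.
# Step names are from the abstract; all other identifiers are hypothetical.
from statistics import mean

CWIS_STEPS = [
    "determine preconditions",
    "hierarchical task analysis",
    "task prioritization",
    "convert tasks to scenarios",
    "pragmatic group testing",
    "usability issue identification, classification, and prioritization",
]

def average_anticipated_success(ratings_by_subtask):
    """Average each subtask's anticipated-success ratings across participants."""
    return {task: mean(vals) for task, vals in ratings_by_subtask.items()}

# Example: two hypothetical subtasks rated by three clinicians.
ratings = {
    "schedule consultation call": [4, 5, 3],
    "review client progress data": [2, 4, 3],
}
print(average_anticipated_success(ratings))
```

Averaging within subtasks (rather than only across participants) matches the abstract's observation that anticipated success varied by both participant and subtask.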

          Results

          Average anticipated success ratings indicated substantial variability across participants and subtasks. Usability ratings (scale 0–100) of the consultation protocol averaged 71.3 (SD = 10.6). Twenty-one usability problems were identified via qualitative content analysis with consensus coding, and classified by severity and problem type. High-severity problems included potential misalignment between consultation and clinical service timelines as well as digressions during consultation processes.
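
The two summaries reported above (a mean and standard deviation over 0–100 usability ratings, and a tally of identified problems by severity) can be sketched as follows. The scores, problem labels, and severity categories in this snippet are invented for illustration and are not the study's data.

```python
# Hedged sketch of the two analyses summarized in the Results:
# (1) mean/SD of 0-100 usability ratings, (2) problem counts by severity.
# All values and labels below are illustrative, not taken from the study.
from collections import Counter
from statistics import mean, stdev

def summarize_usability(scores):
    """Return (mean, sample standard deviation) for a list of 0-100 ratings."""
    return round(mean(scores), 1), round(stdev(scores), 1)

def count_by_severity(problems):
    """Tally usability problems by their assigned severity label."""
    return Counter(severity for _, severity in problems)

scores = [62, 75, 80, 68]  # hypothetical per-user usability ratings
problems = [
    ("timeline misalignment", "high"),
    ("consultation digressions", "high"),
    ("unclear session agenda", "low"),
]
print(summarize_usability(scores), count_by_severity(problems))
```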

          Conclusions

          CWIS quantitative usability ratings indicated that the consultation protocol was at the low end of the “acceptable” range (based on norms from the unadapted scale). Collectively, the 21 resulting usability issues explained the quantitative usability data and provided specific direction for usability enhancements. The current study provides preliminary evidence for the utility of CWIS to assess strategy usability and generate a blueprint for redesign.

          Supplementary Information

          The online version contains supplementary material available at 10.1186/s43058-021-00183-0.

          Related collections

          Most cited references: 76

          Three approaches to qualitative content analysis.

          Content analysis is a widely used qualitative research technique. Rather than being a single method, current applications of content analysis show three distinct approaches: conventional, directed, or summative. All three approaches are used to interpret meaning from the content of text data and, hence, adhere to the naturalistic paradigm. The major differences among the approaches are coding schemes, origins of codes, and threats to trustworthiness. In conventional content analysis, coding categories are derived directly from the text data. With a directed approach, analysis starts with a theory or relevant research findings as guidance for initial codes. A summative content analysis involves counting and comparisons, usually of keywords or content, followed by the interpretation of the underlying context. The authors delineate analytic procedures specific to each approach and techniques addressing trustworthiness with hypothetical examples drawn from the area of end-of-life care.

            A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project

            Background

            Identifying, developing, and testing implementation strategies are important goals of implementation science. However, these efforts have been complicated by the use of inconsistent language and inadequate descriptions of implementation strategies in the literature. The Expert Recommendations for Implementing Change (ERIC) study aimed to refine a published compilation of implementation strategy terms and definitions by systematically gathering input from a wide range of stakeholders with expertise in implementation science and clinical practice.

            Methods

            Purposive sampling was used to recruit a panel of experts in implementation and clinical practice who engaged in three rounds of a modified Delphi process to generate consensus on implementation strategies and definitions. The first and second rounds involved Web-based surveys soliciting comments on implementation strategy terms and definitions. After each round, iterative refinements were made based upon participant feedback. The third round involved a live polling and consensus process via a Web-based platform and conference call.

            Results

            Participants identified substantial concerns with 31% of the terms and/or definitions and suggested five additional strategies. Seventy-five percent of definitions from the originally published compilation of strategies were retained after voting. Ultimately, the expert panel reached consensus on a final compilation of 73 implementation strategies.

            Conclusions

            This research advances the field by improving the conceptual clarity, relevance, and comprehensiveness of implementation strategies that can be used in isolation or combination in implementation research and practice. Future phases of ERIC will focus on developing conceptually distinct categories of strategies as well as ratings for each strategy’s importance and feasibility. Next, the expert panel will recommend multifaceted strategies for hypothetical yet real-world scenarios that vary by sites’ endorsement of evidence-based programs and practices and the strength of contextual supports that surround the effort.

            Electronic supplementary material

            The online version of this article (doi:10.1186/s13012-015-0209-1) contains supplementary material, which is available to authorized users.

              Implementation strategies: recommendations for specifying and reporting

              Implementation strategies have unparalleled importance in implementation science, as they constitute the ‘how to’ component of changing healthcare practice. Yet, implementation researchers and other stakeholders are not able to fully utilize the findings of studies focusing on implementation strategies because they are often inconsistently labelled and poorly described, are rarely justified theoretically, lack operational definitions or manuals to guide their use, and are part of ‘packaged’ approaches whose specific elements are poorly understood. We address the challenges of specifying and reporting implementation strategies encountered by researchers who design, conduct, and report research on implementation strategies. Specifically, we propose guidelines for naming, defining, and operationalizing implementation strategies in terms of seven dimensions: actor, the action, action targets, temporality, dose, implementation outcomes addressed, and theoretical justification. Ultimately, implementation strategies cannot be used in practice or tested in research without a full description of their components and how they should be used. As with all intervention research, their descriptions must be precise enough to enable measurement and ‘reproducibility.’ We propose these recommendations to improve the reporting of implementation strategies in research studies and to stimulate further identification of elements pertinent to implementation strategies that should be included in reporting guidelines for implementation strategies.

                Author and article information

                Contributors
                lyona@uw.edu
                jcoifman@uw.edu
                hcookmed@uw.edu
                emcree@uw.edu
                fredaliu@uw.edu
                ludwik01@uw.edu
                dorsey2@uw.edu
                Kelly.Koerner@jasprhealth.com
                smunson@uw.edu
                eliz@uw.edu
                Journal
                Implement Sci Commun (Implementation Science Communications)
                BioMed Central (London)
                ISSN: 2662-2211
                Published: 17 July 2021
                Volume 2, Article 78
                Affiliations
                [1] Department of Psychiatry and Behavioral Sciences, University of Washington, 6200 NE 74th Street, Suite 100, Seattle, WA 98115, USA
                [2] Department of Psychology, University of Washington, 119A Guthrie Hall, Seattle, WA 98195, USA
                [3] Evidence Based Practice Institute, Inc., 929 K Street, Washougal, WA 98671, USA
                [4] Department of Human Centered Design and Engineering, University of Washington, 428 Sieg Building, Seattle, WA 98195, USA
                Author information
                ORCID: http://orcid.org/0000-0003-3657-5060
                Article
                DOI: 10.1186/s43058-021-00183-0
                PMCID: 8285864
                PMID: 34274027
                © The Author(s) 2021

                Open AccessThis article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

                History
                Received: 18 February 2021
                Accepted: 7 July 2021
                Funding
                Funded by: National Institute of Mental Health (FundRef 10.13039/100000025)
                Award IDs: R34MH109605, P50MH115837
                Categories
                Methodology

                Keywords: implementation strategies, human-centered design, usability, cognitive walkthrough
