
      FAST: A Framework to Assess Speed of Translation of Health Innovations to Practice and Policy

      editorial


          Abstract

          The 17-year time span between discovery and application of evidence in practice has become a unifying challenge for implementation science and translational science more broadly. Further, global pandemics and social crises demand timely implementation of rapidly accruing evidence to reduce morbidity and mortality. Yet speed remains an understudied metric in implementation science. Prevailing evaluations of implementation lack a temporal aspect, and current approaches have not yielded rapid implementation. In this paper, we address speed as an important conceptual and methodological gap in implementation science. We aim to untangle the complexities of studying implementation speed, offer a framework to assess speed of translation (FAST), and provide guidance to measure speed in evaluating implementation. To facilitate specification and reporting on metrics of speed, we encourage consideration of stakeholder perspectives (e.g., comparison of varying priorities), referents (e.g., speed in attaining outcomes, transitioning between implementation phases), and observation windows (e.g., time from intervention development to first patient treated) in its measurement. The FAST framework identifies factors that may influence speed of implementation and potential effects of implementation speed. We propose a research agenda to advance understanding of the pace of implementation, including identifying accelerators and inhibitors to speed.
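To make the measurement guidance above concrete, here is a minimal sketch in Python, assuming a simple record of one speed observation. The SpeedObservation class, its field names, and the example dates are illustrative assumptions for this record page, not part of the FAST framework or the authors' tooling.

from dataclasses import dataclass
from datetime import date

@dataclass
class SpeedObservation:
    """One speed measurement, specified per the abstract's guidance:
    whose perspective, which referent, and which observation window."""
    perspective: str      # e.g., "health system", "patient", "policymaker"
    referent: str         # e.g., "transition between implementation phases"
    window_start: date    # e.g., date intervention development was completed
    window_end: date      # e.g., date the first patient was treated

    def elapsed_days(self) -> int:
        """Speed metric expressed as elapsed time across the observation window."""
        return (self.window_end - self.window_start).days

# Hypothetical example: time from intervention development to first patient treated.
obs = SpeedObservation(
    perspective="health system",
    referent="intervention development to first patient treated",
    window_start=date(2020, 1, 15),
    window_end=date(2021, 6, 1),
)
print(obs.elapsed_days())  # -> 503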


          Most cited references (79)


          Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science

          Background: Many interventions found to be effective in health services research studies fail to translate into meaningful patient care outcomes across multiple contexts. Health services researchers recognize the need to evaluate not only summative outcomes but also formative outcomes to assess the extent to which implementation is effective in a specific setting, prolongs sustainability, and promotes dissemination into other settings. Many implementation theories have been published to help promote effective implementation. However, they overlap considerably in the constructs included in individual theories, and a comparison of theories reveals that each is missing important constructs included in other theories. In addition, terminology and definitions are not consistent across theories. We describe the Consolidated Framework for Implementation Research (CFIR), which offers an overarching typology to promote implementation theory development and verification about what works where and why across multiple contexts.

          Methods: We used a snowball sampling approach to identify published theories, which were evaluated to identify constructs based on strength of conceptual or empirical support for influence on implementation, consistency in definitions, alignment with our own findings, and potential for measurement. We combined constructs across published theories that had different labels but were redundant or overlapping in definition, and we parsed apart constructs that conflated underlying concepts.

          Results: The CFIR is composed of five major domains: intervention characteristics, outer setting, inner setting, characteristics of the individuals involved, and the process of implementation. Eight constructs were identified related to the intervention (e.g., evidence strength and quality), four related to the outer setting (e.g., patient needs and resources), 12 related to the inner setting (e.g., culture, leadership engagement), five related to individual characteristics, and eight related to the process (e.g., plan, evaluate, and reflect). We present explicit definitions for each construct.

          Conclusion: The CFIR provides a pragmatic structure for approaching complex, interacting, multi-level, and transient states of constructs in the real world by embracing, consolidating, and unifying key constructs from published implementation theories. It can be used to guide formative evaluations and build the implementation knowledge base across multiple studies and settings.
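          As a reading aid only, the five domains and the example constructs named in this abstract can be laid out as a small data structure. This is a hypothetical sketch for orientation; the dictionary layout and variable names are assumptions, not an official CFIR artifact.

          # Illustrative sketch: the five CFIR domains and the example constructs
          # mentioned in the abstract above, keyed by domain.
          cfir_domains = {
              "intervention characteristics":   {"construct_count": 8,  "examples": ["evidence strength and quality"]},
              "outer setting":                  {"construct_count": 4,  "examples": ["patient needs and resources"]},
              "inner setting":                  {"construct_count": 12, "examples": ["culture", "leadership engagement"]},
              "characteristics of individuals": {"construct_count": 5,  "examples": []},  # no examples given in the abstract
              "process":                        {"construct_count": 8,  "examples": ["plan", "evaluate", "reflect"]},
          }

          # e.g., list domains by how many constructs each contains
          for domain, info in sorted(cfir_domains.items(), key=lambda kv: -kv[1]["construct_count"]):
              print(f"{domain}: {info['construct_count']} constructs")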

            A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project

            Background: Identifying, developing, and testing implementation strategies are important goals of implementation science. However, these efforts have been complicated by the use of inconsistent language and inadequate descriptions of implementation strategies in the literature. The Expert Recommendations for Implementing Change (ERIC) study aimed to refine a published compilation of implementation strategy terms and definitions by systematically gathering input from a wide range of stakeholders with expertise in implementation science and clinical practice.

            Methods: Purposive sampling was used to recruit a panel of experts in implementation and clinical practice who engaged in three rounds of a modified Delphi process to generate consensus on implementation strategies and definitions. The first and second rounds involved Web-based surveys soliciting comments on implementation strategy terms and definitions. After each round, iterative refinements were made based on participant feedback. The third round involved a live polling and consensus process via a Web-based platform and conference call.

            Results: Participants identified substantial concerns with 31% of the terms and/or definitions and suggested five additional strategies. Seventy-five percent of the definitions from the originally published compilation were retained after voting. Ultimately, the expert panel reached consensus on a final compilation of 73 implementation strategies.

            Conclusions: This research advances the field by improving the conceptual clarity, relevance, and comprehensiveness of implementation strategies that can be used in isolation or in combination in implementation research and practice. Future phases of ERIC will focus on developing conceptually distinct categories of strategies as well as ratings for each strategy's importance and feasibility. Next, the expert panel will recommend multifaceted strategies for hypothetical yet real-world scenarios that vary by sites' endorsement of evidence-based programs and practices and the strength of contextual supports that surround the effort. Electronic supplementary material: the online version of this article (doi:10.1186/s13012-015-0209-1) contains supplementary material, which is available to authorized users.

              Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact.

              This study proposes methods for blending design components of clinical effectiveness and implementation research. Such blending can provide benefits over pursuing these lines of research independently; for example, more rapid translational gains, more effective implementation strategies, and more useful information for decision makers. This study proposes a "hybrid effectiveness-implementation" typology, describes a rationale for the use of such designs, outlines the design decisions that must be faced, and provides several real-world examples. An effectiveness-implementation hybrid design is one that takes a dual focus a priori in assessing clinical effectiveness and implementation. We propose three hybrid types: (1) testing effects of a clinical intervention on relevant outcomes while observing and gathering information on implementation; (2) dual testing of clinical and implementation interventions/strategies; and (3) testing of an implementation strategy while observing and gathering information on the clinical intervention's impact on relevant outcomes. The hybrid typology proposed herein must be considered a construct still in evolution. Although traditional clinical effectiveness and implementation trials are likely to remain the most common approach to moving a clinical intervention from efficacy research to public health impact, judicious use of the proposed hybrid designs could speed the translation of research findings into routine practice.
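              A minimal sketch of the three hybrid types described above, assuming a simple enumeration is a useful mnemonic; the class name and member names are illustrative assumptions, not terminology from the paper.

              from enum import Enum

              class HybridDesignType(Enum):
                  # Type 1: test effects of a clinical intervention on relevant outcomes
                  # while observing and gathering information on implementation.
                  TYPE_1_CLINICAL_PRIMARY = 1
                  # Type 2: dual testing of clinical and implementation interventions/strategies.
                  TYPE_2_DUAL_TESTING = 2
                  # Type 3: test an implementation strategy while observing and gathering
                  # information on the clinical intervention's impact on relevant outcomes.
                  TYPE_3_IMPLEMENTATION_PRIMARY = 3

              # e.g., tagging a study that primarily tests an implementation strategy
              chosen = HybridDesignType.TYPE_3_IMPLEMENTATION_PRIMARY
              print(chosen.name)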

                Author and article information

                Contributors
                aramsey@wustl.edu
                Journal
                Glob Implement Res Appl
                Global Implementation Research and Applications
                Springer International Publishing (Cham)
                ISSN: 2662-9275
                2 June 2022, pp. 1-13
                Affiliations
                [1] Brown School, Washington University in St. Louis, St. Louis, MO 63130, USA (GRID grid.4367.6; ISNI 0000 0001 2355 7002)
                [2] Department of Psychiatry, Washington University School of Medicine, St. Louis, MO 63110, USA (GRID grid.4367.6; ISNI 0000 0001 2355 7002)
                [3] Oregon Social Learning Center, Eugene, OR 97401, USA (GRID grid.410354.7; ISNI 0000 0001 0244 9440)
                [4] Healthcare Innovation Lab, BJC HealthCare/Washington University School of Medicine, St. Louis, MO 63110, USA (GRID grid.262962.b; ISNI 0000 0004 1936 9342)
                [5] Division of Cardiology, Washington University School of Medicine, St. Louis, MO 63110, USA (GRID grid.4367.6; ISNI 0000 0001 2355 7002)
                [6] Division of Cancer Control and Population Sciences, National Cancer Institute, NIH, Bethesda, MD 20892, USA (GRID grid.48336.3a; ISNI 0000 0004 1936 8075)
                [7] Prevention Research Center, Brown School, Washington University in St. Louis, St. Louis, MO 63130, USA (GRID grid.4367.6; ISNI 0000 0001 2355 7002)
                [8] Division of Public Health Sciences, Department of Surgery, Washington University School of Medicine, St. Louis, MO 63110, USA (GRID grid.4367.6; ISNI 0000 0001 2355 7002)
                [9] Alvin J. Siteman Cancer Center, Washington University School of Medicine, St. Louis, MO 63110, USA (GRID grid.4367.6; ISNI 0000 0001 2355 7002)
                Author information
                http://orcid.org/0000-0002-3471-3725
                Article
                Article number: 45
                DOI: 10.1007/s43477-022-00045-4
                PMCID: PMC9161655
                PMID: 35669171
                © The Author(s), under exclusive licence to Springer Nature Switzerland AG 2022

                This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic.

                History
                Received: 4 March 2022
                Accepted: 14 May 2022
                Funding
                Funded by: National Center for Advancing Translational Sciences (FundRef http://dx.doi.org/10.13039/100006108); Award ID: UL1TR002345
                Funded by: National Institute of Mental Health (FundRef http://dx.doi.org/10.13039/100000025); Award IDs: R25MH080916, P50MH113662
                Funded by: National Cancer Institute (FundRef http://dx.doi.org/10.13039/100000054); Award ID: P50CA244431
                Funded by: Patient-Centered Outcomes Research Institute (FundRef http://dx.doi.org/10.13039/100006093); Award ID: TRD-1511-33321
                Funded by: National Institute on Drug Abuse (FundRef http://dx.doi.org/10.13039/100000026); Award IDs: K12DA041449, R34DA052928, R01DA044745
                Funded by: National Institute of Diabetes and Digestive and Kidney Diseases (FundRef http://dx.doi.org/10.13039/100000062); Award IDs: P30DK092950, P30DK056341
                Funded by: Centers for Disease Control and Prevention (FundRef http://dx.doi.org/10.13039/100000030); Award ID: U48DP006395
                Funded by: Foundation for Barnes-Jewish Hospital (FundRef http://dx.doi.org/10.13039/100007338)
                Categories
                Commentary

                Keywords: implementation science, translational science, speed, rapid cycle research, metrics
