      An introduction to implementation science for the non-specialist


          Abstract

          Background

          The movement of evidence-based practices (EBPs) into routine clinical usage is not spontaneous, but requires focused efforts. The field of implementation science has developed to facilitate the spread of EBPs, including both psychosocial and medical interventions for mental and physical health concerns.

          Discussion

          The authors aim to introduce implementation science principles to non-specialist investigators, administrators, and policymakers seeking to become familiar with this emerging field. This introduction is based on published literature and the authors’ experience as researchers in the field, as well as extensive service as implementation science grant reviewers.

Implementation science is “the scientific study of methods to promote the systematic uptake of research findings and other EBPs into routine practice, and, hence, to improve the quality and effectiveness of health services.” Implementation science is distinct from, but shares characteristics with, both quality improvement and dissemination methods. Implementation studies can either assess naturalistic variability or measure change in response to planned intervention. Implementation studies typically employ mixed quantitative-qualitative designs, identifying factors that impact uptake across multiple levels, including patient, provider, clinic, facility, organization, and often the broader community and policy environment. Accordingly, implementation science requires a solid grounding in theory and the involvement of trans-disciplinary research teams.

          Summary

          The business case for implementation science is clear: As healthcare systems work under increasingly dynamic and resource-constrained conditions, evidence-based strategies are essential in order to ensure that research investments maximize healthcare value and improve public health. Implementation science plays a critical role in supporting these efforts.

Most cited references (56)


          Mortality results from a randomized prostate-cancer screening trial.

The effect of screening with prostate-specific-antigen (PSA) testing and digital rectal examination on the rate of death from prostate cancer is unknown. This is the first report from the Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial on prostate-cancer mortality.

From 1993 through 2001, we randomly assigned 76,693 men at 10 U.S. study centers to receive either annual screening (38,343 subjects) or usual care as the control (38,350 subjects). Men in the screening group were offered annual PSA testing for 6 years and digital rectal examination for 4 years. The subjects and health care providers received the results and decided on the type of follow-up evaluation. Usual care sometimes included screening, as some organizations have recommended. The numbers of all cancers and deaths and causes of death were ascertained.

In the screening group, rates of compliance were 85% for PSA testing and 86% for digital rectal examination. Rates of screening in the control group increased from 40% in the first year to 52% in the sixth year for PSA testing and ranged from 41 to 46% for digital rectal examination. After 7 years of follow-up, the incidence of prostate cancer per 10,000 person-years was 116 (2820 cancers) in the screening group and 95 (2322 cancers) in the control group (rate ratio, 1.22; 95% confidence interval [CI], 1.16 to 1.29). The incidence of death per 10,000 person-years was 2.0 (50 deaths) in the screening group and 1.7 (44 deaths) in the control group (rate ratio, 1.13; 95% CI, 0.75 to 1.70). The data at 10 years were 67% complete and consistent with these overall findings.

After 7 to 10 years of follow-up, the rate of death from prostate cancer was very low and did not differ significantly between the two study groups. (ClinicalTrials.gov number, NCT00002540.) Copyright 2009 Massachusetts Medical Society.
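As a rough check on the figures above, the Python sketch below (an illustrative example, not code from either article; the helper rate_ratio_ci is hypothetical) recomputes the prostate-cancer incidence rate ratio and a Wald-type 95% confidence interval from the reported event counts. Person-years are back-calculated from the published rates per 10,000 person-years, so the result is approximate.

    import math

    def rate_ratio_ci(events_a, py_a, events_b, py_b, z=1.96):
        """Incidence rate ratio (group A vs. group B) with a Wald 95% CI on the log scale."""
        rr = (events_a / py_a) / (events_b / py_b)
        se_log_rr = math.sqrt(1 / events_a + 1 / events_b)  # Poisson-count approximation
        lower = math.exp(math.log(rr) - z * se_log_rr)
        upper = math.exp(math.log(rr) + z * se_log_rr)
        return rr, lower, upper

    # Person-years back-calculated from the reported rates per 10,000 person-years:
    # 2,820 cancers at a rate of 116, and 2,322 cancers at a rate of 95.
    py_screening = 2820 / 116 * 10_000
    py_control = 2322 / 95 * 10_000

    rr, lower, upper = rate_ratio_ci(2820, py_screening, 2322, py_control)
    print(f"Incidence rate ratio: {rr:.2f} (95% CI {lower:.2f} to {upper:.2f})")
    # Prints roughly 1.22 (1.16 to 1.29), matching the incidence figures reported above.

Applying the same function to the 50 versus 44 prostate-cancer deaths gives roughly 1.14 (0.76 to 1.71), close to the published 1.13 (95% CI, 0.75 to 1.70); the small discrepancy reflects rounding in the back-calculated person-years.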

            The PRECIS-2 tool: designing trials that are fit for purpose.


              Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges

Background

The PARiHS framework (Promoting Action on Research Implementation in Health Services) has proved to be a useful practical and conceptual heuristic for many researchers and practitioners in framing their research or knowledge translation endeavours. However, as a conceptual framework it still remains untested and therefore its contribution to the overall development and testing of theory in the field of implementation science is largely unquantified.

Discussion

This being the case, the paper provides an integrated summary of our conceptual and theoretical thinking so far and introduces a typology (derived from social policy analysis) used to distinguish between the terms conceptual framework, theory and model – important definitional and conceptual issues in trying to refine theoretical and methodological approaches to knowledge translation. Secondly, the paper describes the next phase of our work, in particular concentrating on the conceptual thinking and mapping that has led to the generation of the hypothesis that the PARiHS framework is best utilised as a two-stage process: as a preliminary (diagnostic and evaluative) measure of the elements and sub-elements of evidence (E) and context (C), and then using the aggregated data from these measures to determine the most appropriate facilitation method. The exact nature of the intervention is thus determined by the specific actors in the specific context at a specific time and place. In the process of refining this next phase of our work, we have had to consider the wider issues around the use of theories to inform and shape our research activity; the ongoing challenges of developing robust and sensitive measures; facilitation as an intervention for getting research into practice; and finally to note how the current debates around evidence into practice are adopting wider notions that fit innovations more generally.

Summary

The paper concludes by suggesting that the future direction of the work on the PARiHS framework is to develop a two-stage diagnostic and evaluative approach, where the intervention is shaped and moulded by the information gathered about the specific situation and from participating stakeholders. In order to expedite the generation of new evidence and testing of emerging theories, we suggest the formation of an international research implementation science collaborative that can systematically collect and analyse experiences of using and testing the PARiHS framework and similar conceptual and theoretical approaches. We also recommend further refinement of the definitions around conceptual framework, theory, and model, suggesting a wider discussion that embraces multiple epistemological and ontological perspectives.

                Author and article information

                Contributors
857-364-6380, mark.bauer@va.gov
                Laura.damschroder@va.gov
                Hildi.hagedorn@va.gov
                Jeffrey.smith6@va.gov
                Amy.kilbourne@va.gov
Journal
BMC Psychology (BMC Psychol)
BioMed Central (London)
ISSN: 2050-7283
Published: 16 September 2015
2015; 3(1): 32
                Affiliations
Center for Healthcare Organization and Implementation Research, VA Boston Healthcare System and Department of Psychiatry, Harvard Medical School, Boston, MA, USA
Center for Clinical Management Research, VA Ann Arbor Healthcare System, Ann Arbor, MI, USA
Substance Use Disorder Quality Enhancement Research Initiative and Center for Chronic Disease Outcomes Research, Minneapolis VA Health Care System & Department of Psychiatry, University of Minnesota School of Medicine, Minneapolis, MN, USA
Mental Health Quality Enhancement Research Initiative, Central Arkansas Veterans Healthcare System, North Little Rock, AR, USA
VA Quality Enhancement Research Initiative (QUERI), VA Office of Research and Development, Washington, DC, USA
Department of Psychiatry, University of Michigan, Ann Arbor, MI, USA
VA Boston Healthcare System, 152M, 150 South Huntington Avenue, Jamaica Plain, MA 02130, USA
Article
DOI: 10.1186/s40359-015-0089-9
PMCID: PMC4573926
PMID: 26376626
                © Bauer et al. 2015

Open Access: This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

History
Received: 4 June 2015
Accepted: 9 September 2015
                Categories
                Debate

Keywords: implementation, quality improvement, health services, outcome assessment, evidence-based practice, learning healthcare organization
