


      The Science of Learning Health Systems: Scoping Review of Empirical Research

Review article


          Abstract

          Background

The development and adoption of a learning health system (LHS) has been proposed as a means to address key challenges facing current and future health care systems. The first review of the LHS literature was conducted 5 years ago, identifying only a small number of published papers that had empirically examined the implementation or testing of an LHS. It is therefore timely to look more closely at the published empirical research and to ask where the field stands now, 5 years on from that early review.

          Objective

This study performed a scoping review of empirical research within the LHS domain. Taking an “implementation science” lens, the review aims to map the empirical research conducted to date, identify its limitations, and outline future directions for the field.

          Methods

Two academic databases (PubMed and Scopus) were searched using the terms “learning health* system*” for papers published between January 1, 2016, and January 31, 2021, that had an explicit empirical focus on LHSs. Study information relevant to the review objective was extracted, including each study’s publication details; primary concern or focus; context; design; data type; implementation framework, model, or theory used; and implementation determinants or outcomes examined.

          Results

          A total of 76 studies were included in this review. Over two-thirds of the studies were concerned with implementing a particular program, system, or platform (53/76, 69.7%) designed to contribute to achieving an LHS. Most of these studies focused on a particular clinical context or patient population (37/53, 69.8%), with far fewer studies focusing on whole hospital systems (4/53, 7.5%) or on other broad health care systems encompassing multiple facilities (12/53, 22.6%). Over two-thirds of the program-specific studies utilized quantitative methods (37/53, 69.8%), with a smaller number utilizing qualitative methods (10/53, 18.9%) or mixed-methods designs (6/53, 11.3%). The remaining 23 studies were classified into 1 of 3 key areas: ethics, policies, and governance (10/76, 13.2%); stakeholder perspectives of LHSs (5/76, 6.6%); or LHS-specific research strategies and tools (8/76, 10.5%). Overall, relatively few studies were identified that incorporated an implementation science framework.

          Conclusions

Although there has been considerable growth in empirical applications of LHSs within the past 5 years, paralleling the recent emergence of LHS-specific research strategies and tools, there are few high-quality studies. Comprehensive reporting of implementation and evaluation efforts is an important step toward moving the LHS field forward. In particular, the routine use of implementation determinant and outcome frameworks will improve the assessment and reporting of barriers, enablers, and implementation outcomes in this field and will enable comparison and identification of trends across studies.


                Author and article information

Journal: JMIR Medical Informatics (JMIR Med Inform)
Publisher: JMIR Publications (Toronto, Canada)
ISSN: 2291-9694
Published: 23 February 2022
Volume 10, Issue 2: e34907
Affiliations
[1] Australian Institute of Health Innovation, Macquarie University, Sydney, Australia
                Author notes
Corresponding Author: Louise A Ellis, louise.ellis@mq.edu.au
                Author information
                https://orcid.org/0000-0001-6902-4578
                https://orcid.org/0000-0001-7318-3598
                https://orcid.org/0000-0002-9923-3116
                https://orcid.org/0000-0002-8188-712X
                https://orcid.org/0000-0002-9083-7845
                https://orcid.org/0000-0003-4377-5490
                https://orcid.org/0000-0002-9118-7207
                https://orcid.org/0000-0003-3331-8093
                https://orcid.org/0000-0001-7744-8717
                https://orcid.org/0000-0003-0296-4957
Article
Article ID: v10i2e34907
DOI: 10.2196/34907
PMCID: PMC8908194
PMID: 35195529
                ©Louise A Ellis, Mitchell Sarkies, Kate Churruca, Genevieve Dammery, Isabelle Meulenbroeks, Carolynn L Smith, Chiara Pomare, Zeyad Mahmoud, Yvonne Zurynski, Jeffrey Braithwaite. Originally published in JMIR Medical Informatics (https://medinform.jmir.org), 23.02.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Informatics, is properly cited. The complete bibliographic information, a link to the original publication on https://medinform.jmir.org/, as well as this copyright and license information must be included.

History: 12 November 2021; 3 December 2021; 7 December 2021; 2 January 2022
                Categories
                Review

Keywords: learning health systems, learning health care systems, implementation science, evaluation, health system, health care system, empirical research, medical informatics, review
