
      A pilot study to assess the learning environment and use of reliability enhancing work practices in VHA cardiac catheterization laboratories

      Brief report


          Abstract

          Introduction

          A learning health system (LHS) harnesses data and analytics to learn from clinical encounters and to implement the best care with high reliability. The 81 Veterans Health Administration (VHA) cardiac catheterization laboratories (cath labs) are a model LHS. The quality and safety of coronary procedures are monitored and reported by the Clinical Assessment, Reporting and Tracking (CART) Program, which has identified variation in care across cath labs. This variation may be due to underappreciated aspects of LHSs: the learning environment and reliability enhancing work practices (REWPs). Learning environments are the educational approaches, context, and settings in which learning occurs. REWPs are the organizational practices found in high reliability organizations. Strong learning environments and use of REWPs are associated with improved outcomes. This study assessed the learning environments and use of REWPs in VHA cath labs to examine factors supportive of learning and high reliability.

          Methods

          In 2018, the Learning Organization Survey‐27 and the REWP survey were administered to 732 cath lab staff. Factor analyses and linear models were computed. Unit‐level analyses and site ranking (high, low) were conducted on cath labs with a >40% response rate using Bayesian methods.
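
          As a rough illustration of the scale-level checks described here (internal consistency via Cronbach's alpha and an exploratory look at factor structure), the sketch below runs on simulated Likert-style responses. It is not the study's analysis code: the DataFrame of items, the simulated data, and the use of scikit-learn's FactorAnalysis are illustrative assumptions.

          import numpy as np
          import pandas as pd
          from sklearn.decomposition import FactorAnalysis

          def cronbach_alpha(items: pd.DataFrame) -> float:
              # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
              k = items.shape[1]
              item_vars = items.var(axis=0, ddof=1)
              total_var = items.sum(axis=1).var(ddof=1)
              return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

          # Simulated 5-point responses for a hypothetical four-item survey dimension.
          rng = np.random.default_rng(0)
          latent = rng.normal(size=300)  # shared "true" attitude per respondent
          items = pd.DataFrame({
              f"item_{i}": np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=300)), 1, 5)
              for i in range(1, 5)
          })

          print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")  # the study reports alpha > .76 across dimensions

          # One-factor exploratory model: how strongly each item loads on the common factor.
          fa = FactorAnalysis(n_components=1).fit(items)
          print("Factor loadings:", np.round(fa.components_.ravel(), 2))

          In practice, the same reliability check would be repeated for each survey dimension before unit-level comparisons are made.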

          Results

          Surveys from 40% of cath lab staff (n = 294) at 84% of cath labs (n = 68) were included. Learning environment and REWP strengths across cath labs include the presence of training programs, openness to new ideas, and respectful interaction. Learning environment and REWP gaps include a lack of structured knowledge transfer (e.g., checklists) and low use of forums for improvement. Survey dimensions matched established factor structures and demonstrated high reliability (Cronbach's alpha >.76). Unit‐level analyses were conducted for 29 cath labs. One ranked as a high learning environment and four as low learning environments.
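
          The unit-level ranking can be illustrated with a simple partial-pooling estimate that shrinks each cath lab's mean toward the grand mean, with small sites shrunk more. This is only a crude empirical-Bayes stand-in for the fully Bayesian ranking described in the Methods; the site_scores dictionary and every number below are hypothetical.

          import numpy as np

          # Hypothetical per-respondent scale scores, grouped by cath lab.
          site_scores = {
              "lab_A": [3.9, 4.2, 4.0, 4.4],
              "lab_B": [2.8, 3.1, 2.6],
              "lab_C": [3.5, 3.4, 3.8, 3.6, 3.7],
          }

          all_scores = np.concatenate([np.asarray(v, dtype=float) for v in site_scores.values()])
          grand_mean = all_scores.mean()
          within_var = all_scores.var(ddof=1)  # crude stand-in for within-site variance
          between_var = np.var([np.mean(v) for v in site_scores.values()], ddof=1)

          ranked = []
          for lab, scores in site_scores.items():
              n = len(scores)
              # Small or noisy sites are pulled more strongly toward the grand mean.
              weight = between_var / (between_var + within_var / n)
              shrunken = weight * np.mean(scores) + (1 - weight) * grand_mean
              ranked.append((lab, shrunken))

          for lab, estimate in sorted(ranked, key=lambda pair: pair[1], reverse=True):
              print(f"{lab}: shrunken mean = {estimate:.2f}")

          Ranking on shrunken rather than raw means avoids flagging a site as high or low on the strength of a handful of respondents.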

          Conclusions

          This work demonstrates an approach to assessing local learning environments and the use of REWPs, providing insights for systems working to become an LHS.


                Author and article information

                Contributors
                heather.gilmartin@va.gov
                Journal
                Learn Health Syst (Learning Health Systems; LRH2)
                ISSN: 2379-6146 (10.1002/(ISSN)2379-6146)
                Publisher: John Wiley and Sons Inc. (Hoboken)
                Published: 08 April 2020 (online); April 2021 (issue)
                Volume 5, Issue 2 (doiID: 10.1002/lrh2.v5.2): e10227
                Affiliations
                [1] Denver/Seattle Center of Innovation for Veteran‐Centered and Value Driven Care, VHA Eastern Colorado Healthcare System, Aurora, Colorado, USA
                [2] Health Systems, Management, and Policy, University of Colorado School of Public Health, Aurora, Colorado, USA
                [3] Clinical Assessment, Reporting and Tracking Program, VHA Eastern Colorado Healthcare System, Aurora, Colorado, USA
                Author notes
                [*] Correspondence

                Heather M. Gilmartin, Denver/Seattle Center of Innovation for Veteran‐Centered and Value Driven Care, VHA Eastern Colorado Healthcare System, 1700 N. Wheeling St, Aurora, CO 80045, USA.

                Email: heather.gilmartin@va.gov

                Author information
                https://orcid.org/0000-0002-0264-4059
                Article
                Publisher ID: LRH210227
                DOI: 10.1002/lrh2.10227
                PMCID: 8051348
                Published 2020. This article is a U.S. Government work and is in the public domain in the USA. Learning Health Systems published by Wiley Periodicals LLC on behalf of University of Michigan.

                This is an open access article under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits use, distribution and reproduction in any medium, provided the original work is properly cited.

                History
                Received: 11 August 2019
                Revised: 26 February 2020
                Accepted: 15 March 2020
                Page count
                Figures: 1, Tables: 2, Pages: 6, Words: 4645
                Funding
                Funded by: U.S. Department of Veterans Affairs (Open Funder Registry: 10.13039/100000738)
                Award ID: IKHX002567
                Categories
                Briefs

                Keywords: high reliability organization, interventional cardiology, learning healthcare, veterans
