
      Data Analysis and Synthesis Within a Realist Evaluation: Toward More Transparent Methodological Approaches

      International Journal of Qualitative Methods
      SAGE Publications


          Abstract

Realist evaluations are increasingly used in the study of complex health interventions. The methodological procedures applied within realist evaluations, however, are often inexplicit, prompting scholars to call for increased transparency and more detailed description within realist studies. This publication details the data analysis and synthesis process used within two realist evaluation studies of community health interventions taking place across Uganda, Tanzania, and Kenya. Using data from several case studies across all three countries and the data analysis software NVivo, we describe in detail how data were analyzed and subsequently synthesized to refine middle-range theories. We conclude by discussing the strengths and weaknesses of the approach taken, providing novel methodological recommendations. The aim of providing this detailed descriptive account of the analysis and synthesis in these two studies is to promote transparency and contribute to the advancement of realist evaluation methodologies.


Most cited references (17)


          Using realist evaluation to open the black box of knowledge translation: a state-of-the-art review

Background: In knowledge translation, complex interventions may be implemented in the attempt to improve uptake of research-based knowledge in practice. Traditional evaluation efforts that focus on aggregate effectiveness represent an oversimplification of both the environment and the interventions themselves. However, theory-based approaches to evaluation, such as realist evaluation (RE), may be better-suited to examination of complex knowledge translation interventions with a view to understanding what works, for whom, and under what conditions. It is the aim of the present state-of-the-art review to examine current literature with regard to the use of RE in the assessment of knowledge translation interventions implemented within healthcare environments.

Methods: Multiple online databases were searched from 1997 through June 2013. Primary studies examining the application or implementation of knowledge translation interventions within healthcare settings and using RE were selected for inclusion. Varying applications of RE across studies were examined in terms of a) reporting of core elements of RE, and b) potential feasibility of this evaluation method.

Results: A total of 14 studies (6 study protocols), published between 2007 and 2013, were identified for inclusion. Projects were initiated in a variety of healthcare settings and represented a range of interventions. While a majority of authors mentioned context (C), mechanism (M) and outcome (O), a minority reported the development of C-M-O configurations or testable hypotheses based on these configurations. Four completed studies reported results that included refinement of proposed C-M-O configurations and offered explanations within the RE framework. In the few studies offering insight regarding challenges associated with the use of RE, difficulties were expressed regarding the definition of both mechanisms and contextual factors. Overall, RE was perceived as time-consuming and resource intensive.

Conclusions: The use of RE in knowledge translation is relatively new; however, theory-building approaches to the examination of complex interventions in this area may be increasing as researchers attempt to identify what works, for whom and under what circumstances. Completion of the RE cycle may be challenging, particularly in the development of C-M-O configurations; however, as researchers approach challenges and explore innovations in its application, rich and detailed accounts may improve feasibility.

            Collective action for implementation: a realist evaluation of organisational collaboration in healthcare

Background: Increasingly, it is being suggested that translational gaps might be eradicated or narrowed by bringing research users and producers closer together, a theory that is largely untested. This paper reports a national study to fill a gap in the evidence about the conditions, processes and outcomes related to collaboration and implementation.

Methods: A longitudinal realist evaluation using multiple qualitative methods case studies was conducted with three Collaborations for Leadership in Applied Health Research and Care (England). Data were collected over four rounds of theory development, refinement and testing. Over 200 participants were involved in semi-structured interviews, non-participant observations of events and meetings, and stakeholder engagement. A combined inductive and deductive data analysis process was focused on proposition refinement and testing iteratively over data collection rounds.

Results: The quality of existing relationships between higher education and local health service, and views about whether implementation was a collaborative act, created a path dependency. Where implementation was perceived to be removed from service and there was a lack of organisational connections, this resulted in a focus on knowledge production and transfer, rather than co-production. The collaborations’ architectures were counterproductive because they did not facilitate connectivity and had emphasised professional and epistemic boundaries. More distributed leadership was associated with greater potential for engagement. The creation of boundary spanning roles was the most visible investment in implementation, and credible individuals in these roles resulted in cross-boundary work, in facilitation and in direct impacts. The academic-practice divide played out strongly as a context for motivation to engage, in that ‘what’s in it for me’ resulted in variable levels of engagement along a co-operation-collaboration continuum. Learning within and across collaborations was patchy depending on attention to evaluation.

Conclusions: These collaborations did not emerge from a vacuum, and they needed time to learn and develop. Their life cycle started with their position on collaboration, knowledge and implementation. More impactful attempts at collective action in implementation might be determined by the deliberate alignment of a number of features, including foundational relationships, vision, values, structures and processes and views about the nature of the collaboration and implementation.

              A realistic evaluation: the case of protocol-based care

Background: 'Protocol based care' was envisioned by policy makers as a mechanism for delivering on the service improvement agenda in England. Realistic evaluation is an increasingly popular approach, but few published examples exist, particularly in implementation research. To fill this gap, within this paper we describe the application of a realistic evaluation approach to the study of protocol-based care, whilst sharing findings of relevance about standardising care through the use of protocols, guidelines, and pathways.

Methods: Situated between positivism and relativism, realistic evaluation is concerned with the identification of underlying causal mechanisms, how they work, and under what conditions. Fundamentally it focuses attention on finding out what works, for whom, how, and in what circumstances.

Results: In this research, we were interested in understanding the relationships between the type and nature of particular approaches to protocol-based care (mechanisms), within different clinical settings (context), and what impacts this resulted in (outcomes). An evidence review using the principles of realist synthesis resulted in a number of propositions, i.e., context, mechanism, and outcome threads (CMOs). These propositions were then 'tested' through multiple case studies, using multiple methods including non-participant observation, interviews, and document analysis through an iterative analysis process. The initial propositions (conjectured CMOs) only partially corresponded to the findings that emerged during analysis. From the iterative analysis process of scrutinising mechanisms, context, and outcomes we were able to draw out some theoretically generalisable features about what works, for whom, how, and in what circumstances in relation to the use of standardised care approaches (refined CMOs).

Conclusions: As one of the first studies to apply realistic evaluation in implementation research, it was a good fit, particularly given the growing emphasis on understanding how context influences evidence-based practice. The strengths and limitations of the approach are considered, including how to operationalise it and some of the challenges. This approach provided a useful interpretive framework with which to make sense of the multiple factors that were simultaneously at play and being observed through various data sources, and for developing explanatory theory about using standardised care approaches in practice.

                Author and article information

                Journal
International Journal of Qualitative Methods
SAGE Publications
ISSN: 1609-4069
July 03 2019 (January 2019)
Volume: 18
eLocator: 160940691985975
                Affiliations
                [1 ]Trinity Centre for Global Health, School of Psychology, Trinity College Dublin, Dublin, Ireland
                [2 ]Trinity Centre for Global Health, School of Medicine, Trinity College Dublin, Dublin, Ireland
                [3 ]School of Nursing, Midwifery and Health Systems, College of Health and Agricultural Sciences, University College Dublin, Dublin, Ireland
                [4 ]Health Research Board Trials Methodology Research Network, Ireland
                Article
DOI: 10.1177/1609406919859754
                © 2019

License: http://journals.sagepub.com/page/policies/text-and-data-mining-license
