
      What Really Matters for Supervision Training Workshops? A Realist Evaluation

      Research article
      V.N.B. Nguyen, PhD, MN 1 ; C.E. Rees, PhD, MEd 2 ; E. Ottrey, PhD, APD 3 ; C. Davis, APD 4 ; K. Pope, MSc, AHPRA 5 ; S. Lee, MRes 6 ; S. Waller, PhD, APAM 7 ; C. Palermo, PhD, MPH, MNutDiet, APD 8
      Academic Medicine
      Lippincott Williams & Wilkins


          Abstract

          Purpose

          Supervision training supports health care supervisors to perform their essential functions. Realist evaluations are increasingly popular for evaluating complex educational interventions, but no such evaluations exist appraising supervision workshops. Building on an earlier realist synthesis of supervision training, the authors evaluated whether supervision workshops work, for whom and under what circumstances, and why.

          Method

          The authors conducted a 2-stage realist evaluation during 2018–2019 to refine and develop program theory. The intervention involved half-day, face-to-face supervision workshops as part of an Australian state-wide government-funded program for health care and human services supervisors. Data collection involved realist interviews with 10 workshop developers (stage 1) and 43 supervisors (stage 2). The authors employed team-based data analysis using realist logic to refine and develop program theory by identifying contexts, mechanisms, outcomes, and context-mechanism-outcome configurations.
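
          As a purely illustrative aside (not part of the authors' method or data), the context-mechanism-outcome configurations described above can be thought of as simple structured records. The minimal Python sketch below shows that analytic structure only; all field names and example values are hypothetical.

              from dataclasses import dataclass

              @dataclass
              class CMOConfiguration:
                  """One context-mechanism-outcome (CMO) configuration.

                  Field names and example values are hypothetical; they illustrate
                  the analytic structure described in the Method, not the authors'
                  actual coding framework.
                  """
                  context: str    # e.g., supervisors' experience level, sector
                  mechanism: str  # e.g., mixed pedagogies, supervisor engagement
                  outcome: str    # e.g., improved satisfaction, suboptimal knowledge gains
                  stage: int      # 1 = workshop developer interviews, 2 = supervisor interviews

              # Hypothetical configuration, for illustration only
              example = CMOConfiguration(
                  context="experienced supervisors in a supportive workplace supervision culture",
                  mechanism="mixed pedagogies (didactic teaching plus peer learning)",
                  outcome="improved supervisor satisfaction",
                  stage=2,
              )
              print(example)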

          Results

          Despite their brevity, the supervision workshops had many reported benefits for supervisors (e.g., improved satisfaction) through various perceived mechanisms pertaining to pedagogy (e.g., mixed pedagogies), workshops (e.g., optimal duration), and individuals (e.g., supervisor engagement). However, they also yielded negative reported outcomes (e.g., suboptimal knowledge gains) brought about by assorted perceived mechanisms related to pedagogy (e.g., suboptimal peer learning), workshops (e.g., content irrelevance), and individuals (e.g., suboptimal facilitator competence). Such mechanisms were thought to be triggered by diverse contexts including supervisors’ levels of experience, sector, and workplace supervision cultures.

          Conclusions

          While the findings partly support the realist synthesis of supervision training and previous realist evaluations of faculty development, this realist evaluation extends this literature considerably. Health care educators should employ mixed pedagogies (e.g., didactic teaching, peer learning), relevant content, optimal workshop duration, and competent/engaging facilitators. Educators also need to tailor workshops according to supervisors’ contexts including the sectors and supervision cultures in which supervision is practiced, and supervisors’ levels of experience (e.g., experienced supervisors appreciated workshop brevity).


                Author and article information

                Journal
                Academic Medicine (Acad Med), Lippincott Williams & Wilkins, Hagerstown, MD
                ISSN: 1040-2446 (print); 1938-808X (electronic)
                Published: 21 July 2022 (online); August 2022 (issue)
                Volume 97, Issue 8, Pages 1203–1212
                Affiliations
                [1 ] V.N.B. Nguyen is a research fellow, Monash Nursing and Midwifery, Faculty of Medicine, Nursing and Health Sciences, Monash University, Clayton, VIC, Australia; ORCID: https://orcid.org/0000-0002-0982-2532.
                [2 ] C.E. Rees is head of school, School of Health Sciences, College of Health, Medicine and Wellbeing, The University of Newcastle, Callaghan, NSW, Australia, and adjunct professor, Monash Centre for Scholarship in Health Education, Faculty of Medicine, Nursing and Health Sciences, Monash University, Clayton, VIC, Australia; ORCID: https://orcid.org/0000-0003-4828-1422.
                [3 ] E. Ottrey is a research fellow, Monash Centre for Scholarship in Health Education, Faculty of Medicine, Nursing and Health Sciences, Monash University, Clayton, VIC, Australia; ORCID: https://orcid.org/0000-0002-2979-548X.
                [4 ] C. Davis is a PhD candidate, Department of Nutrition, Dietetics and Food, School of Clinical Sciences at Monash Health, Monash University, Clayton, VIC, Australia; ORCID: https://orcid.org/0000-0002-6343-2260.
                [5 ] K. Pope is a lecturer, Department of Occupational Therapy, Faculty of Medicine, Nursing and Health Sciences, Monash University, Frankston, VIC, Australia; ORCID: https://orcid.org/0000-0002-0010-4091.
                [6 ] S. Lee is a PhD candidate, Monash Centre for Scholarship in Health Education, Faculty of Medicine, Nursing and Health Sciences, Monash University, Clayton, VIC, Australia; ORCID: https://orcid.org/0000-0002-2781-3082.
                [7 ] S. Waller is an adjunct senior research fellow, School of Rural Health, Faculty of Medicine, Nursing and Health Sciences, Monash University, Bendigo, VIC, Australia, and assistant professor, Department of Medical Education, College of Medicine and Health Sciences, United Arab Emirates University, Al Ain, UAE; ORCID: https://orcid.org/0000-0002-6309-0360.
                [8 ] C. Palermo is director, Monash Centre for Scholarship in Health Education, Faculty of Medicine, Nursing and Health Sciences, Monash University, Clayton, VIC, Australia, and associate dean (teaching and learning), Faculty of Medicine, Nursing and Health Sciences, Monash University, Clayton, VIC, Australia; ORCID: https://orcid.org/0000-0002-9423-5067.
                Author notes
                Correspondence should be addressed to Charlotte E. Rees, School of Health Sciences, College of Health, Medicine and Wellbeing, The University of Newcastle, Callaghan, NSW, Australia; telephone: +61 (0)2 4921 7284; email: charlotte.rees@newcastle.edu.au; Twitter: @charlreessidhu.
                Article
                DOI: 10.1097/ACM.0000000000004686
                PMCID: PMC9311464
                PMID: 35385398
                Copyright © 2022 The Author(s). Published by Wolters Kluwer Health, Inc. on behalf of the Association of American Medical Colleges.

                This is an open access article distributed under the Creative Commons Attribution License 4.0 (CCBY), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

                Categories
                Research Reports
