
      Telementoring for remote simulation instructor training and faculty development using telesimulation

      BMJ Simulation and Technology Enhanced Learning
      BMJ


          Abstract

          Introduction

Simulation-based training is essential for high-quality medical care, but it requires access to equipment and expertise. Technology can connect educators to simulation training at a distance. We aimed to explore remote simulation faculty development in Latvia using telesimulation and telementoring with an experienced debriefer located in the USA.

          Methods

This was a prospective, simulation-based longitudinal study. Over the course of 16 months, a remote simulation instructor (RI) from the USA and a local instructor (LI) in Latvia cofacilitated simulation sessions via teleconferencing, with responsibility gradually transitioning from the RI to the LI. At the end of each session, students completed the student version of the Debriefing Assessment for Simulation in Healthcare form (DASH-SV) and a general feedback form, and the LI completed the instructor version of the DASH form (DASH-IV). The outcome measures were the changes in DASH scores over time.

          Results

A total of eight simulation sessions were cofacilitated over the 16 months. As the role of the LI increased over time, debrief quality measured with the DASH-IV did not change significantly (from 89 to 87), although the DASH-SV score decreased from a total median score of 89 (IQR 86–98) to 80 (IQR 78–85) (p=0.005); an illustrative sketch of this type of comparison appears after the abstract.

          Conclusion

In this study, telementoring with telesimulation resulted in high-quality debriefing. The quality as perceived by the students was higher with the involvement of the remote instructor and declined during the transition to the LI. This concept requires further investigation and could potentially build local simulation expertise, promoting the sustainability of high-quality simulation.
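
The Results above are summarised as medians with interquartile ranges and a p value. A minimal Python sketch of that kind of comparison is given below; the DASH-SV scores are invented and the Mann-Whitney U test is only an assumed choice of nonparametric test, not necessarily the analysis the authors used.

# Illustrative only: hypothetical total DASH-SV scores and an assumed
# nonparametric test; the paper's own data and test are not reproduced here.
import numpy as np
from scipy import stats

# Hypothetical scores from early sessions (RI leading the debrief) and
# late sessions (LI leading the debrief).
early = np.array([86, 89, 93, 98, 90, 88, 95, 87])
late = np.array([78, 80, 82, 85, 79, 81, 80, 77])

def median_iqr(scores):
    """Return the median and the 25th-75th percentile range."""
    q1, med, q3 = np.percentile(scores, [25, 50, 75])
    return med, q1, q3

for label, scores in [("early", early), ("late", late)]:
    med, q1, q3 = median_iqr(scores)
    print(f"{label}: median {med:.0f} (IQR {q1:.0f}-{q3:.0f})")

# Two-sided nonparametric comparison of the two groups of session scores.
u, p = stats.mannwhitneyu(early, late, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p:.4f}")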

Most cited references (24)


          A Guideline of Selecting and Reporting Intraclass Correlation Coefficients for Reliability Research.

The intraclass correlation coefficient (ICC) is a widely used reliability index in test-retest, intrarater, and interrater reliability analyses. This article introduces the basic concept of the ICC in the context of reliability analysis.
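
Because this reference centres on the ICC, a worked sketch may help make the index concrete. The rating matrix below is invented, and ICC(2,1) (two-way random effects, absolute agreement, single rater) is chosen purely as an example form; it is not tied to the analyses in the article above.

# Minimal sketch of ICC(2,1) computed from the usual two-way ANOVA mean
# squares. The rating matrix is invented for illustration.
import numpy as np

def icc_2_1(ratings):
    """ratings: (n_subjects, k_raters) array of scores."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means

    # Two-way ANOVA decomposition of the total sum of squares.
    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    # Shrout & Fleiss ICC(2,1): two-way random effects, absolute agreement.
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Four hypothetical subjects rated by three hypothetical raters.
scores = [[9, 2, 5], [6, 1, 3], [8, 4, 6], [7, 1, 2]]
print(f"ICC(2,1) = {icc_2_1(scores):.3f}")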

            Technology-enhanced simulation for health professions education: a systematic review and meta-analysis.

            Although technology-enhanced simulation has widespread appeal, its effectiveness remains uncertain. A comprehensive synthesis of evidence may inform the use of simulation in health professions education. To summarize the outcomes of technology-enhanced simulation training for health professions learners in comparison with no intervention. Systematic search of MEDLINE, EMBASE, CINAHL, ERIC, PsychINFO, Scopus, key journals, and previous review bibliographies through May 2011. Original research in any language evaluating simulation compared with no intervention for training practicing and student physicians, nurses, dentists, and other health care professionals. Reviewers working in duplicate evaluated quality and abstracted information on learners, instructional design (curricular integration, distributing training over multiple days, feedback, mastery learning, and repetitive practice), and outcomes. We coded skills (performance in a test setting) separately for time, process, and product measures, and similarly classified patient care behaviors. From a pool of 10,903 articles, we identified 609 eligible studies enrolling 35,226 trainees. Of these, 137 were randomized studies, 67 were nonrandomized studies with 2 or more groups, and 405 used a single-group pretest-posttest design. We pooled effect sizes using random effects. Heterogeneity was large (I(2)>50%) in all main analyses. In comparison with no intervention, pooled effect sizes were 1.20 (95% CI, 1.04-1.35) for knowledge outcomes (n = 118 studies), 1.14 (95% CI, 1.03-1.25) for time skills (n = 210), 1.09 (95% CI, 1.03-1.16) for process skills (n = 426), 1.18 (95% CI, 0.98-1.37) for product skills (n = 54), 0.79 (95% CI, 0.47-1.10) for time behaviors (n = 20), 0.81 (95% CI, 0.66-0.96) for other behaviors (n = 50), and 0.50 (95% CI, 0.34-0.66) for direct effects on patients (n = 32). Subgroup analyses revealed no consistent statistically significant interactions between simulation training and instructional design features or study quality. In comparison with no intervention, technology-enhanced simulation training in health professions education is consistently associated with large effects for outcomes of knowledge, skills, and behaviors and moderate effects for patient-related outcomes.
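
The pooling described in this abstract (random-effects models, with I² reported as the heterogeneity index) can be illustrated with the standard DerSimonian-Laird estimator. The effect sizes and sampling variances below are invented, and this sketch is not the review's own analysis code.

# Sketch of DerSimonian-Laird random-effects pooling with Cochran's Q and I².
# Per-study effect sizes (yi) and variances (vi) are invented for illustration.
import numpy as np

def random_effects_pool(yi, vi):
    yi, vi = np.asarray(yi, float), np.asarray(vi, float)
    k = len(yi)

    # Fixed-effect weights, fixed-effect pooled estimate, and Cochran's Q.
    w = 1.0 / vi
    y_fixed = (w * yi).sum() / w.sum()
    q = (w * (yi - y_fixed) ** 2).sum()

    # DerSimonian-Laird between-study variance and the I² heterogeneity index.
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (q - (k - 1)) / c)
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0

    # Random-effects weights, pooled effect, and a 95% confidence interval.
    w_re = 1.0 / (vi + tau2)
    y_re = (w_re * yi).sum() / w_re.sum()
    se = np.sqrt(1.0 / w_re.sum())
    return y_re, (y_re - 1.96 * se, y_re + 1.96 * se), i2

yi = [1.10, 1.35, 0.95, 1.25, 1.05]   # hypothetical standardized mean differences
vi = [0.04, 0.06, 0.05, 0.03, 0.07]   # hypothetical sampling variances
effect, ci, i2 = random_effects_pool(yi, vi)
print(f"Pooled effect {effect:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f}); I² = {i2:.0f}%")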

              Promoting Excellence and Reflective Learning in Simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing.

We describe an integrated conceptual framework for a blended approach to debriefing called PEARLS [Promoting Excellence And Reflective Learning in Simulation]. We provide a rationale for scripted debriefing and introduce a PEARLS debriefing tool designed to facilitate implementation of the new framework. The PEARLS framework integrates 3 common educational strategies used during debriefing, namely, (1) learner self-assessment, (2) facilitating focused discussion, and (3) providing information in the form of directive feedback and/or teaching. The PEARLS debriefing tool incorporates scripted language to guide the debriefing, depending on the strategy chosen. The PEARLS framework and debriefing script fill a need for many health care educators learning to facilitate debriefings in simulation-based education. PEARLS offers a structured framework adaptable for debriefing simulations with a variety of goals, including clinical decision making, improving technical skills, teamwork training, and interprofessional collaboration.

                Author and article information

Journal
BMJ Simulation and Technology Enhanced Learning (BMJ Simul Technol Enhanc Learn)
Publisher: BMJ
ISSN: 2056-6697
Publication dates: March 03 2021; March 2021 (issue); May 18 2020
Volume: 7
Issue: 2
Pages: 61-65
                Article
DOI: 10.1136/bmjstel-2019-000512
                © 2020
