
      Health professions digital education on clinical practice guidelines: a systematic review by Digital Health Education collaboration


          Abstract

          Background

          Clinical practice guidelines are an important source of information, designed to help clinicians integrate research evidence into their clinical practice. Digital education is increasingly used for clinical practice guideline dissemination and adoption. Our aim was to evaluate the effectiveness of digital education in improving the adoption of clinical practice guidelines.

          Methods

          We performed a systematic review and searched seven electronic databases from January 1990 to September 2018. Two reviewers independently screened studies, extracted data and assessed risk of bias. We included studies in any language evaluating the effectiveness of digital education on clinical practice guidelines compared to other forms of education or no intervention in healthcare professionals. We used the Grading of Recommendations, Assessment, Development and Evaluations (GRADE) approach to assess the quality of the body of evidence.

          Results

          Seventeen trials involving 2382 participants were included. The included studies were diverse with a largely unclear or high risk of bias. They mostly focused on physicians, evaluated computer-based interventions with limited interactivity and measured participants’ knowledge and behaviour. With regard to knowledge, studies comparing the effect of digital education with no intervention showed a moderate, statistically significant difference in favour of digital education intervention (SMD = 0.85, 95% CI 0.16, 1.54; I² = 83%, n = 3, moderate quality of evidence). Studies comparing the effect of digital education with traditional learning on knowledge showed a small, statistically non-significant difference in favour of digital education (SMD = 0.23, 95% CI −0.12, 0.59; I² = 34%, n = 3, moderate quality of evidence). Three studies measured participants’ skills and reported mixed results. Of four studies measuring satisfaction, three studies favoured digital education over traditional learning. Of nine studies evaluating healthcare professionals’ behaviour change, only one study comparing email-delivered, spaced education intervention to no intervention reported improvement in the intervention group. Of three studies reporting patient outcomes, only one study comparing email-delivered, spaced education games to non-interactive online resources reported modest improvement in the intervention group. The quality of evidence for outcomes other than knowledge was mostly judged as low due to risk of bias, imprecision and/or inconsistency.
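
          The pooled estimates above (for example, SMD = 0.85, 95% CI 0.16, 1.54; I² = 83%) come from random-effects meta-analysis of study-level standardized mean differences. The sketch below illustrates one common way such figures are computed, using the DerSimonian-Laird estimator; the study values are hypothetical placeholders, not data from the trials in this review, and the review itself may well have used dedicated meta-analysis software rather than a hand-rolled calculation like this.

import math

# Hypothetical (illustrative) per-study values: (SMD, standard error).
# These are NOT the data from the trials included in this review.
studies = [(1.20, 0.30), (0.35, 0.25), (1.05, 0.28)]

smds = [d for d, _ in studies]
w_fixed = [1.0 / se ** 2 for _, se in studies]  # inverse-variance weights
pooled_fixed = sum(w * d for w, d in zip(w_fixed, smds)) / sum(w_fixed)

# Heterogeneity: Cochran's Q and the I^2 statistic reported alongside pooled SMDs.
q = sum(w * (d - pooled_fixed) ** 2 for w, d in zip(w_fixed, smds))
df = len(studies) - 1
i_squared = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0

# Between-study variance tau^2 (DerSimonian-Laird estimator).
c = sum(w_fixed) - sum(w ** 2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0

# Random-effects pooling: weights incorporate tau^2.
w_re = [1.0 / (se ** 2 + tau2) for _, se in studies]
pooled_re = sum(w * d for w, d in zip(w_re, smds)) / sum(w_re)
se_re = math.sqrt(1.0 / sum(w_re))
ci_low, ci_high = pooled_re - 1.96 * se_re, pooled_re + 1.96 * se_re

print(f"Pooled SMD = {pooled_re:.2f}, 95% CI {ci_low:.2f} to {ci_high:.2f}, I^2 = {i_squared:.0f}%")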

          Conclusions

          Health professions digital education on clinical practice guidelines is at least as effective as traditional learning and more effective than no intervention in terms of knowledge. Most studies report little or no difference in healthcare professionals’ behaviours and patient outcomes. The only intervention shown to improve healthcare professionals’ behaviour and, modestly, patient outcomes was email-delivered, spaced education. Future research should evaluate interactive, simulation-based and spaced forms of digital education and report on outcomes such as skills, behaviour, patient outcomes and cost.

          Electronic supplementary material

          The online version of this article (10.1186/s12916-019-1370-1) contains supplementary material, which is available to authorized users.


          Most cited references (30)


          Technology-enhanced simulation for health professions education: a systematic review and meta-analysis.

          Although technology-enhanced simulation has widespread appeal, its effectiveness remains uncertain. A comprehensive synthesis of evidence may inform the use of simulation in health professions education. To summarize the outcomes of technology-enhanced simulation training for health professions learners in comparison with no intervention. Systematic search of MEDLINE, EMBASE, CINAHL, ERIC, PsychINFO, Scopus, key journals, and previous review bibliographies through May 2011. Original research in any language evaluating simulation compared with no intervention for training practicing and student physicians, nurses, dentists, and other health care professionals. Reviewers working in duplicate evaluated quality and abstracted information on learners, instructional design (curricular integration, distributing training over multiple days, feedback, mastery learning, and repetitive practice), and outcomes. We coded skills (performance in a test setting) separately for time, process, and product measures, and similarly classified patient care behaviors. From a pool of 10,903 articles, we identified 609 eligible studies enrolling 35,226 trainees. Of these, 137 were randomized studies, 67 were nonrandomized studies with 2 or more groups, and 405 used a single-group pretest-posttest design. We pooled effect sizes using random effects. Heterogeneity was large (I(2)>50%) in all main analyses. In comparison with no intervention, pooled effect sizes were 1.20 (95% CI, 1.04-1.35) for knowledge outcomes (n = 118 studies), 1.14 (95% CI, 1.03-1.25) for time skills (n = 210), 1.09 (95% CI, 1.03-1.16) for process skills (n = 426), 1.18 (95% CI, 0.98-1.37) for product skills (n = 54), 0.79 (95% CI, 0.47-1.10) for time behaviors (n = 20), 0.81 (95% CI, 0.66-0.96) for other behaviors (n = 50), and 0.50 (95% CI, 0.34-0.66) for direct effects on patients (n = 32). Subgroup analyses revealed no consistent statistically significant interactions between simulation training and instructional design features or study quality. In comparison with no intervention, technology-enhanced simulation training in health professions education is consistently associated with large effects for outcomes of knowledge, skills, and behaviors and moderate effects for patient-related outcomes.

            Comparison of the instructional efficacy of Internet-based CME with live interactive CME workshops: a randomized controlled trial.

            Despite evidence that a variety of continuing medical education (CME) techniques can foster physician behavioral change, there have been no randomized trials comparing performance outcomes for physicians participating in Internet-based CME with physicians participating in a live CME intervention using approaches documented to be effective. To determine if Internet-based CME can produce changes comparable to those produced via live, small-group, interactive CME with respect to physician knowledge and behaviors that have an impact on patient care. Randomized controlled trial conducted from August 2001 to July 2002. Participants were 97 primary care physicians drawn from 21 practice sites in Houston, Tex, including 7 community health centers and 14 private group practices. A control group of 18 physicians from these same sites received no intervention. Physicians were randomly assigned to an Internet-based CME intervention that could be completed in multiple sessions over 2 weeks, or to a single live, small-group, interactive CME workshop. Both incorporated similar multifaceted instructional approaches demonstrated to be effective in live settings. Content was based on the National Institutes of Health National Cholesterol Education Program--Adult Treatment Panel III guidelines. Knowledge was assessed immediately before the intervention, immediately after the intervention, and 12 weeks later. The percentage of high-risk patients who had appropriate lipid panel screening and pharmacotherapeutic treatment according to guidelines was documented with chart audits conducted over a 5-month period before intervention and a 5-month period after intervention. Both interventions produced similar and significant immediate and 12-week knowledge gains, representing large increases in percentage of items correct (pretest to posttest: 31.0% [95% confidence interval {CI}, 27.0%-35.0%]; pretest to 12 weeks: 36.4% [95% CI, 32.2%-40.6%]). Rates of appropriate screening were already high in all groups before the intervention (> or = 93%), with no significant postintervention change. However, the Internet-based intervention was associated with a significant increase in the percentage of high-risk patients treated with pharmacotherapeutics according to guidelines (preintervention, 85.3%; postintervention, 90.3%; P = .04). Appropriately designed, evidence-based online CME can produce objectively measured changes in behavior as well as sustained gains in knowledge that are comparable or superior to those realized from effective live activities.

              Effects of continuing medical education on improving physician clinical care and patient health: a review of systematic reviews.

              The objective of continuing medical education (CME) is to help physicians keep abreast of advances in patient care, to accept new more-beneficial care, and discontinue use of existing lower-benefit diagnostic and therapeutic interventions. The goal of this review was to examine the effectiveness of current CME tools and techniques in changing physician clinical practices and improving patient health outcomes. Results of published systematic reviews were examined to determine the spectrum from most- to least-effective CME techniques. We searched multiple databases, from 1 January 1984 to 30 October 2004, for English-language, peer-reviewed meta-analyses and other systematic reviews of CME programs that alter physician behavior and/or patient outcomes. Twenty-six reviews met inclusion criteria, that is, were either formal meta-analyses or other systematic reviews. Interactive techniques (audit/feedback, academic detailing/outreach, and reminders) are the most effective at simultaneously changing physician care and patient outcomes. Clinical practice guidelines and opinion leaders are less effective. Didactic presentations and distributing printed information only have little or no beneficial effect in changing physician practice. Even though the most-effective CME techniques have been proven, use of least-effective ones predominates. Such use of ineffective CME likely reduces patient care quality and raises costs for all, the worst of both worlds.

                Author and article information

                Contributors
                lorainne.tudor.car@ntu.edu.sg
                Journal
                BMC Medicine (BMC Med), BioMed Central, London
                ISSN: 1741-7015
                Published: 18 July 2019
                Volume: 17
                Article number: 139
                Affiliations
                [1] Family Medicine and Primary Care, Lee Kong Chian School of Medicine, Nanyang Technological University Singapore, 11 Mandalay Road, Level 18, Clinical Science Building, Singapore 308232, Singapore
                [2] Department of Primary Care and Public Health, School of Public Health, Imperial College London, London, UK
                [3] Lee Kong Chian School of Medicine, Nanyang Technological University Singapore, Singapore, Singapore
                [4] Medical Education Research Unit, Lee Kong Chian School of Medicine, Nanyang Technological University Singapore, Singapore, Singapore
                Author information
                ORCID: http://orcid.org/0000-0001-8414-7664
                Article
                DOI: 10.1186/s12916-019-1370-1
                PMCID: PMC6637541
                PMID: 31315642
                © The Author(s). 2019

                Open AccessThis article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

                History
                Received: 7 November 2018
                Accepted: 17 June 2019
                Funding
                Funded by: Lee Kong Chian School of Medicine, Nanyang Technological University (FundRef: http://dx.doi.org/10.13039/501100011738)
                Award: Start-Up Grant
                Categories
                Research Article

                Medicine
                clinical practice guidelines, health professions education, systematic review, digital education
