
      What Are the Effects of Teaching Evidence-Based Health Care (EBHC)? Overview of Systematic Reviews


          Abstract

          Background

          An evidence-based approach to health care is recognized internationally as a key competency for healthcare practitioners. This overview systematically evaluated and organized evidence from systematic reviews on teaching evidence-based health care (EBHC).

          Methods/Findings

          We searched for systematic reviews evaluating interventions for teaching EBHC to health professionals compared to no intervention or different strategies. Outcomes covered EBHC knowledge, skills, attitudes, practices and health outcomes. Comprehensive searches were conducted in April 2013. Two reviewers independently selected eligible reviews, extracted data and evaluated methodological quality. We included 16 systematic reviews, published between 1993 and 2013. There was considerable overlap across reviews: the 171 source studies included in the reviews related to 81 separate studies, 37 of which appeared in more than one review. Studies used various methodologies to evaluate educational interventions of varying content, format and duration in undergraduates, interns, residents and practicing health professionals. The evidence in the reviews showed that multifaceted, clinically integrated interventions, with assessment, led to improvements in knowledge, skills and attitudes. Interventions improved critical appraisal skills and integration of results into decisions, and improved knowledge, skills, attitudes and behaviour amongst practicing health professionals. Considering single interventions, EBHC knowledge and attitudes were similar for lecture-based versus online teaching. Journal clubs appeared to increase clinical epidemiology and biostatistics knowledge and reading behaviour, but not appraisal skills. EBHC courses improved appraisal skills and knowledge. Amongst practicing health professionals, interactive online courses with guided critical appraisal showed a significant increase in knowledge and appraisal skills. A short workshop using problem-based approaches, compared to no intervention, increased knowledge but not appraisal skills.

          Conclusions

          EBHC teaching and learning strategies should focus on implementing multifaceted, clinically integrated approaches with assessment. Future rigorous research should evaluate minimum components for multifaceted interventions, assessment of medium to long-term outcomes, and implementation of these interventions.

          Related collections

          Most cited references (29)


          Instruments for evaluating education in evidence-based practice: a systematic review.

          Evidence-based practice (EBP) is the integration of the best research evidence with patients' values and clinical circumstances in clinical decision making. Teaching of EBP should be evaluated and guided by evidence of its own effectiveness. The aim was to appraise, summarize, and describe currently available EBP teaching evaluation instruments. We searched the MEDLINE, EMBASE, CINAHL, HAPI, and ERIC databases; reference lists of retrieved articles; EBP Internet sites; and 8 education journals from 1980 through April 2006. For inclusion, studies had to report an instrument evaluating EBP, contain sufficient description to permit analysis, and present quantitative results of administering the instrument. Two raters independently abstracted information on the development, format, learner levels, evaluation domains, feasibility, reliability, and validity of the EBP evaluation instruments from each article. We defined 3 levels of instruments based on the type, extent, methods, and results of psychometric testing and suitability for different evaluation purposes. Of 347 articles identified, 115 were included, representing 104 unique instruments. The instruments were most commonly administered to medical students and postgraduate trainees and evaluated EBP skills. Among EBP skills, acquiring evidence and appraising evidence were most commonly evaluated, but newer instruments evaluated asking answerable questions and applying evidence to individual patients. Most behavior instruments measured the performance of EBP steps in practice, but newer instruments documented the performance of evidence-based clinical maneuvers or patient-level outcomes. At least 1 type of validity evidence was demonstrated for 53% of instruments, but 3 or more types of validity evidence were established for only 10%. High-quality instruments were identified for evaluating the EBP competence of individual trainees, determining the effectiveness of EBP curricula, and assessing EBP behaviors with objective outcome measures. Instruments with reasonable validity are available for evaluating some domains of EBP and may be targeted to different evaluation needs. Further development and testing is required to evaluate EBP attitudes, behaviors, and more recently articulated EBP skills.

            A hierarchy of effective teaching and learning to acquire competence in evidence-based medicine

            Background

            A variety of methods exists for teaching and learning evidence-based medicine (EBM). However, there is much debate about the effectiveness of various EBM teaching and learning activities, resulting in a lack of consensus as to what methods constitute the best educational practice. There is a need for a clear hierarchy of educational activities to effectively impart and acquire competence in EBM skills. This paper develops such a hierarchy based on current empirical and theoretical evidence.

            Discussion

            EBM requires that health care decisions be based on the best available valid and relevant evidence. To achieve this, teachers delivering EBM curricula need to inculcate amongst learners the skills to gain, assess, apply, integrate and communicate new knowledge in clinical decision-making. Empirical and theoretical evidence suggests that there is a hierarchy of teaching and learning activities in terms of their educational effectiveness: Level 1, interactive and clinically integrated activities; Level 2(a), interactive but classroom based activities; Level 2(b), didactic but clinically integrated activities; and Level 3, didactic, classroom or standalone teaching.

            Summary

            All health care professionals need to understand and implement the principles of EBM to improve care of their patients. Interactive and clinically integrated teaching and learning activities provide the basis for the best educational practice in this field.

              How to run an effective journal club: a systematic review.

              Health-based journal clubs have been in place for over 100 years. Participants meet regularly to critique research articles, to improve their understanding of research design, statistics and critical appraisal. However, there is no standard process for conducting an effective journal club. We conducted a systematic literature review to identify core processes of a successful health journal club. We searched a range of library databases using established keywords. All research designs were initially considered to establish the body of evidence. Experimental or comparative papers were then critically appraised for methodological quality and information was extracted on effective journal club processes. We identified 101 articles, of which 21 comprised the body of evidence. Of these, 12 described journal club effectiveness. Methodological quality was moderate. The papers described many processes of effective journal clubs. Over 80% of papers reported that the journal club intervention was effective in improving knowledge and critical appraisal skills. Few papers reported on the psychometric properties of their outcome instruments. No paper reported on the translation of evidence from journal clubs into clinical practice. Characteristics of successful journal clubs included regular and anticipated meetings, mandatory attendance, clear long- and short-term purpose, appropriate meeting timing and incentives, a trained journal club leader to choose papers and lead discussion, circulating papers prior to the meeting, using the internet for wider dissemination and data storage, using established critical appraisal processes and summarizing journal club findings.

                Author and article information

                Contributors
                Role: Editor
                Journal
                PLoS ONE
                Public Library of Science (San Francisco, USA)
                ISSN: 1932-6203
                28 January 2014
                9(1): e86706
                Affiliations
                [1 ]Centre for Evidence-based Health Care, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, South Africa
                [2 ]South African Cochrane Centre, South African Medical Research Council, Cape Town, South Africa
                [3 ]Community Health, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, South Africa
                [4 ]All Ireland Hub for Trials Methodology Research, Queen’s University Belfast, Belfast, Northern Ireland
                University of York, United Kingdom
                Author notes

                Competing Interests: The authors have declared that no competing interests exist.

                Conceived and designed the experiments: TY AR MC JV. Performed the experiments: TY AR. Analyzed the data: TY AR JV MC. Contributed reagents/materials/analysis tools: TY AR JV MC. Wrote the paper: TY AR MC JV. Developed the protocol: TY. Contributed to the background development: AR. Provided comments on the methods: AR. Screened search outputs: TY AR. Independently extracted data: TY AR. Assessed methodological quality of included systematic reviews: TY AR. Led the write up of the review: TY. Critically engaged and provided input on the results, discussion and conclusions: AR. Provided comments on the protocol for the overview: JV MC. Provided methodological guidance: JV MC. Critically evaluated the manuscript: JV MC.

                Article
                PONE-D-13-39803
                DOI: 10.1371/journal.pone.0086706
                PMCID: 3904944
                PMID: 24489771
                Copyright © 2014

                This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

                History
                Received: 28 September 2013
                Accepted: 9 December 2013
                Page count
                Pages: 13
                Funding
                TY and AR are supported in part by the Effective Health Care Research Consortium, which is funded by UKaid from the UK Government Department for International Development, www.evidence4health.org. This research has been supported in part by the US President’s Emergency Plan for AIDS relief (PEPFAR) through HRSA under the terms of T84HA21652 and via the Stellenbosch University Rural Medical Education Partnership Initiative (SURMEPI). This work is based on the research supported in part by the National Research Foundation of South Africa (UNIQUE GRANT NO 86420). The All Ireland Hub for Trials Methodology Research is supported by the UK Medical Research Council (G0901530), Queen’s University Belfast, the University of Ulster and the Health and Social Care R&D Division of the Public Health Agency of Northern Ireland. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
                Categories
                Research Article
                Medicine
                Clinical Research Design
                Systematic Reviews
                Epidemiology
                Clinical Epidemiology
                Non-Clinical Medicine
                Health Care Policy
                Health Education and Awareness
                Health Systems Strengthening
                Academic Medicine
                Evidence-Based Medicine
                Medical Education
                Nursing Science
                Nursing Education
                Science Policy
                Science Education

