
      How the “Understanding Research Evidence” Web-Based Video Series From the National Collaborating Centre for Methods and Tools Contributes to Public Health Capacity to Practice Evidence-Informed Decision Making: Mixed-Methods Evaluation

      research-article


          Abstract

          Background

The National Collaborating Centre for Methods and Tools (NCCMT) offers workshops and webinars to build public health capacity for evidence-informed decision making. Despite positive feedback on NCCMT workshops and resources, NCCMT users found key terms used in research papers difficult to understand. The Understanding Research Evidence (URE) videos use plain language, cartoon visuals, and public health examples to explain complex research concepts. The videos are posted on the NCCMT website and YouTube channel.

          Objective

          The first four videos in the URE web-based video series, which explained odds ratios (ORs), confidence intervals (CIs), clinical significance, and forest plots, were evaluated. The evaluation examined how the videos affected public health professionals’ practice. A mixed-methods approach was used to examine the delivery mode and the content of the videos. Specifically, the evaluation explored (1) whether the videos were effective at increasing knowledge on the four video topics, (2) whether public health professionals were satisfied with the videos, and (3) how public health professionals applied the knowledge gained from the videos in their work.
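For orientation only, and not drawn from the videos themselves, the first two concepts have standard textbook definitions: for a 2×2 table of exposure versus outcome with cell counts a, b, c, and d, the odds ratio and an approximate 95% CI (Woolf method) are

\[
\mathrm{OR} = \frac{ad}{bc},
\qquad
95\%\ \mathrm{CI} = \exp\!\left(\ln(\mathrm{OR}) \pm 1.96\sqrt{\frac{1}{a}+\frac{1}{b}+\frac{1}{c}+\frac{1}{d}}\right).
\]

A 95% CI for the OR that excludes 1 corresponds to statistical significance at the 5% level.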

          Methods

A three-part evaluation was conducted to determine the effectiveness of the first four URE videos. The evaluation included a Web-based survey, telephone interviews, and pretests and posttests, which assessed public health professionals’ experience with the videos and how the videos affected their public health work. Participants were recruited through various open access public health email lists, through informational flyers and posters at the Canadian Public Health Association (CPHA) conference, and through targeted outreach to NCCMT’s network.

          Results

In the Web-based surveys (n=46), participants who had watched the OR (P=.04), CI (P=.04), and clinical significance (P=.05) videos achieved higher scores on the knowledge assessment questions than participants who had not watched the videos; no such difference was found for the forest plot (P=.12) video. The pretest and posttest (n=124) demonstrated that participants had a better understanding of forest plots (P<.001) and CIs (P<.001) after watching the videos. Because of small sample sizes, there were insufficient pretest and posttest data to conduct meaningful analyses for the clinical significance and OR videos. Telephone interview participants (n=18) thought the videos’ use of animation, narration, and plain language was appropriate for people with different levels of understanding and learning styles. Participants felt that by increasing their understanding of research evidence, they could develop better interventions and design evaluations to measure the impact of public health initiatives.
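The abstract does not report which statistical tests produced the P values above. Purely as an illustrative sketch of one common way paired pretest and posttest knowledge scores can be compared (the Wilcoxon signed-rank test and the data below are assumptions for illustration, not the authors’ analysis):

# Hypothetical pretest/posttest knowledge scores for the same participants;
# the choice of a Wilcoxon signed-rank test is an assumption, not taken from the paper.
from scipy.stats import wilcoxon

pretest = [2, 3, 1, 4, 2, 3, 2, 1, 3, 2]    # scores before watching a video
posttest = [4, 4, 3, 5, 3, 4, 3, 2, 4, 3]   # scores after watching the same video

statistic, p_value = wilcoxon(pretest, posttest)  # paired, non-parametric comparison
print(f"Wilcoxon statistic = {statistic}, P = {p_value:.3f}")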

          Conclusions

Overall, the results of the evaluation showed that watching the videos increased knowledge and that participants had a positive experience with the URE videos. With increased competence in using the best available evidence, professionals are empowered to contribute to decisions that can improve the health outcomes of communities.


                Author and article information

                Contributors
                Journal
                J Med Internet Res
                J. Med. Internet Res
                JMIR
                Journal of Medical Internet Research
JMIR Publications (Toronto, Canada)
ISSN: 1439-4456
eISSN: 1438-8871
September 2017
28 September 2017
Volume: 19
Issue: 9
Article: e286
                Affiliations
[1] National Collaborating Centre for Methods and Tools, School of Nursing, McMaster University, Hamilton, ON, Canada
                Author notes
Corresponding Author: Maureen Dobbins, dobbinsm@mcmaster.ca
                Author information
                http://orcid.org/0000-0002-5672-4898
                http://orcid.org/0000-0002-9420-2169
                http://orcid.org/0000-0002-1968-6765
                Article
v19i9e286
DOI: 10.2196/jmir.6958
PMCID: PMC5639207
PMID: 28958986
                ©Linda Chan, Jeannie Mackintosh, Maureen Dobbins. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 28.09.2017.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.

                History
8 November 2016
3 January 2017
8 March 2017
28 April 2017
                Categories
                Original Paper

                Medicine
public health, public health practice, evidence-based practice, capacity building, continuing education, computer-assisted instruction
