
      The implementation and evaluation of an e-Learning training module for objective structured clinical examination raters in Canada


          Abstract

          Improving the reliability and consistency of objective structured clinical examination (OSCE) raters’ marking poses a continual challenge in medical education. The purpose of this study was to evaluate an e-Learning training module for OSCE raters who participated in the assessment of third-year medical students at the University of Ottawa, Canada. The effects of online training were compared with those of traditional in-person (face-to-face) orientation. Of the 90 physicians recruited as raters for this OSCE, 60 (66.7%) consented to participate in the study in March 2017. Of the 60 participants, 55 rated students during the OSCE, while the remaining 5 were back-up raters. The online training group comprised 41 raters and the traditional in-person training group 19. Of the raters with prior OSCE experience in the online group (n = 18), 13 (72%) reported that they preferred this format to the in-person orientation. The average total time needed to complete the online module was 15 minutes. Furthermore, 89% of the participants felt that the module provided clarity in the rater training process. There was no significant difference in the number of missing ratings based on the type of orientation that raters received. Our study indicates that online OSCE rater training is comparable to traditional face-to-face orientation.


          Most cited references (4)


          The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part II: organisation & administration.

            The organisation, administration and running of a successful OSCE programme need considerable knowledge, experience and planning. Different teams looking after various aspects of the OSCE need to work collaboratively for effective question bank development, examiner training and standardised patient training. Quality assurance is an ongoing process taking place throughout the OSCE cycle. In order for the OSCE to generate reliable results, it is essential to pay attention to each and every element of quality assurance, as poorly standardised patients, untrained examiners, poor-quality questions and inappropriate scoring rubrics will each affect the reliability of the OSCE. The validity will also be compromised if the questions are not realistic and mapped against the learning outcomes of the teaching programme. This part of the Guide addresses all these important issues in order to help the reader set up and quality-assure their new or existing OSCE programmes.

            Rater training to support high-stakes simulation-based assessments.

            Competency-based assessment and an emphasis on obtaining higher-level outcomes that reflect physicians' ability to demonstrate their skills has created a need for more advanced assessment practices. Simulation-based assessments provide medical education planners with tools to better evaluate the 6 Accreditation Council for Graduate Medical Education (ACGME) and American Board of Medical Specialties (ABMS) core competencies by affording physicians opportunities to demonstrate their skills within a standardized and replicable testing environment, thus filling a gap in the current state of assessment for regulating the practice of medicine. Observational performance assessments derived from simulated clinical tasks and scenarios enable stronger inferences about the skill level a physician may possess, but also introduce the potential of rater errors into the assessment process. This article reviews the use of simulation-based assessments for certification, credentialing, initial licensure, and relicensing decisions and describes rater training strategies that may be used to reduce rater errors, increase rating accuracy, and enhance the validity of simulation-based observational performance assessments.

              The overall impact of testing on medical student learning: quantitative estimation of consequential validity.

              Given medical education’s longstanding emphasis on assessment, it seems prudent to evaluate whether our current research and development focus on testing makes sense. Since any intervention within medical education must ultimately be evaluated based upon its impact on student learning, this report seeks to provide a quantitative accounting of the learning gains attained through educational assessments. To approach this question, we estimate achieved learning within a medical school environment that optimally utilizes educational assessments. We compare this estimate to the learning that might be expected in a medical school that employs no educational assessments. Effect sizes are used to estimate testing’s total impact on learning by summarizing three effects: the direct effect, the indirect effect, and the selection effect. The literature is far from complete, but the available evidence strongly suggests that each of these effects is large and that the net cumulative impact on learning in medical education is over two standard deviations. While additional evidence is required, the current literature shows that testing within medical education makes a strong positive contribution to learning.

                Author and article information

                Contributors
                Role: Editor
                Journal
                J Educ Eval Health Prof
                JEEHP
                Journal of Educational Evaluation for Health Professions
                Korea Health Personnel Licensing Examination Institute
                ISSN: 1975-5937
                Published: 6 August 2018
                Volume: 15
                Article: 18
                Affiliations
                [1 ]Division of Hematology, The Ottawa Hospital, Ottawa, ON, Canada
                [2 ]Division of General Internal Medicine, The Ottawa Hospital, Ottawa, ON, Canada
                [3 ]Department of Family Medicine, University of Ottawa, Ottawa, ON, Canada
                Hallym University, Korea
                Author notes
                [* ]Corresponding email: kkhamisa@toh.ca
                Author information
                http://orcid.org/0000-0002-0557-2898
                http://orcid.org/0000-0002-5474-9696
                http://orcid.org/0000-0002-9182-0703
                http://orcid.org/0000-0002-7863-9387
                http://orcid.org/0000-0003-4076-9669
                Article
                jeehp-15-18
                DOI: 10.3352/jeehp.2018.15.18
                PMCID: PMC6194479
                PMID: 30078286
                © 2018, Korea Health Personnel Licensing Examination Institute

                This is an open-access article distributed under the terms of the Creative Commons Attribution License <http://creativecommons.org/licenses/by/4.0/>, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

                History
                Received: 3 May 2018
                Accepted: 6 August 2018
                Categories
                Brief Report

                Assessment, Evaluation & Research methods
                Keywords: educational assessment, undergraduate medical education, clinical clerkship
