
      How to design and apply an Objective Structured Clinical Examination (OSCE) in medical education?

      review-article


          Abstract

The Objective Structured Clinical Examination (OSCE) is considered a gold-standard summative and formative assessment method: a comprehensive, standardized tool for assessing clinical competencies, including the psychomotor domain, attitudes, and behaviors that will be manifested in the real work of medical graduates. The implementation of an OSCE therefore depends on the design of a blueprint consisting of two axes: the first axis lists the competencies to be tested according to the learning objectives, while the second axis represents the systems or problems related to those competencies. The OSCE blueprint is thus a translation of the learning objectives into clinical competencies such as history taking, physical examination, interpretation of radiographic and laboratory data, technical skills, attitudinal behaviors, and counseling skills. In addition, utility-index analysis indicates that the OSCE achieves a good balance of acceptability, reliability, validity, credibility, feasibility, cost, and educational impact. However, using the OSCE for student assessment is expensive and demanding, because it requires many facilities, a large number of personnel, and considerable time to administer.
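A minimal sketch in Python of the two-axis blueprint described above, with competencies on one axis and systems/problems on the other; the competency names, systems, and station allocation are hypothetical illustrations, not taken from the article:

# Hypothetical OSCE blueprint: rows are the tested competencies, columns are
# the systems/problems; each cell counts the stations assessing that pair.
competencies = ["History taking", "Physical examination",
                "Data interpretation", "Technical skills", "Counseling"]
systems = ["Cardiovascular", "Respiratory", "Gastrointestinal"]

blueprint = {c: {s: 0 for s in systems} for c in competencies}

# Illustrative allocation of six stations across the grid.
for competency, system in [
    ("History taking", "Cardiovascular"),
    ("Physical examination", "Respiratory"),
    ("Data interpretation", "Gastrointestinal"),
    ("Technical skills", "Cardiovascular"),
    ("Counseling", "Respiratory"),
    ("History taking", "Gastrointestinal"),
]:
    blueprint[competency][system] += 1

# Sanity check: every competency and every system is sampled at least once,
# so the examination maps back onto the stated learning objectives.
assert all(any(blueprint[c].values()) for c in competencies)
assert all(any(blueprint[c][s] for c in competencies) for s in systems)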


Most cited references (20)


          Accuracy of physician self-assessment compared with observed measures of competence: a systematic review.

          Core physician activities of lifelong learning, continuing medical education credit, relicensure, specialty recertification, and clinical competence are linked to the abilities of physicians to assess their own learning needs and choose educational activities that meet these needs. To determine how accurately physicians self-assess compared with external observations of their competence. The electronic databases MEDLINE (1966-July 2006), EMBASE (1980-July 2006), CINAHL (1982-July 2006), PsycINFO (1967-July 2006), the Research and Development Resource Base in CME (1978-July 2006), and proprietary search engines were searched using terms related to self-directed learning, self-assessment, and self-reflection. Studies were included if they compared physicians' self-rated assessments with external observations, used quantifiable and replicable measures, included a study population of at least 50% practicing physicians, residents, or similar health professionals, and were conducted in the United Kingdom, Canada, United States, Australia, or New Zealand. Studies were excluded if they were comparisons of self-reports, studies of medical students, assessed physician beliefs about patient status, described the development of self-assessment measures, or were self-assessment programs of specialty societies. Studies conducted in the context of an educational or quality improvement intervention were included only if comparative data were obtained before the intervention. Study population, content area and self-assessment domain of the study, methods used to measure the self-assessment of study participants and those used to measure their competence or performance, existence and use of statistical tests, study outcomes, and explanatory comparative data were extracted. The search yielded 725 articles, of which 17 met all inclusion criteria. The studies included a wide range of domains, comparisons, measures, and methodological rigor. Of the 20 comparisons between self- and external assessment, 13 demonstrated little, no, or an inverse relationship and 7 demonstrated positive associations. A number of studies found the worst accuracy in self-assessment among physicians who were the least skilled and those who were the most confident. These results are consistent with those found in other professions. While suboptimal in quality, the preponderance of evidence suggests that physicians have a limited ability to accurately self-assess. The processes currently used to undertake professional development and evaluate competence may need to focus more on external assessment.

            A systematic review of the reliability of objective structured clinical examination scores.

            The objective structured clinical examination (OSCE) is comprised of a series of simulations used to assess the skill of medical practitioners in the diagnosis and treatment of patients. It is often used in high-stakes examinations and therefore it is important to assess its reliability and validity. The published literature was searched (PsycINFO, PubMed) for OSCE reliability estimates (coefficient alpha and generalisability coefficients) computed either across stations or across items within stations. Coders independently recorded information about each study. A meta-analysis of the available literature was computed and sources of systematic variance in estimates were examined. A total of 188 alpha values from 39 studies were coded. The overall (summary) alpha across stations was 0.66 (95% confidence interval [CI] 0.62-0.70); the overall alpha within stations across items was 0.78 (95% CI 0.73-0.82). Better than average reliability was associated with a greater number of stations and a higher number of examiners per station. Interpersonal skills were evaluated less reliably across stations and more reliably within stations compared with clinical skills. Overall scores on the OSCE are often not very reliable. It is more difficult to reliably assess communication skills than clinical skills when considering both as general traits that should apply across situations. It is generally helpful to use two examiners and large numbers of stations, but some OSCEs appear more reliable than others for reasons that are not yet fully understood. © Blackwell Publishing Ltd 2011.
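For readers unfamiliar with the statistic, the coefficient alpha summarized in this meta-analysis is the standard internal-consistency estimate; the general formula below is background material, not reproduced from the article:

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^{2}}{\sigma_X^{2}}\right)
\]

where $k$ is the number of stations (or items), $\sigma_i^{2}$ is the variance of scores on station $i$, and $\sigma_X^{2}$ is the variance of the total score.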

              The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part II: organisation & administration.

The organisation, administration and running of a successful OSCE programme need considerable knowledge, experience and planning. The different teams looking after the various aspects of the OSCE need to work collaboratively for effective question bank development, examiner training and standardised patient training. Quality assurance is an ongoing process taking place throughout the OSCE cycle. In order for the OSCE to generate reliable results it is essential to pay attention to each and every element of quality assurance, as poorly standardised patients, untrained examiners, poor-quality questions and inappropriate scoring rubrics will each affect the reliability of the OSCE. The validity will also be compromised if the questions are not realistic and not mapped against the learning outcomes of the teaching programme. This part of the Guide addresses all these important issues in order to help the reader set up and quality-assure their new or existing OSCE programmes.

                Author and article information

Journal
Iberoamerican Journal of Medicine (Iberoam J Med)
Publisher: Hospital San Pedro (Logroño, La Rioja, Spain)
ISSN: 2695-5075
2021; 3(1): 51-55
                Affiliations
[1] Department of Forensic Medicine and Clinical Toxicology, College of Medicine, Suez Canal University, Ismailia, Egypt
[2] College of Medicine, Taif University, Taif, Saudi Arabia
                Article
SciELO ID: S2695-50752021000100009
DOI: 10.5281/zenodo.4247763

                This work is licensed under a Creative Commons Attribution 4.0 International License.

History
25 October 2020; 05 November 2020
                Page count
                Figures: 0, Tables: 0, Equations: 0, References: 20, Pages: 5
                Product

                SciELO Spain

                Categories
                Review

Keywords
Medical education, Clinical competence, Objective structured clinical examination, Educational assessment
