
      Using Modified Direct Observation of Procedural Skills (DOPS) to assess undergraduate medical students

      research-article


          Abstract

          Introduction:

With the shift toward competency-based curricula, selecting an appropriate assessment method has become essential. This study aimed to investigate the application of Direct Observation of Procedural Skills (DOPS) in undergraduate medical students.

          Methods:

This cross-sectional study was conducted during the emergency ward rotation of final-year medical students, recruited by census sampling. Each student performed 2 procedures, each at least twice, under the simultaneous observation of 2 assessors using modified DOPS rating scales designed for each procedure. The correlation between DOPS scores and final routine exam scores was measured. Face and content validity were determined by a panel of experts. Reliability was examined through test-retest and inter-rater analyses, along with the correlation of each procedure's score with the total score. The time spent on the test was also recorded. Statistical analysis was carried out using SPSS version 18.
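The reliability checks described here (test-retest agreement between a student's first and second attempts, and inter-rater agreement between the two assessors) both reduce to a Pearson correlation between paired score lists. A minimal sketch with invented DOPS scores (the data and variable names are illustrative only, not the study's data):

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented DOPS scores for illustration only.
first_attempt  = [62, 70, 75, 80, 55, 68, 90, 73]
second_attempt = [65, 72, 78, 83, 58, 70, 88, 75]   # test-retest pair
rater_a = [60, 71, 74, 81, 57, 66, 89, 72]
rater_b = [63, 69, 77, 80, 55, 68, 91, 74]          # inter-rater pair

print(f"test-retest r = {pearson(first_attempt, second_attempt):.2f}")
print(f"inter-rater r = {pearson(rater_a, rater_b):.2f}")
```

In a real analysis the same computation is available as `scipy.stats.pearsonr`, which also returns the p-value reported in the Results.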

          Results:

In total, 60 students performed 240 procedures under DOPS observation. Face and content validity were confirmed by the expert panel. There was a significant correlation between the score of each procedure and the total DOPS score (r1=0.736, r2=0.793, r3=0.564, r4=0.685; p<0.001), between the first and second scores for the same procedure (Pearson correlation=0.74, p<0.001), and between the scores of the two individual examiners observing the same procedure (Pearson correlation=0.84-0.94, p<0.001). There was no correlation between DOPS scores and final routine ward exam scores (Pearson correlation=0.018, p=0.89). The average times for performing the DOPS test and for providing feedback were 11.1±7 and 9.2±4.5 minutes, respectively.

          Conclusion:

The use of novel performance assessment methods such as DOPS is highly beneficial for ensuring the adequacy of learning in medical students and for assessing their readiness to accept professional responsibilities. DOPS, as a practical and reliable test with acceptable validity, can be used to assess the clinical skills of undergraduate medical students.

          Related collections

Most cited references (28)


          Assessing professional competence: from methods to programmes.

          We use a utility model to illustrate that, firstly, selecting an assessment method involves context-dependent compromises, and secondly, that assessment is not a measurement problem but an instructional design problem, comprising educational, implementation and resource aspects. In the model, assessment characteristics are differently weighted depending on the purpose and context of the assessment. Of the characteristics in the model, we focus on reliability, validity and educational impact and argue that they are not inherent qualities of any instrument. Reliability depends not on structuring or standardisation but on sampling. Key issues concerning validity are authenticity and integration of competencies. Assessment in medical education addresses complex competencies and thus requires quantitative and qualitative information from different sources as well as professional judgement. Adequate sampling across judges, instruments and contexts can ensure both validity and reliability. Despite recognition that assessment drives learning, this relationship has been little researched, possibly because of its strong context dependence. When assessment should stimulate learning and requires adequate sampling, in authentic contexts, of the performance of complex competencies that cannot be broken down into simple parts, we need to make a shift from individual methods to an integral programme, intertwined with the education programme. Therefore, we need an instructional design perspective. Programmatic instructional design hinges on a careful description and motivation of choices, whose effectiveness should be measured against the intended outcomes. We should not evaluate individual methods, but provide evidence of the utility of the assessment programme as a whole.

            Workplace-based assessment as an educational tool: AMEE Guide No. 31.

            There has been concern that trainees are seldom observed, assessed, and given feedback during their workplace-based education. This has led to an increasing interest in a variety of formative assessment methods that require observation and offer the opportunity for feedback. To review some of the literature on the efficacy and prevalence of formative feedback, describe the common formative assessment methods, characterize the nature of feedback, examine the effect of faculty development on its quality, and summarize the challenges still faced. The research literature on formative assessment and feedback suggests that it is a powerful means for changing the behaviour of trainees. Several methods for assessing it have been developed and there is preliminary evidence of their reliability and validity. A variety of factors enhance the efficacy of workplace-based assessment including the provision of feedback that is consistent with the needs of the learner and focused on important aspects of the performance. Faculty plays a critical role and successful implementation requires that they receive training. There is a need for formative assessment which offers trainees the opportunity for feedback. Several good methods exist and feedback has been shown to have a major influence on learning. The critical role of faculty is highlighted, as is the need for strategies to enhance their participation and training.

              The effect of assessments and examinations on the learning of medical students.

              This paper describes a situation where an alteration in the final-year assessment scheme led to changes in student learning activities which were the exact opposite of those intended. Students were seen to be spending a disproportionate amount of time studying the theoretical components of the course relative to the practical and clinical aspects. The paramount importance of the assessments and examinations in influencing student learning behaviour led the departments concerned to develop a new clinical examination which more clearly reflected the objectives of the course. A questionnaire survey was undertaken to determine how the different sections of the final assessment affected the students' approach to studying. The questionnaire was administered to graduates during their intern year for the 3 years following the introduction of the new clinical examination. Results were also obtained for the year preceding the change. The survey showed that the students developed a high regard for the new examination and its validity as a test of clinical competence. The students found that an increase in ward-based learning activities was essential for success in the final examinations. The new clinical examination has thus influenced students' learning and successfully restored the balance of their learning activities between the clinical and theoretical components of the course.

                Author and article information

                Journal
J Adv Med Educ Prof
Journal of Advances in Medical Education & Professionalism (Iran)
ISSN: 2322-2220, 2322-3561
July 2018
Volume 6, Issue 3, Pages 130-136
                Affiliations
                [1 ]School of Medical Education, Shahid Beheshti University of Medical Sciences, Tehran, Iran
                [2 ]Clinical Education Research Center, Shiraz University of Medical Sciences, Shiraz, Iran
                [3 ]Department of Emergency Medicine, Mashhad University of Medical Sciences, Mashhad, Iran
                [4 ]Department of Community Medicine, Mashhad Branch, Islamic Azad University, Mashhad, Iran
                Author notes
Correspondence: Mitra Amini, Clinical Education Research Center, Shiraz University of Medical Sciences, Shiraz, Iran. Tel: (+98)-713-2333064
                Article
                JAMP-6-3
                6039823
                30013997
                d6a4e2ac-7df3-4bfb-a49b-c37fa288e2a1
                Copyright: © Journal of Advances in Medical Education & Professionalism

                This is an open-access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

History
: 10 November 2017
: 3 June 2018
                Categories
                Original Article

Keywords: reliability, validity, feasibility, satisfaction, undergraduate, medical student
