      How does the medical graduates' self-assessment of their clinical competency differ from experts' assessment?

      research-article
Abadel 1 , Hattab 1
      BMC Medical Education
      BioMed Central


          Abstract

          Background

          The assessment of the performance of medical school graduates during their first postgraduate years provides an early indicator of the quality of the undergraduate curriculum and educational process. The objective of this study was to assess the clinical competency of medical graduates, as perceived by the graduates themselves and by the experts.

          Methods

This is a hospital-based cross-sectional study. It covered 105 medical graduates and 63 experts selected by a convenience sampling method. A self-administered questionnaire, constructed on a five-point Likert scale and covering the different areas of clinical competency, was used for data collection. Data processing and analysis were performed using the Statistical Package for the Social Sciences (SPSS) 16.0. The mean, frequency distribution, and percentage of the variables were calculated. The non-parametric Kruskal-Wallis test was applied to determine whether the graduates' and experts' assessments were influenced by graduate variables such as age, gender, experience, type of hospital, specialty and location of work, at a significance level of p ≤ 0.05.
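The analysis described above was run in SPSS 16.0. Purely as an illustration of the kind of comparison reported (a Kruskal-Wallis test across graduate subgroups on five-point Likert ratings), the sketch below uses Python with scipy as a stand-in for SPSS; the age groups and ratings are invented for demonstration and are not the study's data.

# Illustrative sketch only: a Kruskal-Wallis test on hypothetical five-point
# Likert ratings grouped by graduate age group, mirroring the comparison
# described in the Methods. scipy stands in for SPSS; all values are invented.
from scipy.stats import kruskal

ratings_by_age_group = {
    "<= 25 years": [4, 5, 4, 3, 4, 5],
    "26-30 years": [3, 4, 3, 4, 3, 3],
    "> 30 years":  [2, 3, 3, 2, 4, 3],
}

statistic, p_value = kruskal(*ratings_by_age_group.values())
print(f"Kruskal-Wallis H = {statistic:.2f}, p = {p_value:.3f}")
if p_value <= 0.05:
    print("Assessment differs across age groups at the p <= 0.05 level.")
else:
    print("No significant difference across age groups at the p <= 0.05 level.")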

          Results

The overall mean scores for the experts' and graduates' assessments were 3.40 and 3.63, respectively (p = 0.035). Almost 87% of the graduates perceived their competency as good or very good, compared with only 67.7% rated so by the experts. Among female and male graduates, 33.8% and 25%, respectively, rated themselves as very good. More than 19% of the graduates in the age group > 30 years perceived their clinical competency as inadequate, in contrast with only 6.2% of the graduates in the youngest age group. Experts rated 40% of the female graduates as inadequate versus 20% of the males (p = 0.04). More than 40% of the graduates in the younger age group were rated by experts as inadequate, versus 9.7% of those in the older age group, > 30 years (p = 0.03).

          Conclusion

There was a wide discrepancy between the graduates' self-assessment and the experts' assessment, particularly at the level of inadequate performance. Graduates in general, and those in the younger age groups in particular, tend to overestimate their clinical skills and competency.


Most cited references


          Comparison between medical students' experience, confidence and competence.

          This study was undertaken to determine whether or not breadth of clinical experience and student levels of confidence were indicators of competency on standardized simulator performance-based assessments. All students (n=144) attending an educational session were asked to complete a 25-point questionnaire regarding specific clinical experiences and levels of confidence in their ability to manage patient problems. For enumeration of clinical experiences, students were asked to estimate the number of times a situation had been encountered or a skill had been performed. For level of confidence, each response was based on a 5-point Likert scale where 1=novice and 5=expert. Students then participated in a standardized simulated performance test. Median and range were calculated and data analysed using Spearman rank correlations. A P-value <0.05 was considered significant. Level of confidence data were compared to performance during clinical rotation and to marks in the anaesthesia final examination. A total of 144 students attended the session, completed the questionnaire and participated in the standardized test. There were wide ranges of experience and confidence in the 25 listed items. Analysis of data showed good correlation between clinical experience and level of confidence. There was no correlation between clinical experience, level of confidence and performance in a standardized simulation test. Neither was there any correlation between level of confidence and clinical grades or written examination marks. Clinical experience and level of confidence have no predictive value in performance assessments when using standardized anaesthesia simulation scenarios.
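The summary above reports Spearman rank correlations between clinical experience, confidence and performance, with P < 0.05 treated as significant. As a minimal illustration of that statistic only, assuming made-up confidence scores and simulation marks rather than the study's data, a Python/scipy sketch:

# Illustrative sketch only: Spearman rank correlation between hypothetical
# self-rated confidence (1-5 Likert) and simulation performance marks.
# All numbers are invented; the original analysis is described above.
from scipy.stats import spearmanr

confidence = [2, 3, 5, 4, 1, 3, 4, 2, 5, 3]                  # 1 = novice, 5 = expert
simulation_marks = [61, 70, 58, 66, 72, 64, 59, 68, 63, 65]  # performance-test scores

rho, p_value = spearmanr(confidence, simulation_marks)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
# A p-value below 0.05 would be read as a significant correlation;
# the study summarised above found no such correlation.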

            Effects of training in direct observation of medical residents' clinical competence: a randomized trial.

Background: Faculty observation of residents and students performing clinical skills is essential for reliable and valid evaluation of trainees. Objective: To evaluate the efficacy of a new multifaceted method of faculty development called direct observation of competence training. Design: Controlled trial of faculty from 16 internal medicine residency programs using a cluster randomization design. Setting: Academic medical centers. Participants: 40 internal medicine teaching faculty members: 17 in the intervention group and 23 in the control group. Measurements: Changes in faculty comfort performing direct observation, faculty satisfaction with the workshop, and changes in faculty rating behaviors 8 months after completing the training. Intervention: The direct observation of competence workshop combines didactic mini-lectures, interactive small-group and videotape evaluation exercises, and evaluation skill practice with standardized residents and patients. Results: 37 faculty members (16 in the intervention group and 21 in the control group) completed the study. Most of the faculty in the intervention group (14 [88%]) reported that they felt significantly more comfortable performing direct observation compared with control-group faculty (4 [19%]) (P = 0.04), and all intervention faculty rated the training as outstanding. For 9 videotaped clinical encounters, intervention-group faculty were more stringent than controls in their evaluations of medical interviewing, physical examination, and counseling; differences in ratings for medical interviewing and physical examination remained statistically significant even after adjustment for baseline rating behavior. Limitations: The study involved a limited number of residency programs, and faculty did not rate the performance of actual residents. Conclusions: Direct observation of competence training, a new multifaceted approach to faculty development, leads to meaningful changes in rating behaviors and in faculty comfort with evaluation of clinical skills.

              Measuring self-assessment: current state of the art.

              The competent physician pursues lifelong learning through the recognition of deficiencies and the formulation of appropriate learning goals. Despite the accepted theoretical value of self-assessment, studies have consistently shown that the accuracy of self-assessment is poor. This paper examines the methodological issues that plague the measurement of self-assessment ability and presents several strategies that address these methodological problems within the current paradigm. In addition, the article proposes an alternative conceptualization of self-assessment and describes its associated methods. The conclusions of prior research in this domain must be re-examined in light of the common pitfalls encountered in the design of the studies and the analyses of the data. Future efforts to elucidate self-assessment phenomena need to consider the implications of this review.

                Author and article information

Journal: BMC Medical Education (BMC Med Educ)
Publisher: BioMed Central
ISSN: 1472-6920
Published: 13 February 2013
Volume: 13
Article number: 24
                Affiliations
[1] Community Medicine and Public Health Department, Faculty of Medicine and Health Sciences, University of Aden, Aden, Yemen
Article
Article ID: 1472-6920-13-24
DOI: 10.1186/1472-6920-13-24
PMC: 3576227
PMID: 23402221
Record ID: 1aad4d8e-810d-4b2e-bc5c-79e91ee352a5
Copyright ©2013 Abadel and Hattab; licensee BioMed Central Ltd.

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

History
Received: 25 September 2012
Accepted: 7 February 2013
                Categories
                Research Article

                Education
