
      Reliability of Operative Skill Evaluations: How Late Is Too Late to Give Feedback?


          Abstract

INTRODUCTION
Evaluating resident operative skill acquisition is a common challenge for all surgical specialties. The Operative Entrustability Assessment (OEA) is a validated assessment tool that facilitates compliance with the Accreditation Council for Graduate Medical Education's Next Accreditation System and documents resident operative performance at the point of care.1 The OEA and other operative rating tools have been implemented in surgical training programs across the United States in an effort to incorporate objective, valid, and reliable operative skills assessments into surgical training.2 A recent multi-institutional qualitative study on resident feedback needs raised questions about the reliability of operative skills feedback when it is given more than a week after the case date.3 Using our experience with the OEA, we assessed the reliability of evaluation scores according to the timeliness of feedback completion.

METHODS
We extracted evaluator and self-assessment scores from all cases logged since OEA implementation at our institution. We defined OEA score reliability as the correlation between self-assessment and evaluator scores; previous studies have shown this correlation to be positive, moderate to strong, and statistically significant.4,5 We used a paired t test to compare scores and Pearson's correlation coefficient to assess reliability, stratifying by time-to-evaluation completion divided into quintiles (Q1: 0, Q2: 1–3, Q3: 4–13, Q4: 14–38, and Q5: >38 days after surgery). We used likelihood ratio tests on linear regression models to assess the interaction between reliability and timeliness of completion.

RESULTS
Between September 2013 and October 2016, 1778 complete OEAs were logged. Mean resident self-assessment (3.41 ± 1.09) was slightly higher than the mean evaluator score (3.37 ± 0.99; P = 0.048). Overall, self-assessment score was significantly and strongly correlated with evaluator score [Pearson's correlation coefficient (r) = 0.72; P < 0.001]. Stratified by delay to completion, correlation coefficients were roughly similar for evaluations completed within 0 days (r = 0.77; P < 0.001), 1–3 days (r = 0.73; P < 0.001), and 4–13 days after surgery (r = 0.70; P < 0.001). Although still statistically significant, the correlation was only moderately positive for evaluations entered 14–38 days (r = 0.60; P < 0.001) or more than 38 days (r = 0.52; P < 0.001) after surgery. We found strong evidence for an interaction between time to completion of OEA scores and OEA evaluator score reliability (P < 0.001).

CONCLUSIONS
Our data support the reliability of OEA evaluator scores completed within 2 weeks of the case, with significantly decreased reliability associated with delayed completion. This is a useful refinement in the interpretation of evaluation scores that is crucial as we move toward competency-based accreditation in surgical specialties.

ACKNOWLEDGMENTS
Michael Cohen assisted with data extraction for Operative Entrustability Assessment data. He was not compensated for this contribution. We thank the residents and faculty of the Johns Hopkins School of Medicine Department of Plastic and Reconstructive Surgery for making use of and continuously providing feedback on the Operative Entrustability Assessment.
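The statistical workflow described in the methods — a paired t test, Pearson's r stratified by delay-to-completion bins, and a likelihood ratio test for a delay-by-score interaction — can be sketched as follows. This is a minimal illustration on simulated data; the data-generating model, attenuation coefficients, and variable names are assumptions, not the authors' code.

```python
# Hypothetical sketch of the reliability analysis described above.
# The simulated scores and the attenuation model are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 1778                                    # number of logged OEAs in the study
delay = rng.integers(0, 60, n).astype(float)  # days from case to evaluation
self_score = rng.normal(3.41, 1.09, n)

# Assumed model: evaluator scores track self-assessment, with the
# relationship attenuating as the evaluation is delayed.
eval_score = (3.37 + (0.9 - 0.01 * delay) * (self_score - 3.41)
              + rng.normal(0, 0.6, n))

# Paired t test comparing self-assessment with evaluator scores
t_stat, t_p = stats.ttest_rel(self_score, eval_score)

# Pearson's r within each delay bin used in the paper: 0, 1-3, 4-13, 14-38, >38 days
bins = [(0, 0), (1, 3), (4, 13), (14, 38), (39, np.inf)]
r_by_bin = []
for lo, hi in bins:
    mask = (delay >= lo) & (delay <= hi)
    r, p = stats.pearsonr(self_score[mask], eval_score[mask])
    r_by_bin.append(r)

# Likelihood ratio test for the delay x self-score interaction,
# comparing nested ordinary-least-squares fits.
def rss(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

ones = np.ones(n)
X_reduced = np.column_stack([ones, self_score, delay])
X_full = np.column_stack([X_reduced, self_score * delay])
lr = n * np.log(rss(X_reduced, eval_score) / rss(X_full, eval_score))
p_lrt = stats.chi2.sf(lr, df=1)
```

With this simulated attenuation, the stratified correlations fall as delay grows and the likelihood ratio test flags the interaction, mirroring the pattern the study reports.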


          Most cited references 5


          Teaching surgical skills--changes in the wind.


            Resident self-assessment of operative performance.

In medicine, the development of expertise requires the recognition of one's capabilities and limitations. This study aimed to verify the accuracy of self-assessment for the performance of a surgical task, and to determine whether self-assessment may be improved through self-observation or exposure to relevant standards of performance. Twenty-six senior surgical residents were videotaped performing a laparoscopic Nissen fundoplication in a pig. Experts rated the videos using two scoring systems. Subjects evaluated their performances after performance of the Nissen, after self-observation of their videotaped performance, and after review of four videotaped "benchmark" performances. Expert interrater reliability was 0.66 (intraclass correlation coefficient). The correlation between experts' and residents' self-evaluations was initially moderate (r = 0.50, P < 0.01), increasing significantly after the residents reviewed their own videotaped performance to r = 0.63 (Δr = 0.13, P < 0.01), yet did not change after review of the benchmarks. Self-observation of videotaped performance improved the residents' ability to self-evaluate.
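The expert interrater reliability quoted above is an intraclass correlation coefficient. A minimal sketch of computing ICC(2,1) from a subjects-by-raters matrix via the standard two-way ANOVA decomposition is shown below; the simulated ratings (26 residents, 2 expert raters) and the variance parameters are assumptions for illustration.

```python
# Hypothetical sketch: ICC(2,1) for agreement between two expert raters.
import numpy as np

rng = np.random.default_rng(1)
n_subj, k = 26, 2                       # 26 residents, 2 expert raters (assumed)
true_skill = rng.normal(3.0, 1.0, n_subj)
rater_bias = rng.normal(0, 0.2, k)
# Ratings matrix: subjects in rows, raters in columns
R = true_skill[:, None] + rater_bias[None, :] + rng.normal(0, 0.7, (n_subj, k))

# Two-way ANOVA mean squares: rows (subjects), columns (raters), residual
m = R.mean()
row = R.mean(axis=1, keepdims=True)
col = R.mean(axis=0, keepdims=True)
msr = k * ((row - m) ** 2).sum() / (n_subj - 1)
msc = n_subj * ((col - m) ** 2).sum() / (k - 1)
mse = ((R - row - col + m) ** 2).sum() / ((n_subj - 1) * (k - 1))

# ICC(2,1): two-way random effects, absolute agreement, single rater
icc = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n_subj)
```

An ICC near 1 means the raters essentially agree on each subject; values in the 0.6–0.7 range, as reported in the cited study, indicate moderate-to-good agreement.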

              Comprehensive Observations of Resident Evolution: A Novel Method for Assessing Procedure-Based Residency Training.

              Assessment of surgical skills in the operating room remains a challenge. Increasing documentation requirements of the Accreditation Council for Graduate Medical Education are necessitating mechanisms to document trainee competence without hindering operative turnover. The authors created a comprehensive electronic resource to facilitate plastic surgery training program compliance with changes mandated by Next Accreditation System Milestones and the ACGME.

                Author and article information

                Journal
                Plast Reconstr Surg Glob Open
                GOX
                Plastic and Reconstructive Surgery Global Open
                Wolters Kluwer Health
ISSN: 2169-7574
                25 September 2017
                September 2017
Volume: 5
Issue: 9
                Affiliations
                From the Department of Plastic and Reconstructive Surgery, Johns Hopkins University School of Medicine, Baltimore, Md.
                Author notes
Carisa M. Cooney, MPH, Department of Plastic and Reconstructive Surgery, Johns Hopkins University School of Medicine, 601 N. Caroline St., JHOC 8161, Baltimore, MD 21287, E-mail: ccooney3@jhmi.edu
                Article
                00049
                10.1097/GOX.0000000000001465
                5640342
                Copyright © 2017 The Authors. Published by Wolters Kluwer Health, Inc. on behalf of The American Society of Plastic Surgeons.

This is an open-access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially without permission from the journal.

Categories
ACAPS Abstracts