
      Judging residents’ performance: a qualitative study using grounded theory


          Abstract

          Background

          Although program directors judge residents’ performance for summative decisions, little is known about how they do this. This study examined what information program directors use, how they value this information when forming a judgment of residents’ performance, and what residents think of this process.

          Methods

          Sixteen semi-structured interviews were held with residents and program directors from different hospitals in the Netherlands in 2015–2016. Participants were recruited from internal medicine, surgery and radiology. Transcripts were analysed using grounded theory methodology. Concepts and themes were identified by iterative constant comparison.

          Results

          When preparing for semi-annual meetings with residents, program directors report gathering information primarily from assessment tools, from faculty members, and from their own experience with the residents. They value faculty members’ comments in meetings and in the corridors more than the feedback recorded in the assessment tools, and their own beliefs about learning and education influence how they weigh this feedback. Residents are aware that faculty members discuss their performance in meetings, but they believe the assessment tools provide the most important proof of their clinical competency.

          Conclusions

          Residents regard the feedback recorded in the assessment tools as the most important proof of their performance, whereas program directors scarcely use this feedback when forming a judgment; instead, they rely heavily on faculty members’ remarks in meetings. Residents’ performance may therefore be better judged in group meetings organised to promote optimal information sharing and decision making about residents’ performance.

          Electronic supplementary material

          The online version of this article (10.1186/s12909-018-1446-1) contains supplementary material, which is available to authorized users.


                Author and article information

                Contributors
                +31648495035, Marloes.Duitsman@radboudumc.nl
                Lia.Fluit@radboudumc.nl
                W.vanderGoot@mzh.nl
                m.tenkate-booij@erasmusmc.nl
                Jacqueline.DeGraaf@radboudumc.nl
                a.d.c.jaarsma@umcg.nl
                Journal
                BMC Medical Education (BMC Med Educ)
                BioMed Central, London
                ISSN: 1472-6920
                Published: 8 January 2019
                Volume 19, article number 13
                Affiliations
                [1] Department of Internal Medicine and Radboud Health Academy, Radboud University Medical Centre, Gerard van Swietenlaan 4, Postbus 9101, 6500 HB Nijmegen, the Netherlands
                [2] Health Academy, Department of Research in Learning and Education, Radboud University Medical Centre, Nijmegen, the Netherlands
                [3] Martini Hospital, Groningen, the Netherlands
                [4] Department of Obstetrics and Gynaecology, Erasmus University Medical Centre, Rotterdam, the Netherlands
                [5] Department of Internal Medicine, Radboudumc Nijmegen, the Netherlands
                [6] Centre for Education Development and Research in Health Professions, University Medical Centre Groningen, Groningen, the Netherlands
                Author information
                ORCID: http://orcid.org/0000-0002-8496-4098
                Article
                DOI: 10.1186/s12909-018-1446-1
                PMCID: PMC6325830
                PMID: 30621674
                © The Author(s). 2019

                Open Access: This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

                History
                Received: 7 March 2018
                Accepted: 28 December 2018
                Funding
                Funded by: Dutch Federation of Medical Specialists (award ID: none)
                Categories
                Research Article

                Subject: Education
                Keywords: assessment, postgraduate medical education, program directors, residents’ performance, grounded theory
