
      Between trust and control: Teachers' assessment conceptualisations within programmatic assessment


          Abstract

          Objectives

Programmatic assessment attempts to facilitate learning through individual assessments that are designed to be low stakes and are used for high‐stakes decisions only when aggregated. In practice, low‐stake assessments have yet to reach their potential as catalysts for learning. We explored how teachers conceptualise assessments within programmatic assessment and how they engage with learners in assessment relationships.

          Methods

          We used a constructivist grounded theory approach to explore teachers' assessment conceptualisations and assessment relationships in the context of programmatic assessment. We conducted 23 semi‐structured interviews at two different graduate‐entry medical training programmes following a purposeful sampling approach. Data collection and analysis were conducted iteratively until we reached theoretical sufficiency. We identified themes using a process of constant comparison.

          Results

Results showed that teachers conceptualise low‐stake assessments in three different ways: to stimulate and facilitate learning, to prepare learners for the next step, and to use as feedback to gauge the teacher's own effectiveness. Teachers intended to engage in and preserve safe, yet professional and productive, working relationships with learners to enable assessment for learning while securing high‐quality performance and achievement of standards. When teachers' assessment conceptualisations were more focused on accounting conceptions, this risked creating tension in the teacher‐learner assessment relationship. Teachers struggled to balance taking control with allowing learners their independence.

          Conclusions

Teachers believe programmatic assessment can have a positive impact on both teaching and student learning. However, teachers' conceptualisations of low‐stake assessments are not focused solely on learning; they also involve stakes for the teachers themselves. Sampling across different assessments and the introduction of progress committees were identified as important design features to support teachers and preserve the benefits of prolonged engagement in assessment relationships. These insights contribute to the design of effective implementations of programmatic assessment within the medical education context.

As Schut et al. show, teachers conceptualise programmatic assessment in varied ways, which can create tensions in teacher‐learner assessment relationships.


Most cited references (29)


          Assessing professional competence: from methods to programmes.

          We use a utility model to illustrate that, firstly, selecting an assessment method involves context-dependent compromises, and secondly, that assessment is not a measurement problem but an instructional design problem, comprising educational, implementation and resource aspects. In the model, assessment characteristics are differently weighted depending on the purpose and context of the assessment. Of the characteristics in the model, we focus on reliability, validity and educational impact and argue that they are not inherent qualities of any instrument. Reliability depends not on structuring or standardisation but on sampling. Key issues concerning validity are authenticity and integration of competencies. Assessment in medical education addresses complex competencies and thus requires quantitative and qualitative information from different sources as well as professional judgement. Adequate sampling across judges, instruments and contexts can ensure both validity and reliability. Despite recognition that assessment drives learning, this relationship has been little researched, possibly because of its strong context dependence. When assessment should stimulate learning and requires adequate sampling, in authentic contexts, of the performance of complex competencies that cannot be broken down into simple parts, we need to make a shift from individual methods to an integral programme, intertwined with the education programme. Therefore, we need an instructional design perspective. Programmatic instructional design hinges on a careful description and motivation of choices, whose effectiveness should be measured against the intended outcomes. We should not evaluate individual methods, but provide evidence of the utility of the assessment programme as a whole.

            Programmatic assessment of competency-based workplace learning: when theory meets practice

Background: In competency-based medical education emphasis has shifted towards outcomes, capabilities, and learner-centeredness. Together with a focus on sustained evidence of professional competence this calls for new methods of teaching and assessment. Recently, medical educators advocated the use of a holistic, programmatic approach towards assessment. Besides maximum facilitation of learning it should improve the validity and reliability of measurements and documentation of competence development. We explored how, in a competency-based curriculum, current theories on programmatic assessment interacted with educational practice.

Methods: In a development study including evaluation, we investigated the implementation of a theory-based programme of assessment. Between April 2011 and May 2012 quantitative evaluation data were collected and used to guide group interviews that explored the experiences of students and clinical supervisors with the assessment programme. We coded the transcripts and emerging topics were organised into a list of lessons learned.

Results: The programme mainly focuses on the integration of learning and assessment by motivating and supporting students to seek and accumulate feedback. The assessment instruments were aligned to cover predefined competencies to enable aggregation of information in a structured and meaningful way. Assessments that were designed as formative learning experiences were increasingly perceived as summative by students. Peer feedback was experienced as a valuable method for formative feedback. Social interaction and external guidance seemed to be of crucial importance to scaffold self-directed learning. Aggregating data from individual assessments into a holistic portfolio judgement required expertise and extensive training and supervision of judges.

Conclusions: A programme of assessment with low-stakes assessments providing simultaneously formative feedback and input for summative decisions proved not easy to implement. Careful preparation and guidance of the implementation process was crucial. Assessment for learning requires meaningful feedback with each assessment. Special attention should be paid to the quality of feedback at individual assessment moments. Comprehensive attention for faculty development and training for students is essential for the successful implementation of an assessment programme.

              Assessment, feedback and the alchemy of learning

              Models of sound assessment practices increasingly emphasise assessment's formative role. As a result, assessment must not only support sound judgements about learner competence, but also generate meaningful feedback to guide learning. Reconciling the tension between assessment's focus on judgement and decision making and feedback's focus on growth and development represents a critical challenge for researchers and educators.

                Author and article information

                Contributors
                s.schut@maastrichtuniversity.nl
Journal
Medical Education (Med Educ)
Publisher: John Wiley and Sons Inc. (Hoboken)
ISSN: 0308-0110 (print); 1365-2923 (electronic)
Published: 06 April 2020; June 2020 issue
Volume 54, Issue 6 (issue DOI: 10.1111/medu.v54.6), pages 528-537
                Affiliations
[1] Department of Educational Development and Research, Faculty of Health, Medicine and Life Sciences, School of Health Professions Education, Maastricht University, Maastricht, the Netherlands
[2] Department of Pathology, Cardiovascular Research Institute Maastricht, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, the Netherlands
[3] Education Institute, Cleveland Clinic Lerner College of Medicine, Case Western Reserve University, Cleveland, Ohio, USA
[4] Department of Education, Utrecht University, Utrecht, the Netherlands
                Author notes
[*] Correspondence
Suzanne Schut, Department of Educational Development and Research, Maastricht University, Universiteitssingel 60, 6229 ER Maastricht, the Netherlands.
Email: s.schut@maastrichtuniversity.nl

                Author information
                https://orcid.org/0000-0002-8298-399X
                https://orcid.org/0000-0002-6103-8075
                https://orcid.org/0000-0002-7952-8822
                https://orcid.org/0000-0001-8115-261X
                https://orcid.org/0000-0001-6804-4163
                https://orcid.org/0000-0001-6802-3119
Article
Article ID: MEDU14075
DOI: 10.1111/medu.14075
PMCID: PMC7318263
PMID: 31998987
                © 2020 The Authors. Medical Education published by Association for the Study of Medical Education and John Wiley & Sons Ltd

This is an open access article under the terms of the Creative Commons Attribution‐NonCommercial (CC BY‐NC 4.0) License (http://creativecommons.org/licenses/by-nc/4.0/), which permits use, distribution and reproduction in any medium, provided the original work is properly cited and is not used for commercial purposes.

History
Received: 09 September 2019; Revised: 11 January 2020; Accepted: 20 January 2020
                Page count
                Figures: 0, Tables: 2, Pages: 10, Words: 7144
                Categories
                Original Research
                Assessment

Education
