
      Deconstructing programmatic assessment

Article type: research-article


          Abstract

          We describe programmatic assessment and the problems it might solve in relation to assessment and learning, identify some models implemented internationally, and then outline what we believe are programmatic assessment’s key components and what these components might achieve. We then outline some issues around implementation, which include blueprinting, data collection, decision making, staff support, and evaluation. Rather than adopting an all-or-nothing approach, we suggest that elements of programmatic assessment can be gradually introduced into traditional assessment systems.


Most cited references (25)


          Assessing professional competence: from methods to programmes.

          We use a utility model to illustrate that, firstly, selecting an assessment method involves context-dependent compromises, and secondly, that assessment is not a measurement problem but an instructional design problem, comprising educational, implementation and resource aspects. In the model, assessment characteristics are differently weighted depending on the purpose and context of the assessment. Of the characteristics in the model, we focus on reliability, validity and educational impact and argue that they are not inherent qualities of any instrument. Reliability depends not on structuring or standardisation but on sampling. Key issues concerning validity are authenticity and integration of competencies. Assessment in medical education addresses complex competencies and thus requires quantitative and qualitative information from different sources as well as professional judgement. Adequate sampling across judges, instruments and contexts can ensure both validity and reliability. Despite recognition that assessment drives learning, this relationship has been little researched, possibly because of its strong context dependence. When assessment should stimulate learning and requires adequate sampling, in authentic contexts, of the performance of complex competencies that cannot be broken down into simple parts, we need to make a shift from individual methods to an integral programme, intertwined with the education programme. Therefore, we need an instructional design perspective. Programmatic instructional design hinges on a careful description and motivation of choices, whose effectiveness should be measured against the intended outcomes. We should not evaluate individual methods, but provide evidence of the utility of the assessment programme as a whole.

            Programmatic assessment of competency-based workplace learning: when theory meets practice

Background: In competency-based medical education emphasis has shifted towards outcomes, capabilities, and learner-centeredness. Together with a focus on sustained evidence of professional competence this calls for new methods of teaching and assessment. Recently, medical educators advocated the use of a holistic, programmatic approach towards assessment. Besides maximum facilitation of learning it should improve the validity and reliability of measurements and documentation of competence development. We explored how, in a competency-based curriculum, current theories on programmatic assessment interacted with educational practice.

Methods: In a development study including evaluation, we investigated the implementation of a theory-based programme of assessment. Between April 2011 and May 2012 quantitative evaluation data were collected and used to guide group interviews that explored the experiences of students and clinical supervisors with the assessment programme. We coded the transcripts and emerging topics were organised into a list of lessons learned.

Results: The programme mainly focuses on the integration of learning and assessment by motivating and supporting students to seek and accumulate feedback. The assessment instruments were aligned to cover predefined competencies to enable aggregation of information in a structured and meaningful way. Assessments that were designed as formative learning experiences were increasingly perceived as summative by students. Peer feedback was experienced as a valuable method for formative feedback. Social interaction and external guidance seemed to be of crucial importance to scaffold self-directed learning. Aggregating data from individual assessments into a holistic portfolio judgement required expertise and extensive training and supervision of judges.

Conclusions: A programme of assessment with low-stakes assessments providing simultaneously formative feedback and input for summative decisions proved not easy to implement. Careful preparation and guidance of the implementation process was crucial. Assessment for learning requires meaningful feedback with each assessment. Special attention should be paid to the quality of feedback at individual assessment moments. Comprehensive attention for faculty development and training for students is essential for the successful implementation of an assessment programme.

              The impact of programmatic assessment on student learning: theory versus practice.

              It is widely acknowledged that assessment can affect student learning. In recent years, attention has been called to 'programmatic assessment', which is intended to optimise both learning functions and decision functions at the programme level of assessment, rather than according to individual methods of assessment. Although the concept is attractive, little research into its intended effects on students and their learning has been conducted.

                Author and article information

Journal: Advances in Medical Education and Practice (Adv Med Educ Pract)
Publisher: Dove Medical Press
ISSN: 1179-7258
Published: 22 March 2018
Volume: 9
Pages: 191-197
Affiliations
[1] Education Unit, University of Otago, Christchurch, New Zealand
[2] Education Unit, University of Otago, Wellington, New Zealand
Author notes
Correspondence: Tim J Wilkinson, Education Unit, University of Otago, Christchurch, PO Box 4345, Christchurch, New Zealand 8140, Tel +64 3 364 0530, Email tim.wilkinson@otago.ac.nz
Article ID: amep-9-191
DOI: 10.2147/AMEP.S144449
PMCID: PMC5868629
PMID: 29606896
© 2018 Wilkinson and Tweed. This work is published and licensed by Dove Medical Press Limited.

The full terms of this license are available at https://www.dovepress.com/terms.php and incorporate the Creative Commons Attribution – Non Commercial (unported, v3.0) License (http://creativecommons.org/licenses/by-nc/3.0/). By accessing the work you hereby accept the Terms. Non-commercial uses of the work are permitted without any further permission from Dove Medical Press Limited, provided the work is properly attributed.

Categories: Perspectives

Keywords: assessment, implementation, medicine, decision making
