
      Learning Analytics in Medical Education Assessment: The Past, the Present, and the Future


          Abstract

With the implementation of competency-based medical education (CBME) in emergency medicine, residency programs will amass substantial amounts of qualitative and quantitative data about trainees' performances. This increased volume of data will challenge traditional processes for assessing trainees and remediating training deficiencies. At the intersection of trainee performance data and statistical modeling lies the field of medical learning analytics. At a local training program level, learning analytics has the potential to assist program directors and competency committees with interpreting assessment data to inform decision making. On a broader level, learning analytics can be used to explore system questions and identify problems that may impact our educational programs. Scholars outside of health professions education have been exploring the use of learning analytics for years, and their theories and applications have the potential to inform our implementation of CBME. The purpose of this review is to characterize the methodologies of learning analytics and explore their potential to guide new forms of assessment within medical education.


Most cited references (41)


          The mini-CEX: a method for assessing clinical skills.

Objective: To evaluate the mini-clinical evaluation exercise (mini-CEX), which assesses the clinical skills of residents. Design: Observational study and psychometric assessment of the mini-CEX. Setting: 21 internal medicine training programs. Participants: Data from 1228 mini-CEX encounters involving 421 residents and 316 evaluators. Measurements: The encounters were assessed for the type of visit, sex and complexity of the patient, when the encounter occurred, length of the encounter, ratings provided, and the satisfaction of the examiners. Using this information, we determined the overall average ratings for residents in all categories, the reliability of the mini-CEX scores, and the effects of the characteristics of the patients and encounters. Interviewing skills, physical examination, professionalism, clinical judgment, counseling, organization and efficiency, and overall competence were evaluated. Results: Residents were assessed in various clinical settings with a diverse set of patient problems. Residents received the lowest ratings in the physical examination and the highest ratings in professionalism. Comparisons over the first year of training showed statistically significant improvement in all aspects of competence, and the method generated reliable ratings. Conclusions: The measurement characteristics of the mini-CEX are similar to those of other performance assessments, such as standardized patients. Unlike these assessments, the difficulty of the examination will vary with the patients that a resident encounters. This effect is mitigated to a degree by the examiners, who slightly overcompensate for patient difficulty, and by the fact that each resident interacts with several patients. Furthermore, the mini-CEX has higher fidelity than these formats, permits evaluation based on a much broader set of clinical settings and patient problems, and is administered on site.

            Learning analytics: drivers, developments and challenges


              Programmatic assessment of competency-based workplace learning: when theory meets practice

Background: In competency-based medical education, emphasis has shifted towards outcomes, capabilities, and learner-centeredness. Together with a focus on sustained evidence of professional competence, this calls for new methods of teaching and assessment. Recently, medical educators advocated the use of a holistic, programmatic approach towards assessment. Besides maximum facilitation of learning, it should improve the validity and reliability of measurements and documentation of competence development. We explored how, in a competency-based curriculum, current theories on programmatic assessment interacted with educational practice.

Methods: In a development study including evaluation, we investigated the implementation of a theory-based programme of assessment. Between April 2011 and May 2012, quantitative evaluation data were collected and used to guide group interviews that explored the experiences of students and clinical supervisors with the assessment programme. We coded the transcripts, and emerging topics were organised into a list of lessons learned.

Results: The programme mainly focuses on the integration of learning and assessment by motivating and supporting students to seek and accumulate feedback. The assessment instruments were aligned to cover predefined competencies to enable aggregation of information in a structured and meaningful way. Assessments that were designed as formative learning experiences were increasingly perceived as summative by students. Peer feedback was experienced as a valuable method for formative feedback. Social interaction and external guidance seemed to be of crucial importance to scaffold self-directed learning. Aggregating data from individual assessments into a holistic portfolio judgement required expertise and extensive training and supervision of judges.

Conclusions: A programme of assessment with low-stakes assessments providing simultaneously formative feedback and input for summative decisions proved not easy to implement. Careful preparation and guidance of the implementation process was crucial. Assessment for learning requires meaningful feedback with each assessment. Special attention should be paid to the quality of feedback at individual assessment moments. Comprehensive attention for faculty development and training for students is essential for the successful implementation of an assessment programme.

                Author and article information

Journal: AEM Education and Training
Publisher: Wiley
ISSN: 2472-5390
Publication date: April 2018 (published online March 22, 2018)
Volume: 2
Issue: 2
Pages: 178-187
                Affiliations
[1] McMaster Program for Education Research, Innovation, and Theory (MERIT), Hamilton, Ontario, Canada
[2] Centre for Education Research & Innovation, Schulich School of Medicine and Dentistry, London, Ontario, Canada
[3] Department of Emergency Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada
[4] Steinhardt School of Culture, Education, and Human Development, New York University, New York, NY
[5] Faculty of Health Sciences, Division of Emergency Medicine, Department of Medicine, McMaster University, Hamilton, Ontario, Canada
[6] Department of Emergency Medicine, NYU School of Medicine, New York, NY
                Article
DOI: 10.1002/aet2.10087
PMC: PMC6001721
PMID: 30051086
                © 2018
