
      Curated Collections for Educators: Five Key Papers about Program Evaluation

Review article


          Abstract

          The evaluation of educational programs has become an expected part of medical education. At some point, all medical educators will need to critically evaluate the programs that they deliver. However, the evaluation of educational programs requires a very different skillset than teaching. In this article, we aim to identify and summarize key papers that would be helpful for faculty members interested in exploring program evaluation.

In November of 2016, the 2015-2016 Academic Life in Emergency Medicine (ALiEM) Faculty Incubator program highlighted key papers in a discussion of program evaluation. This list of papers was augmented with suggestions by guest experts and by an open call on Twitter. This resulted in a list of 30 papers on program evaluation. Our authorship group then engaged in a process akin to a Delphi study to build consensus on the most important papers about program evaluation for medical education faculty.

          We present our group’s top five most highly rated papers on program evaluation. We also summarize these papers with respect to their relevance to junior medical education faculty members and faculty developers.

          Program evaluation is challenging. The described papers will be informative for junior faculty members as they aim to design literature-informed evaluations for their educational programs.


Most cited references (48)


          Research guidelines for the Delphi survey technique.

Consensus methods such as the Delphi survey technique are being employed to help enhance effective decision-making in health and social care. The Delphi survey is a group facilitation technique, which is an iterative multistage process, designed to transform opinion into group consensus. It is a flexible approach that is commonly used within the health and social sciences, yet little guidance exists to help researchers undertake this method of data collection. This paper aims to provide an understanding of the preparation, action steps and difficulties that are inherent within the Delphi. Used systematically and rigorously, the Delphi can contribute significantly to broadening knowledge within the nursing profession. However, careful thought must be given before using the method; there are key issues surrounding problem identification, researcher skills and data presentation that must be addressed. The paper does not claim to be definitive; it purports to act as a guide for those researchers who wish to exploit the Delphi methodology.
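
To make the iterative, multistage process described above more concrete, here is a minimal Python sketch of a Delphi-style consensus loop. Everything in it (the 70% agreement threshold, the median-based feedback, the revision rule, and the made-up ratings) is an illustrative assumption, not part of the cited guidance.

```python
# Minimal toy sketch of a Delphi-style consensus loop on hypothetical panel
# ratings. Thresholds, data, and the revision rule are assumptions for
# illustration only; they are not taken from the cited paper.

def consensus_reached(ratings, threshold=0.7):
    """Consensus if the most common rating is held by >= threshold of panellists."""
    most_common = max(set(ratings), key=ratings.count)
    return ratings.count(most_common) / len(ratings) >= threshold

def run_delphi(initial_ratings, revise, max_rounds=3):
    """Iterate rating rounds, feeding a group summary back to the panel each time."""
    ratings = list(initial_ratings)
    for round_no in range(1, max_rounds + 1):
        if consensus_reached(ratings):
            return ratings, round_no
        group_median = sorted(ratings)[len(ratings) // 2]  # summary fed back to panel
        ratings = [revise(r, group_median) for r in ratings]
    return ratings, max_rounds

# Hypothetical revision rule: each panellist nudges their 1-5 rating one step
# toward the group median after seeing the summary.
def nudge(rating, median):
    return rating + (1 if rating < median else -1 if rating > median else 0)

print(run_delphi([2, 3, 4, 5, 5, 4, 4], nudge))
```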

            Student Evaluations of Teaching (Mostly) Do Not Measure Teaching Effectiveness

Student evaluations of teaching (SET) are widely used in academic personnel decisions as a measure of teaching effectiveness. We show:
• SET are biased against female instructors by an amount that is large and statistically significant;
• the bias affects how students rate even putatively objective aspects of teaching, such as how promptly assignments are graded;
• the bias varies by discipline and by student gender, among other things;
• it is not possible to adjust for the bias, because it depends on so many factors;
• SET are more sensitive to students' gender bias and grade expectations than they are to teaching effectiveness;
• gender biases can be large enough to cause more effective instructors to get lower SET than less effective instructors.
These findings are based on nonparametric statistical tests applied to two datasets: 23,001 SET of 379 instructors by 4,423 students in six mandatory first-year courses in a five-year natural experiment at a French university, and 43 SET for four sections of an online course in a randomized, controlled, blind experiment at a US university.
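
The abstract above refers to nonparametric statistical tests. As a purely illustrative sketch of that family of methods, and not the authors' actual analysis, the following Python shows a simple two-sample permutation test on made-up SET ratings.

```python
# Illustrative two-sample permutation test for a difference in mean SET scores.
# The data are made up; this is not the cited paper's analysis.
import random

def permutation_test(group_a, group_b, n_permutations=10_000, seed=0):
    """Approximate two-sided p-value for the observed difference in group means."""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)  # random relabelling under the null hypothesis
        diff = abs(sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            extreme += 1
    return extreme / n_permutations

# Hypothetical 1-5 SET ratings for two instructors.
ratings_a = [4, 5, 4, 3, 5, 4, 4, 5]
ratings_b = [3, 3, 4, 2, 3, 4, 3, 3]
print(permutation_test(ratings_a, ratings_b))
```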

              Half a minute: Predicting teacher evaluations from thin slices of nonverbal behavior and physical attractiveness.


                Author and article information

Journal
Cureus (Palo Alto, CA)
ISSN: 2168-8184
Published: 4 May 2017
Volume 9, Issue 5: e1224
                Affiliations
                [1 ] Department of Emergency Medicine, College of Medicine, University of Saskatchewan
                [2 ] Department of Emergency Medicine, Rush University Medical Center
                [3 ] Emergency Medicine, University of California at Irvine
                [4 ] Emergency Medicine, The Ohio State University Wexner Medical Center
                [5 ] Department of Emergency Medicine, SUNY Downstate College of Medicine
[6 ] Department of Emergency Medicine, University of Illinois College of Medicine at Peoria
                [7 ] Department of Emergency Medicine, Universidad San Sebastián
                [8 ] Department of Emergency Medicine, Oregon Health & Science University
                [9 ] Faculty of Health Sciences, Division of Emergency Medicine, McMaster University
                Article
DOI: 10.7759/cureus.1224
PMC: 5453746
                Copyright © 2017, Thoma et al.

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

History
Received: 25 March 2017
Published: 4 May 2017
                Funding
                Drs. Michael Gottlieb, Megan Boysen-Osborn, and Teresa M Chan report receiving teaching honoraria from Academic Life in Emergency Medicine (ALiEM) during the conduct of the study for their participation as mentors for the 2016-17 ALiEM Faculty Incubator.
                Categories
                Medical Education

Keywords: program evaluation, medical education, curated collection
