Open Access

      How to use and apply assessment tools in medical education?

Review article
      Iberoamerican Journal of Medicine
      Hospital San Pedro
      Assessment, Methods, Medical Education


          Abstract

          Assessment in medical education usually provides the evidence that learning took place and that the learning objectives were achieved. The assessment program is a measurement tool for evaluating students' progress in knowledge, skills, behaviors, and attitudes. Planning an effective assessment program should therefore be based on instructional objectives, instructional activities, and efficient assessment methods, and a well-designed assessment procedure should be characterized by validity and reliability. There are two methods for interpreting the results of students' performance, norm-referenced and criterion-referenced; the first gives a relative ranking of students, while the second describes the learning tasks that students can and cannot perform. The information obtained from assessment results should be used effectively to evaluate and revise the instructional course for further improvement. Therefore, the reporting of assessment results to stakeholders should be clear, comprehensive, and understandable, to prevent misinterpretation that may affect students and other stakeholders adversely.
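
          The distinction between norm-referenced and criterion-referenced interpretation can be made concrete with a short worked example. The following Python sketch is illustrative only and is not taken from the article: the student scores, the 70% mastery cutoff, and the percentile-rank calculation are assumptions chosen to show how the same set of results yields a relative ranking under one interpretation and a can/cannot-perform judgment under the other.

          # Illustrative sketch (hypothetical data, not from the article):
          # norm-referenced vs. criterion-referenced interpretation of the same exam scores.
          from statistics import mean

          scores = {"A": 62, "B": 71, "C": 85, "D": 78, "E": 66}  # hypothetical exam scores (%)
          cutoff = 70                                             # hypothetical mastery criterion (%)

          def percentile_rank(score, cohort):
              """Norm-referenced view: a student's standing relative to the cohort."""
              below = sum(1 for s in cohort if s < score)
              return 100 * below / len(cohort)

          for student, score in sorted(scores.items(), key=lambda kv: -kv[1]):
              rank = percentile_rank(score, list(scores.values()))
              # Criterion-referenced view: judged only against the fixed standard.
              status = "meets criterion" if score >= cutoff else "below criterion"
              print(f"{student}: score={score}, percentile rank={rank:.0f}, {status}")

          print(f"Cohort mean: {mean(scores.values()):.1f}")

          With these hypothetical numbers, student B sits in the lower half of the cohort (40th percentile) under the norm-referenced reading, yet still clears the 70% cutoff under the criterion-referenced reading.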


          Most cited references: 42


          Assessment in medical education.


            Validity: on meaningful interpretation of assessment data.

            All assessments in medical education require evidence of validity to be interpreted meaningfully. In contemporary usage, all validity is construct validity, which requires multiple sources of evidence; construct validity is the whole of validity, but has multiple facets. Five sources--content, response process, internal structure, relationship to other variables and consequences--are noted by the Standards for Educational and Psychological Testing as fruitful areas to seek validity evidence. The purpose of this article is to discuss construct validity in the context of medical education and to summarize, through example, some typical sources of validity evidence for a written and a performance examination. Assessments are not valid or invalid; rather, the scores or outcomes of assessments have more or less evidence to support (or refute) a specific interpretation (such as passing or failing a course). Validity is approached as hypothesis and uses theory, logic and the scientific method to collect and assemble data to support or fail to support the proposed score interpretations, at a given point in time. Data and logic are assembled into arguments--pro and con--for some specific interpretation of assessment data. Examples of types of validity evidence, data and information from each source are discussed in the context of a high-stakes written and performance examination in medical education. All assessments require evidence of the reasonableness of the proposed interpretation, as test data in education have little or no intrinsic meaning. The constructs purported to be measured by our assessments are important to students, faculty, administrators, patients and society and require solid scientific evidence of their meaning.

              Measurement of the general competencies of the accreditation council for graduate medical education: a systematic review.

              To evaluate published evidence that the Accreditation Council for Graduate Medical Education's six general competencies can each be measured in a valid and reliable way. In March 2008, the authors conducted searches of Medline and ERIC using combinations of search terms "ACGME," "Accreditation Council for Graduate Medical Education," "core competencies," "general competencies," and the specific competencies "systems-based practice" (SBP) and "practice based learning and improvement (PBLI)." Included were all publications presenting new qualitative or quantitative data about specific assessment modalities related to the general competencies since 1999; opinion pieces, review articles, and reports of consensus conferences were excluded. The search yielded 127 articles, of which 56 met inclusion criteria. Articles were subdivided into four categories: (1) quantitative/psychometric evaluations, (2) preliminary studies, (3) studies of SBP and PBLI, and (4) surveys. Quantitative/psychometric studies of evaluation tools failed to develop measures reflecting the six competencies in a reliable or valid way. Few preliminary studies led to published quantitative data regarding reliability or validity. Only two published surveys met quality criteria. Studies of SBP and PBLI generally operationalized these competencies as properties of systems, not of individual trainees. The peer-reviewed literature provides no evidence that current measurement tools can assess the competencies independently of one another. Because further efforts are unlikely to be successful, the authors recommend using the competencies to guide and coordinate specific evaluation efforts, rather than attempting to develop instruments to measure the competencies directly.

                Author and article information

                Journal
                Iberoamerican Journal of Medicine (Iberoam J Med)
                Hospital San Pedro (Logroño, La Rioja, Spain)
                ISSN: 2695-5075
                2020, Volume 2, Issue 4: 351-359

                Affiliations
                [1] Department of Forensic Medicine and Clinical Toxicology, College of Medicine, Suez Canal University, Ismailia City, Egypt
                [2] College of Medicine, Taif University, Taif, Saudi Arabia

                Article
                SciELO ID: S2695-50752020000400015
                DOI: 10.5281/zenodo.3978444

                This work is licensed under a Creative Commons Attribution 4.0 International License.

                History
                Received: 24 July 2020
                Accepted: 10 August 2020
                Page count
                Figures: 0, Tables: 0, Equations: 0, References: 42, Pages: 9
                Product

                SciELO Spain

                Categories
                Review

                Medical Education, Methods, Assessment
