
      Evaluation of Multiple Choice and Short Essay Question items in Basic Medical Sciences


          Abstract

          Objectives: To evaluate Multiple Choice Question (MCQ) and Short Essay Question (SEQ) items in Basic Medical Sciences by identifying item writing flaws (IWFs) in the MCQs and determining the cognitive level tested by each item in both formats.

          Methods: This analytical study evaluated the quality of the assessment tools used for the first batch of students at a newly established medical college in Karachi, Pakistan. The first and sixth module assessment tools in Biochemistry during 2009-2010 were analyzed. The cognitive level of each MCQ and SEQ was noted, and the MCQs were also evaluated for item writing flaws.

          Results: A total of 36 SEQs and 150 four-option MCQs were analyzed. The cognitive level of 83.33% of the SEQs was at recall level, while the remaining 16.67% assessed interpretation of data. Seventy-six percent of the MCQs were at recall level, while the remaining 24% were at interpretation level. A total of 69 IWFs were found in the 150 MCQs; the commonest were implausible distracters (30.43%), unfocused stems (27.54%), and unnecessary information in the stem (24.64%).

          Conclusion: There is a need to review the quality, including the content, of the assessment tools. A structured faculty development program is recommended for developing improved assessment tools that align with learning outcomes and measure the competency of medical students.


                Author and article information

                Journal
                Pak J Med Sci (PJMS)
                Pakistan Journal of Medical Sciences
                Professional Medical Publications (Karachi, Pakistan)
                ISSN: 1682-024X, 1681-715X
                Jan-Feb 2014; Volume 30, Issue 1, Pages 3-6
                Affiliations
                [1 ]Dr. Mukhtiar Baig, PhD, MHPE, Professor of Clinical Biochemistry, Head of Assessment Unit, Faculty of Medicine, Rabigh, King Abdulaziz University, Jeddah, Saudi Arabia.
                [2 ]Dr. Syeda Kauser Ali, PhD, Associate Professor, Department of Educational Development, Aga Khan University, Karachi, Pakistan.
                [3 ]Dr. Sobia Ali, MHPE, Assistant Professor, Department of Medical Education, Liaquat National Medical College, Karachi, Pakistan.
                [4 ]Ms. Nighat Huda, MS in Ed, Associate Professor, Medical Education Department, Bahria University Medical & Dental College, Karachi, Pakistan.
                Author notes
                Correspondence: Dr. Mukhtiar Baig, Professor of Clinical Biochemistry, Head of Assessment Unit, Faculty of Medicine, Rabigh, King Abdulaziz University, Jeddah, Saudi Arabia. E-mail: drmukhtiarbaig@yahoo.com
                Article
                DOI: 10.12669/pjms.301.4458
                PMCID: PMC3955531
                PMID: 24639820

                This is an Open Access article distributed under the terms of the Creative Commons Attribution License ( http://creativecommons.org/licenses/by/3.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

                History: 10 October 2013; 23 November 2013; 30 November 2013
                Categories
                Original Article

                Keywords
                assessment, item analysis, MCQ, SEQ
