      Open Access

      University Admission Test Associates with Academic Performance at the End of Medical Course in a PBL Medical Hybrid Curriculum

      research-article


          Abstract

          Purpose

          Most studies assessing the value of the university admission test (UAT) for predicting academic performance at the end of a medical course were carried out in lecture-based medical courses. However, the association between performance in the UAT and academic achievement at the end of a medical course in a problem-based learning (PBL) hybrid curriculum remains controversial. The aim of this study was to correlate marks in the UAT with those obtained in the Objective Structured Clinical Examination (OSCE), in progress testing (PT), and in the final marks of the clerkship (FMC).

          Methods

          We used data from 48 medical students. Single and multiple dependency analyses were performed to assess bivariate and multiple correlations between the UAT or essay scores (dependent variables) and the OSCE, PT, and FMC scores (independent variables). Pearson's correlation test, multiple linear regression, and ANOVA were used, and a p-value < 0.05 was considered significant.
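          The analysis described above, bivariate Pearson correlation followed by a multiple regression whose fitted values yield a multiple correlation coefficient R, can be sketched as follows. This is an illustrative reconstruction with synthetic data, not the authors' actual code or dataset; the variable names (uat, osce, pt, fmc) and the simulated score distributions are placeholders.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation between two score vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))

def multiple_R(y, X):
    """Multiple correlation coefficient R from an OLS fit of y on the
    predictor columns of X: the Pearson r between y and the fitted values."""
    X1 = np.column_stack([np.ones(len(y)), X])     # add intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)  # least-squares fit
    return pearson_r(y, X1 @ beta)

# Synthetic example: 48 students, as in the study design
rng = np.random.default_rng(0)
osce = rng.normal(7, 1, 48)
pt   = rng.normal(6, 1, 48)
fmc  = rng.normal(8, 1, 48)
uat  = 0.3 * fmc + rng.normal(0, 1, 48)  # build in a weak UAT-FMC link

r_uat_fmc = pearson_r(uat, fmc)                                # bivariate r
R_multi   = multiple_R(uat, np.column_stack([osce, pt, fmc]))  # multiple R
```

          In-sample, the multiple R can never be smaller than the absolute bivariate correlation with any single predictor it includes, which is consistent with the pattern the study reports (r=0.34 bivariate versus r=0.46 multiple).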

          Results

          In the bivariate analysis, only the UAT and FMC marks were correlated (r=0.34; p=0.02). However, the multiple dependency study showed a moderate correlation among UAT, OSCE, PT, and FMC marks (r=0.46; p=0.01). No correlation was found between the essay scores and PT, FMC, and OSCE scores.

          Conclusion

          Our study shows that UAT marks, but not essay scores, can predict academic achievement, particularly in terms of clinical competence (FMC), at the end of a medical course in a PBL hybrid curriculum.

          Most cited references (30)


          How effective are selection methods in medical education? A systematic review.

          Selection methods used by medical schools should reliably identify whether candidates are likely to be successful in medical training and ultimately become competent clinicians. However, there is little consensus regarding methods that reliably evaluate non-academic attributes, and longitudinal studies examining predictors of success after qualification are insufficient. This systematic review synthesises the extant research evidence on the relative strengths of various selection methods. We offer a research agenda and identify key considerations to inform policy and practice in the next 50 years.

            An admissions OSCE: the multiple mini-interview.

            Although health sciences programmes continue to value non-cognitive variables such as interpersonal skills and professionalism, it is not clear that current admissions tools like the personal interview are capable of assessing ability in these domains. Hypothesising that many of the problems with the personal interview might be explained, at least in part, by it being yet another measurement tool that is plagued by context specificity, we have attempted to develop a multiple sample approach to the personal interview. A group of 117 applicants to the undergraduate MD programme at McMaster University participated in a multiple mini-interview (MMI), consisting of 10 short objective structured clinical examination (OSCE)-style stations, in which they were presented with scenarios that required them to discuss a health-related issue (e.g. the use of placebos) with an interviewer, interact with a standardised confederate while an examiner observed the interpersonal skills displayed, or answer traditional interview questions. The reliability of the MMI was observed to be 0.65. Furthermore, the hypothesis that context specificity might reduce the validity of traditional interviews was supported by the finding that the variance component attributable to candidate-station interaction was greater than that attributable to candidate. Both applicants and examiners were positive about the experience and the potential for this protocol. The principles used in developing this new admissions instrument, the flexibility inherent in the multiple mini-interview, and its feasibility and cost-effectiveness are discussed.

              Validity of the Medical College Admission Test for predicting medical school performance.

              E. JULIAN (2005)
              Since the introduction of the revised Medical College Admission Test (MCAT(R)) in 1991, the Association of American Medical Colleges has been investigating the extent to which MCAT scores supplement the power of undergraduate grade point averages (uGPAs) to predict success in medical school. This report is a comprehensive summary of the relationships between MCAT scores and (1) medical school grades, (2) United States Medical Licensing Examination (USMLE) Step scores, and (3) academic distinction or difficulty. This study followed two cohorts from entrance to medical school through residency. Students from 14 medical schools' 1992 and 1993 entering classes provided data for predicting medical school grades and academic difficulty/distinction, while their peers from all of the U.S. medical schools were used to predict performance on USMLE Steps 1, 2, and 3. Regression analyses assessed the predictive power of combinations of uGPAs, MCAT scores, and undergraduate-institution selectivity. Grades were best predicted by a combination of MCAT scores and uGPAs, with MCAT scores providing a substantial increment over uGPAs. MCAT scores were better predictors of USMLE Step scores than were uGPAs, and the combination did little better than MCAT scores alone. The probability of experiencing academic difficulty or distinction tended to vary with MCAT scores. MCAT scores were strong predictors of scores for all three Step examinations, particularly Step 1. MCAT scores almost double the proportion of variance in medical school grades explained by uGPAs, and essentially replace the need for uGPAs in their impressive prediction of Step scores. The MCAT performs well as an indicator of academic preparation for medical school, independent of the school-specific handicaps of uGPAs.

                Author and article information

                Journal
                Advances in Medical Education and Practice (Adv Med Educ Pract)
                Publisher: Dove
                ISSN: 1179-7258
                25 August 2020
                Volume 11: 579-585
                Affiliations
                [1] Department of Medicine, University of Ribeirão Preto, Ribeirão Preto City, Brazil
                Author notes
                Correspondence: Reinaldo B Bestetti, University of Ribeirão Preto, Avenida Costábile Romano, 2201, Ribeirão Preto City, 14096-900, Brazil. Tel: +55 16 36036795. Email: rbestetti44@gmail.com
                Author information
                http://orcid.org/0000-0002-4488-9601
                http://orcid.org/0000-0002-9549-7739
                http://orcid.org/0000-0001-7571-1067
                http://orcid.org/0000-0002-7375-8205
                http://orcid.org/0000-0002-2996-6533
                http://orcid.org/0000-0002-2620-4939
                http://orcid.org/0000-0002-5625-4662
                Article
                DOI: 10.2147/AMEP.S255732
                PMCID: PMC7457881
                PMID: 32922117
                © 2020 Bestetti et al.

                This work is published and licensed by Dove Medical Press Limited. The full terms of this license are available at https://www.dovepress.com/terms.php and incorporate the Creative Commons Attribution – Non Commercial (unported, v3.0) License ( http://creativecommons.org/licenses/by-nc/3.0/). By accessing the work you hereby accept the Terms. Non-commercial uses of the work are permitted without any further permission from Dove Medical Press Limited, provided the work is properly attributed. For permission for commercial use of this work, please see paragraphs 4.2 and 5 of our Terms ( https://www.dovepress.com/terms.php).

                History
                Received: 27 March 2020
                Accepted: 03 August 2020
                Page count
                Figures: 3, Tables: 4, References: 32, Pages: 7
                Categories
                Original Research

                admissions test, clinical performance, problem-based learning, objective structured clinical examination, progress testing
