The evaluation of scholarship in academic promotion and tenure processes: Past, present, and future

Abstract

Review, promotion, and tenure (RPT) processes significantly affect how faculty direct their own career and scholarly progression. Although RPT practices vary between and within institutions, and affect various disciplines, ranks, institution types, genders, and ethnicities in different ways, some consistent themes emerge when investigating what faculty would like to change about RPT. For instance, over the last few decades, RPT processes have generally increased the value placed on research at the expense of teaching and service, which often results in an incongruity between how faculty actually spend their time and what is considered in their evaluation. Another issue relates to publication practices: most agree that RPT requirements should encourage peer-reviewed works of high quality, but in practice, the value of publications is often assessed using shortcuts, such as the prestige of the publication venue, rather than the quality and rigor of peer review of each individual item. Open access and online publishing have made these issues even murkier, owing to misconceptions about peer review practices and concerns about predatory online publishers, leaving traditional publishing formats the most desired despite their restricted circulation. Meanwhile, efforts to replace journal-level measures such as the impact factor with more precise article-level metrics (e.g., citation counts and altmetrics) have been slow to integrate with the RPT process. Questions remain as to whether, or how, RPT practices should be changed to better reflect faculty work patterns and to reduce the pressure to publish in only the most prestigious traditional formats. To determine the most useful way to change RPT, we need to further assess the needs and perceptions of faculty and administrators, and to gain a better understanding of how much influence written RPT guidelines and policy exert in an often vague process that is meant to allow flexibility in assessing individuals.
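
For context on the journal-level versus article-level contrast drawn above: the impact factor is an average over a journal's recent output, so it says nothing about any individual article. As a sketch (the symbols C and N are our own notation, not the article's), the standard two-year impact factor for year y is

    \mathrm{JIF}_y = \frac{C_y}{N_{y-1} + N_{y-2}}

where C_y counts citations received in year y to items the journal published in the two preceding years, and N_{y-1} and N_{y-2} count the citable items published in those years. Article-level metrics instead attach citation counts or altmetric events to each individual paper, which is what makes them more precise for evaluating a single piece of work.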

Author and article information

Affiliations
[1] ScholCommLab, Simon Fraser University, Vancouver, BC, V6B 5K3, Canada
[2] School of Publishing, Simon Fraser University, Vancouver, BC, V6B 5K3, Canada
[3] New York City College of Technology (CUNY), New York City, NY, USA
[4] Open Science MOOC, Berlin, Germany
[5] IGDORE, Berlin, Germany
Author notes

No competing interests were disclosed.

Contributors
Schimanski LA: Conceptualization, Investigation, Methodology, Writing – Original Draft Preparation
Alperin JP: Funding Acquisition, Supervision, Writing – Review & Editing (ORCID: https://orcid.org/0000-0002-9344-7439)
Journal
F1000Research
Publisher: F1000 Research Limited (London, UK)
ISSN: 2046-1402
Published: 5 October 2018
Volume: 7
Article ID: 6325612
DOI: 10.12688/f1000research.16493.1
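
As an illustration of the article-level citation counts mentioned in the abstract, the sketch below queries the public Crossref REST API for the count Crossref tracks against this article's DOI. The endpoint and the "is-referenced-by-count" field are part of the real Crossref API; the helper function name is our own.

    import json
    import urllib.request

    def crossref_citation_count(doi: str) -> int:
        # Crossref's REST API returns the article's metadata record,
        # including the number of times Crossref has seen it cited.
        url = f"https://api.crossref.org/works/{doi}"
        with urllib.request.urlopen(url) as resp:
            record = json.load(resp)
        return record["message"]["is-referenced-by-count"]

    if __name__ == "__main__":
        # DOI taken from the article information above.
        print(crossref_citation_count("10.12688/f1000research.16493.1"))
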
Copyright: © 2018 Schimanski LA and Alperin JP

This is an open access article distributed under the terms of the Creative Commons Attribution Licence, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Funding
Funded by: Open Society Foundations (Award ID: OR2016-29841)
This study was supported by the Open Society Foundations [OR2016-29841]. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Categories
Review Articles
