      The evaluation of scholarship in academic promotion and tenure processes: Past, present, and future



          Review, promotion, and tenure (RPT) processes significantly affect how faculty direct their own career and scholarly progression. Although RPT practices vary between and within institutions, and affect different disciplines, ranks, institution types, genders, and ethnicities in different ways, some consistent themes emerge when investigating what faculty would like to change about RPT. For instance, over the last few decades, RPT processes have generally increased the value placed on research at the expense of teaching and service, which often results in an incongruity between how faculty actually spend their time and what is considered in their evaluation. Another issue relates to publication practices: most agree that RPT requirements should encourage peer-reviewed works of high quality, but in practice, the value of publications is often assessed using shortcuts such as the prestige of the publication venue, rather than the quality and rigor of peer review of each individual item. Open access and online publishing have made these issues even murkier, due to misconceptions about peer review practices and concerns about predatory online publishers, leaving traditional publishing formats the most desired despite their restricted circulation. Efforts to replace journal-level measures such as the impact factor with more precise article-level metrics (e.g., citation counts and altmetrics) have been slow to integrate with the RPT process. Questions remain as to whether, or how, RPT practices should be changed to better reflect faculty work patterns and reduce pressure to publish in only the most prestigious traditional formats. To determine the most useful way to change RPT, we need to further assess the needs and perceptions of faculty and administrators, and gain a better understanding of the influence of written RPT guidelines and policy in an often vague process that is meant to allow for flexibility in assessing individuals.

                Author and article information

                Roles: Conceptualization, Investigation, Methodology, Writing – Original Draft Preparation
                Roles: Funding Acquisition, Supervision, Writing – Review & Editing
                F1000 Research Limited (London, UK)
                5 October 2018
                Volume 7
                [1 ]ScholCommLab, Simon Fraser University, Vancouver, BC, V6B 5K3, Canada
                [2 ]School of Publishing, Simon Fraser University, Vancouver, BC, V6B 5K3, Canada
                [1 ]New York City College of Technology (CUNY), New York City, NY, USA
                [1 ]Open Science MOOC, Berlin, Germany
                [2 ]IGDORE, Berlin, Germany
                Author notes

                Competing interests: No competing interests were disclosed.

                Copyright: © 2018 Schimanski LA and Alperin JP

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

                This study was supported by the Open Society Foundations [OR2016-29841].
                The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
