Open Access

      Citation counts and journal impact factors do not capture some indicators of research quality in the behavioural and brain sciences

Research article
Royal Society Open Science, The Royal Society
Keywords: citation counts, bibliometrics, research quality, open science


          Abstract

          Citation data and journal impact factors are important components of faculty dossiers and figure prominently in both promotion decisions and assessments of a researcher’s broader societal impact. Although these metrics play a large role in high-stakes decisions, the evidence is mixed about whether they are strongly correlated with indicators of research quality. We use data from a large-scale dataset comprising 45 144 journal articles with 667 208 statistical tests and data from 190 replication attempts to assess whether citation counts and impact factors predict three indicators of research quality: (i) the accuracy of statistical reporting, (ii) the evidential value of the reported data and (iii) the replicability of a given experimental result. Both citation counts and impact factors were weak and inconsistent predictors of research quality, so defined, and sometimes negatively related to quality. Our findings raise the possibility that citation data and impact factors may be of limited utility in evaluating scientists and their research. We discuss the implications of these findings in light of current incentive structures and discuss alternative approaches to evaluating research.
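The kind of analysis the abstract describes can be illustrated with a toy sketch. This is not the authors' pipeline or data (they used Bayesian multilevel models on 45 144 articles); it is a minimal, self-contained example of rank-correlating citation counts with a binary replication outcome, in the spirit of the paper's third quality indicator. All data values below are invented for illustration.

```python
# Illustrative sketch only: Spearman rank correlation between citation
# counts and a binary "replicated?" outcome, on hypothetical data.

def ranks(xs):
    """Return 1-based average ranks of xs, assigning tied values their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # Extend j to cover the run of tied values starting at i.
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: the Pearson correlation computed on the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical data: citation counts and replication success (1 = replicated).
citations  = [3, 150, 12, 90, 7, 40, 210, 5]
replicated = [1, 0, 1, 0, 1, 1, 0, 1]

rho = spearman(citations, replicated)
print(f"Spearman rho = {rho:.2f}")
```

A rho near zero (or negative, as in this contrived sample) is the pattern the paper reports: citation counts carrying little to no signal about whether a result replicates. In practice one would use `scipy.stats.spearmanr`, which also returns a p-value; the hand-rolled version above just keeps the arithmetic visible.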


Most cited references (67)


          brms: An R Package for Bayesian Multilevel Models Using Stan


Estimating the reproducibility of psychological science.

            Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.

The Big Five personality dimensions and job performance: a meta-analysis


                Author and article information

Contributors
Author 1 roles: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Resources, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing
Author 2 roles: Conceptualization, Formal analysis, Investigation, Methodology, Project administration, Validation, Visualization, Writing – original draft, Writing – review & editing
Journal
Royal Society Open Science (R Soc Open Sci), The Royal Society
ISSN: 2054-5703
Published: August 17, 2022 (August 2022 issue)
Volume: 9
Issue: 8
Article number: 220334
Affiliations
[1] Department of Psychology, University of Maryland, College Park, MD, USA
[2] Department of Psychology, University of Edinburgh, Edinburgh, UK
                Author information
                http://orcid.org/0000-0001-9547-1937
                http://orcid.org/0000-0001-6629-2040
Article
rsos220334
DOI: 10.1098/rsos.220334
PMC: 9382220
PMID: 35991336
                © 2022 The Authors.

                Published by the Royal Society under the terms of the Creative Commons Attribution License http://creativecommons.org/licenses/by/4.0/, which permits unrestricted use, provided the original author and source are credited.

History
Received: March 21, 2022
Accepted: July 22, 2022
                Categories
                Psychology and Cognitive Neuroscience
                Research Articles

