
      Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level

      research-article


          Abstract

          Despite their recognized limitations, bibliometric assessments of scientific productivity have been widely adopted. We describe here an improved method to quantify the influence of a research article by making novel use of its co-citation network to field-normalize the number of citations it has received. Article citation rates are divided by an expected citation rate that is derived from performance of articles in the same field and benchmarked to a peer comparison group. The resulting Relative Citation Ratio is article level and field independent and provides an alternative to the invalid practice of using journal impact factors to identify influential papers. To illustrate one application of our method, we analyzed 88,835 articles published between 2003 and 2010 and found that the National Institutes of Health awardees who authored those papers occupy relatively stable positions of influence across all disciplines. We demonstrate that the values generated by this method strongly correlate with the opinions of subject matter experts in biomedical research and suggest that the same approach should be generally applicable to articles published in all areas of science. A beta version of iCite, our web tool for calculating Relative Citation Ratios of articles listed in PubMed, is available at https://icite.od.nih.gov.
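The calculation described in the abstract can be sketched with synthetic numbers. This is a minimal illustration, not iCite's implementation: the function names and the tiny benchmark below are invented for the example, and in the real method the expected citation rate comes from a regression over the NIH R01-funded benchmark set, with each co-cited paper's field citation rate derived from the full PubMed co-citation network.

```python
# Toy sketch of the RCR idea: observed citation rate divided by an
# expected rate predicted from the article's field, where "field" is
# defined by the papers co-cited alongside it.

def article_citation_rate(citations, years):
    """ACR: citations accrued per year since publication."""
    return citations / years

def field_citation_rate(cocited_journal_rates):
    """FCR: mean journal citation rate over the co-cited papers."""
    return sum(cocited_journal_rates) / len(cocited_journal_rates)

def fit_benchmark(benchmark):
    """Least-squares fit of benchmark (FCR, ACR) pairs, giving the
    expected-citation-rate line ECR = slope * FCR + intercept."""
    n = len(benchmark)
    mean_f = sum(f for f, _ in benchmark) / n
    mean_a = sum(a for _, a in benchmark) / n
    sxx = sum((f - mean_f) ** 2 for f, _ in benchmark)
    sxy = sum((f - mean_f) * (a - mean_a) for f, a in benchmark)
    slope = sxy / sxx
    return slope, mean_a - slope * mean_f

def rcr(acr, fcr, slope, intercept):
    """RCR: observed rate over the rate expected for the field."""
    return acr / (slope * fcr + intercept)
```

Under this toy benchmark, a paper cited 27 times over 3 years (ACR = 9) whose co-citation field predicts an ECR of 6 would receive an RCR of 1.5, i.e., it is cited half again as often as comparable benchmark papers in its field.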

          Summary

          A new article-level metric, the Relative Citation Ratio, provides an alternative to the use of journal impact factors as a means of identifying influential papers.

          Author Summary

          Academic researchers convey their discoveries to the scientific community by publishing papers in scholarly journals. In the biomedical sciences alone, this process now generates more than one million new reports each year. The sheer volume of available information, together with the increasing specialization of many scientists, has contributed to the adoption of metrics, including journal impact factor and h-index, as signifiers of a researcher’s productivity or the significance of his or her work. Scientists and administrators agree that the use of these metrics is problematic, but in spite of this strong consensus, such judgments remain common practice, suggesting the need for a valid alternative. We describe here an improved method to quantify the influence of a research article by making novel use of its co-citation network—that is, the other papers that appear alongside it in reference lists—to field-normalize the number of times it has been cited, generating a Relative Citation Ratio (RCR). Since choosing to cite is the long-standing way in which scholars acknowledge the relevance of each other’s work, RCR can provide valuable supplemental information, either to decision makers at funding agencies or to others who seek to understand the relative outcomes of different groups of research investments.

          Related collections

          Most cited references (34)


          The history and meaning of the journal impact factor.


            Experimental study of inequality and unpredictability in an artificial cultural market.

            Hit songs, books, and movies are many times more successful than average, suggesting that "the best" alternatives are qualitatively different from "the rest"; yet experts routinely fail to predict which products will succeed. We investigated this paradox experimentally, by creating an artificial "music market" in which 14,341 participants downloaded previously unknown songs either with or without knowledge of previous participants' choices. Increasing the strength of social influence increased both inequality and unpredictability of success. Success was also only partly determined by quality: The best songs rarely did poorly, and the worst rarely did well, but any other result was possible.

              Quantifying Long-Term Scientific Impact

              The lack of predictability of citation-based measures frequently used to gauge impact, from impact factors to short-term citations, raises a fundamental question: Is there long-term predictability in citation patterns? Here, we derive a mechanistic model for the citation dynamics of individual papers, allowing us to collapse the citation histories of papers from different journals and disciplines into a single curve, indicating that all papers tend to follow the same universal temporal pattern. The observed patterns not only help us uncover basic mechanisms that govern scientific impact but also offer reliable measures of influence that may have potential policy implications.
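For orientation, the mechanistic model that abstract describes is commonly written as the following citation curve; the formula is recalled here from the published paper ("Quantifying Long-Term Scientific Impact," Science, 2013), not quoted from the abstract above, so treat the exact notation as approximate:

$$ c_i(t) = m\left[\exp\!\left(\lambda_i\,\Phi\!\left(\frac{\ln t - \mu_i}{\sigma_i}\right)\right) - 1\right] $$

where $\Phi$ is the standard normal cumulative distribution, $\lambda_i$ captures the paper's fitness, $\mu_i$ its immediacy, $\sigma_i$ its longevity, and $m$ is a global constant; rescaling by these per-paper parameters collapses citation histories onto a single universal curve.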

                Author and article information

                Contributors
                Role: Academic Editor
                Journal
                PLoS Biology (PLoS Biol)
                Public Library of Science (San Francisco, CA, USA)
                ISSN: 1544-9173 (print); 1545-7885 (electronic)
                6 September 2016
                Volume 14, Issue 9, e1002541
                Affiliations
                [1 ]Office of Portfolio Analysis, National Institutes of Health, Bethesda, Maryland, United States of America
                [2 ]Division of Program Coordination, Planning, and Strategic Initiatives, National Institutes of Health, Bethesda, Maryland, United States of America
                Walter and Eliza Hall Institute of Medical Research, Australia
                Author notes

                Since the authors work in the Division of Program Coordination, Planning, and Strategic Initiatives at the National Institutes of Health, this work could have policy implications for how research portfolios are evaluated.

                • Conceived and designed the experiments: BIH GMS JMA.

                • Performed the experiments: BIH XY GMS.

                • Analyzed the data: BIH GMS.

                • Contributed reagents/materials/analysis tools: BIH XY.

                • Wrote the paper: BIH XY JMA GMS.

                Author information
                http://orcid.org/0000-0003-2765-7996
                Article
                Manuscript: PBIOLOGY-D-15-03348
                DOI: 10.1371/journal.pbio.1002541
                PMCID: PMC5012559
                PMID: 27599104

                This is an open access article, free of all copyright, and may be freely reproduced, distributed, transmitted, modified, built upon, or otherwise used by anyone for any lawful purpose. The work is made available under the Creative Commons CC0 public domain dedication.

                History
                Received: 7 December 2015
                Accepted: 1 August 2016
                Page count
                Figures: 8, Tables: 2, Pages: 25
                Funding
                The authors received no specific funding for this work but are all employees of the U.S. National Institutes of Health.
                Categories
                Meta-Research Article
                Research and Analysis Methods > Research Assessment > Bibliometrics
                People and Places > Population Groupings > Professions > Scientists
                Science Policy > Research Funding > Research Grants
                Science Policy > Research Funding
                Research and Analysis Methods > Mathematical and Statistical Techniques > Statistical Methods > Regression Analysis > Linear Regression Analysis
                Physical Sciences > Mathematics > Statistics (Mathematics) > Statistical Methods > Regression Analysis > Linear Regression Analysis
                Physical Sciences > Mathematics > Probability Theory > Statistical Distributions
                Biology and Life Sciences > Microbiology > Medical Microbiology > Microbiome
                Biology and Life Sciences > Genetics > Genomics > Microbial Genomics > Microbiome
                Biology and Life Sciences > Microbiology > Microbial Genomics > Microbiome
                Biology and Life Sciences > Biochemistry
                Custom metadata
                We have developed a web tool (iCite) that calculates RCR and provides associated metrics at https://icite.od.nih.gov; the underlying data for all of the figures and the full code for generating the iCite database and tool are posted to http://github.com/NIHOPA.

                Life sciences
