
      The Global Burden of Journal Peer Review in the Biomedical Literature: Strong Imbalance in the Collective Enterprise

      research-article


          Abstract

          The growth in scientific production may threaten the capacity of the scientific community to handle the ever-increasing demand for peer review of scientific publications. There is little evidence regarding the sustainability of the peer-review system and how the scientific community copes with the burden it poses. We used mathematical modeling to estimate the overall annual demand for peer review and the corresponding supply in biomedical research. The modeling was informed by empirical data from various sources in the biomedical domain, including all articles indexed in MEDLINE. We found that for 2015, across a range of scenarios, the supply of reviewers and reviews exceeded the demand by 15% to 249%. However, 20% of researchers performed 69% to 94% of the reviews. Among researchers actually contributing to peer review, 70% dedicated 1% or less of their research work-time to peer review, while 5% dedicated 13% or more of it. An estimated 63.4 million hours were devoted to peer review in 2015, of which 18.9 million hours were provided by the top 5% of contributing reviewers. Our results suggest that the system is sustainable in terms of volume but reveal a considerable imbalance in the distribution of the peer-review effort across the scientific community. Finally, various individual interactions between authors, editors, and reviewers may reduce to some extent the number of reviewers available to editors at any point.
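          The headline figures above can be sanity-checked with simple arithmetic. The following sketch uses only the numbers stated in the abstract (total hours, top-5% hours, and the reported range of excess supply); the derived share and ratios are illustrative back-of-the-envelope calculations, not part of the authors' model.

```python
# Back-of-the-envelope check of the figures reported in the abstract.

total_hours = 63.4e6   # estimated hours devoted to peer review in 2015
top5_hours = 18.9e6    # hours provided by the top 5% of contributing reviewers

# Share of the total effort carried by the top 5% of reviewers.
top5_share = top5_hours / total_hours
print(f"Top 5% of reviewers supplied {top5_share:.1%} of all review hours")

# "Supply exceeded demand by 15% to 249%" corresponds to
# supply/demand ratios between 1.15 and 3.49 across scenarios.
for excess in (0.15, 2.49):
    print(f"excess of {excess:.0%} -> supply/demand ratio of {1 + excess:.2f}")
```

So roughly 30% of all review hours came from 5% of the reviewers, which is the imbalance the title refers to.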


          Most cited references

          Measuring the effectiveness of scientific gatekeeping.

          Peer review is the main institution responsible for the evaluation and gestation of scientific research. Although peer review is widely seen as vital to scientific evaluation, anecdotal evidence abounds of gatekeeping mistakes in leading journals, such as rejecting seminal contributions or accepting mediocre submissions. Systematic evidence regarding the effectiveness (or lack thereof) of scientific gatekeeping is scant, largely because access to manuscripts rejected by journals is rarely available. Using a dataset of 1,008 manuscripts submitted to three elite medical journals, we show differences in citation outcomes for articles that received different appraisals from editors and peer reviewers. Among rejected articles, desk-rejected manuscripts, deemed unworthy of peer review by editors, received fewer citations than those sent for peer review. Among both rejected and accepted articles, manuscripts with lower scores from peer reviewers received relatively fewer citations when they were eventually published. However, hindsight reveals numerous questionable gatekeeping decisions. Of the 808 eventually published articles in our dataset, our three focal journals rejected many highly cited manuscripts, including the 14 most popular, roughly the top 2 percent. Of those 14 articles, 12 were desk-rejected. This finding raises concerns about whether peer review is ill-suited to recognize and gestate the most impactful ideas and research. Despite this finding, results show that in our case studies, on the whole, peer review added value. Editors and peer reviewers generally, but not always, made good decisions regarding the identification and promotion of quality in scientific manuscripts.

            Emerging trends in peer review—a survey

            “Classical peer review” has been subject to intense criticism for slowing down the publication process, bias against specific categories of paper and author, unreliability, inability to detect errors and fraud, unethical practices, and the lack of recognition for unpaid reviewers. This paper surveys innovative forms of peer review that attempt to address these issues. Based on an initial literature review, we construct a sample of 82 channels of scientific communication covering all forms of review identified by the survey, and analyze the review mechanisms used by each channel. We identify two major trends: the rapidly expanding role of preprint servers (e.g., arXiv) that dispense with traditional peer review altogether, and the growth of “non-selective review,” focusing on papers' scientific quality rather than their perceived importance and novelty. Other potentially important developments include forms of “open review,” which remove reviewer anonymity, and interactive review, as well as new mechanisms for post-publication review and out-of-channel reader commentary, especially critical commentary targeting high-profile papers. One of the strongest findings of the survey is the persistence of major differences between the peer review processes used by different disciplines. None of these differences is likely to disappear in the foreseeable future. The most likely scenario for the coming years is thus continued diversification, in which different review mechanisms serve different author, reader, and publisher needs. Relatively little is known about the impact of these innovations on the problems they address. These are important questions for future quantitative research.

              Measuring the quality of editorial peer review.

              The quality of a process can only be tested against its agreed objectives. Editorial peer review is widely used, yet there appears to be little agreement about how to measure its effects or processes. To identify outcome measures used to assess editorial peer review as performed by biomedical journals, we analyzed studies identified from 2 systematic reviews that measured the effects of editorial peer review on the quality of the output (ie, published articles) or of the process itself (eg, reviewers' comments). Ten studies used a variety of instruments to assess the quality of articles that had undergone peer review. Only 1 nonrandomized study compared the quality of articles published in peer-reviewed and non-peer-reviewed journals. The others measured the effects of variations in the peer-review process or used a before-and-after design to measure the effects of standard peer review on accepted articles. Eighteen studies measured the quality of reviewers' reports under different conditions, such as blinding or after training. One study compared the time and cost of different review processes. Until we have properly defined the objectives of peer review, it will remain almost impossible to assess or improve its effectiveness. The research needed to understand the broader effects of peer review poses many methodologic problems and would require the cooperation of many parts of the scientific community.

                Author and article information

                Contributors
                Role: Editor
                Journal
                PLoS ONE
                Public Library of Science (San Francisco, CA, USA)
                ISSN: 1932-6203
                10 November 2016
                Volume 11, Issue 11: e0166387
                Affiliations
                [1 ]INSERM U1153, Paris, France
                [2 ]Université Paris Descartes–Sorbonne Paris cité, Paris, France
                [3 ]Assistance Publique-Hôpitaux de Paris, Hôpital Hôtel-Dieu, Centre d’Epidémiologie Clinique, Paris, France
                [4 ]Cochrane France, Paris, France
                [5 ]Department of Epidemiology, Columbia University Mailman School of Public Health, New York, New York, United States of America
                GERMANY
                Author notes

                Competing Interests: The authors have declared that no competing interests exist.

                • Conceptualization: MK RP PR LT.

                • Data curation: MK.

                • Formal analysis: MK.

                • Funding acquisition: PR.

                • Investigation: MK LT.

                • Methodology: MK RP PR LT.

                • Project administration: PR.

                • Resources: PR.

                • Software: MK.

                • Supervision: RP LT.

                • Validation: RP PR LT.

                • Visualization: MK LT.

                • Writing – original draft: MK LT.

                • Writing – review & editing: RP PR.

                Author information
                ORCID: http://orcid.org/0000-0001-7783-9769
                Article
                PONE-D-16-27699
                DOI: 10.1371/journal.pone.0166387
                PMC: 5104353
                PMID: 27832157
                © 2016 Kovanis et al

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

                History
                Received: 11 July 2016
                Accepted: 27 October 2016
                Page count
                Figures: 3, Tables: 0, Pages: 14
                Funding
                Funded by: Sorbonne Paris Cité
                Award recipient: Michail Kovanis
                Michail Kovanis is the recipient of a PhD grant from Sorbonne Paris Cité.
                Categories
                Research Article
                Research and Analysis Methods
                Research Assessment
                Peer Review
                Social Sciences
                Economics
                Economic Models
                Supply and Demand
                Ecology and Environmental Sciences
                Sustainability Science
                Research and Analysis Methods
                Simulation and Modeling
                Mathematical Modeling
                Computer and Information Sciences
                Information Technology
                Natural Language Processing
                Named Entity Recognition
                Entity Disambiguation
                Biology and Life Sciences
                Population Biology
                Population Dynamics
                Geographic Distribution
                Science Policy
                Open Science
                Open Access
                Research and Analysis Methods
                Scientific Publishing
                Publication Practices
                Open Access
                People and Places
                Population Groupings
                Professions
                Scientists
                Custom metadata
                All data and analytical results can be found in the accompanying Excel file: http://www.clinicalepidemio.fr/peerreview_burden/. All code is available on GitHub: https://github.com/kovanostra/global-burden-of-peer-review.

