
      A Reliability-Generalization Study of Journal Peer Reviews: A Multilevel Meta-Analysis of Inter-Rater Reliability and Its Determinants

Lutz Bornmann 1 , * , Rüdiger Mutz 2 , Hans-Dieter Daniel 2 , 3

      PLoS ONE

      Public Library of Science


          Abstract

          Background

This paper presents the first meta-analysis of the inter-rater reliability (IRR) of journal peer reviews. IRR is defined as the extent to which two or more independent reviews of the same scientific document agree.
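For readers unfamiliar with IRR, the following minimal sketch computes Cohen's kappa, one of the agreement measures pooled later in the paper, for two reviewers judging the same manuscripts. The reviewer decisions below are invented for illustration; the function is the textbook kappa formula, not code from the study.

```python
import numpy as np

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters with categorical decisions (e.g., accept/reject)."""
    a = np.asarray(ratings_a)
    b = np.asarray(ratings_b)
    categories = np.union1d(a, b)

    # Observed agreement: proportion of identical decisions
    p_o = np.mean(a == b)

    # Expected agreement if the two raters decided independently
    p_e = sum(np.mean(a == c) * np.mean(b == c) for c in categories)

    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical decisions of two reviewers on the same 10 manuscripts
reviewer_1 = ["accept", "reject", "reject", "accept", "reject",
              "accept", "reject", "reject", "accept", "reject"]
reviewer_2 = ["accept", "reject", "accept", "reject", "reject",
              "accept", "reject", "accept", "accept", "reject"]

print(f"kappa = {cohens_kappa(reviewer_1, reviewer_2):.2f}")
```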

          Methodology/Principal Findings

Altogether, 70 reliability coefficients (Cohen's Kappa, intra-class correlation [ICC], and Pearson product-moment correlation [r]) from 48 studies were included in the meta-analysis. The studies were based on a total of 19,443 manuscripts; on average, each study had a sample size of 311 manuscripts (minimum: 28, maximum: 1,983). The results of the meta-analysis confirmed the findings of the narrative literature reviews published to date: the level of IRR was low (mean ICC/r² = .34, mean Cohen's Kappa = .17). To explain the study-to-study variation of the IRR coefficients, meta-regression analyses were calculated using seven covariates. Two covariates emerged as statistically significant in the meta-regression analyses (conducted to obtain approximate homogeneity of the intra-class correlations): first, the more manuscripts a study is based on, the smaller the reported IRR coefficients; second, studies that reported the rating system used by reviewers were associated with smaller IRR coefficients than studies that did not report it.
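As a rough illustration of the kind of pooling described above, the sketch below combines correlation-type coefficients (ICC or r) under a DerSimonian-Laird random-effects model after Fisher-z transformation. The input values are hypothetical, and this is a minimal sketch of the general approach rather than the multilevel model actually used in the paper.

```python
import numpy as np

def pool_correlations_random_effects(r, n):
    """DerSimonian-Laird random-effects pooling of correlation-type coefficients
    via Fisher's z transformation (a standard textbook approach)."""
    r = np.asarray(r, dtype=float)
    n = np.asarray(n, dtype=float)

    z = np.arctanh(r)              # Fisher z transform of each coefficient
    v = 1.0 / (n - 3.0)            # approximate within-study variance of z
    w = 1.0 / v                    # fixed-effect (inverse-variance) weights

    # Fixed-effect estimate and Q statistic for heterogeneity
    z_fixed = np.sum(w * z) / np.sum(w)
    Q = np.sum(w * (z - z_fixed) ** 2)
    df = len(z) - 1

    # DerSimonian-Laird between-study variance tau^2
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)

    # Random-effects weights, pooled estimate, and 95% CI back-transformed to r
    w_star = 1.0 / (v + tau2)
    z_random = np.sum(w_star * z) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    ci = np.tanh([z_random - 1.96 * se, z_random + 1.96 * se])
    return np.tanh(z_random), ci, tau2

# Hypothetical per-study ICC/r values and manuscript counts (not the paper's data)
r_values = [0.21, 0.34, 0.40, 0.28, 0.55]
n_manuscripts = [120, 311, 75, 500, 60]
pooled, ci, tau2 = pool_correlations_random_effects(r_values, n_manuscripts)
print(f"pooled r = {pooled:.2f}, 95% CI = [{ci[0]:.2f}, {ci[1]:.2f}], tau^2 = {tau2:.3f}")
```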

          Conclusions/Significance

Studies that report a high level of IRR are to be considered less credible than those with a low level of IRR. According to our meta-analysis, the IRR of peer assessments is quite limited and needs improvement (e.g., reader system).

Most cited references (115)

Rothstein HR

Smith R (2006) Peer review: a flawed process at the heart of science and journals.

Advanced methods in meta-analysis: multivariate approach and meta-regression.

This tutorial on advanced statistical methods for meta-analysis can be seen as a sequel to the recent Tutorial in Biostatistics on meta-analysis by Normand, which focused on elementary methods. Within the framework of the general linear mixed model using approximate likelihood, we discuss methods to analyse univariate as well as bivariate treatment effects in meta-analyses as well as meta-regression methods. Several extensions of the models are discussed, like exact likelihood, non-normal mixtures and multiple endpoints. We end with a discussion about the use of Bayesian methods in meta-analysis. All methods are illustrated by a meta-analysis concerning the efficacy of BCG vaccine against tuberculosis. All analyses that use approximate likelihood can be carried out by standard software. We demonstrate how the models can be fitted using SAS Proc Mixed. Copyright 2002 John Wiley & Sons, Ltd.
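The meta-regression referenced in the tutorial above (and used in the main analysis to relate IRR coefficients to study-level covariates) can be sketched, in a much simplified form, as a weighted least-squares fit of Fisher-z-transformed coefficients on a covariate such as log sample size. The function and data below are hypothetical illustrations, not the mixed-model estimation described in the tutorial or in the paper.

```python
import numpy as np

def meta_regression(r, n, x, tau2=0.0):
    """Weighted least-squares meta-regression of Fisher-z-transformed coefficients
    on a single study-level covariate x (simplified sketch)."""
    z = np.arctanh(np.asarray(r, dtype=float))      # Fisher z per study
    v = 1.0 / (np.asarray(n, dtype=float) - 3.0)    # within-study variance of z
    w = 1.0 / (v + tau2)                            # inverse-variance weights

    X = np.column_stack([np.ones_like(z), x])       # intercept + covariate
    W = np.diag(w)
    # Weighted least squares: beta = (X' W X)^-1 X' W z
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ z)
    cov_beta = np.linalg.inv(X.T @ W @ X)           # covariance of beta for known weights
    se = np.sqrt(np.diag(cov_beta))
    return beta, se

# Hypothetical data: per-study IRR coefficients, manuscript counts,
# and log sample size as the covariate ("larger study -> lower IRR?")
r_values = np.array([0.21, 0.34, 0.40, 0.28, 0.55])
n_manuscripts = np.array([120, 311, 75, 500, 60])
log_n = np.log(n_manuscripts)

beta, se = meta_regression(r_values, n_manuscripts, log_n)
print(f"slope on log(n): {beta[1]:.3f} (SE {se[1]:.3f})")
```

A negative slope on log(n) in such a model would correspond to the paper's finding that studies based on more manuscripts tend to report smaller IRR coefficients.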

                Author and article information

                Contributors
                Role: Editor
Journal
PLoS ONE
Public Library of Science (San Francisco, USA)
ISSN: 1932-6203
Published: 14 December 2010
Volume: 5
Issue: 12
                Affiliations
                [1 ]Max Planck Society, Munich, Germany
                [2 ]Professorship for Social Psychology and Research on Higher Education, ETH Zurich, Zurich, Switzerland
                [3 ]Evaluation Office, University of Zurich, Zurich, Switzerland
                University of Glasgow, United Kingdom
                Author notes

                Conceived and designed the experiments: LB. Performed the experiments: RM. Analyzed the data: RM. Wrote the paper: LB HDD.

Article
Article ID: 10-PONE-RA-17982R1
DOI: 10.1371/journal.pone.0014331
PMCID: PMC3001856
PMID: 21179459
Copyright © 2010 Bornmann et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
                Page count
                Pages: 10
                Categories
                Research Article
                Science Policy
                Mathematics/Statistics
                Science Policy/Education

