
      How to Detect Publication Bias in Psychological Research: A Comparative Evaluation of Six Statistical Methods

      research-article


          Abstract

          Publication biases and questionable research practices are assumed to be two of the main causes of low replication rates. Both of these problems lead to severely inflated effect size estimates in meta-analyses. Methodologists have proposed a number of statistical tools to detect such bias in meta-analytic results. We present an evaluation of the performance of six of these tools. To assess the Type I error rate and the statistical power of these methods, we simulated a large variety of literatures that differed with regard to true effect size, heterogeneity, number of available primary studies, and sample sizes of these primary studies; furthermore, simulated studies were subjected to different degrees of publication bias. Our results show that across all simulated conditions, no method consistently outperformed the others. Additionally, all methods performed poorly when true effect sizes were heterogeneous or primary studies had a small chance of being published, irrespective of their results. This suggests that in many actual meta-analyses in psychology, bias will remain undiscovered no matter which detection method is used.
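          To make the simulation design concrete, here is a minimal Python sketch of one such simulated literature: two-group studies are generated, nonsignificant results are published only with some small probability, and a single detection method is applied to the surviving studies. The parameter values, the selection rule, and the use of Egger's regression test are illustrative assumptions, not the authors' exact design.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def simulate_biased_literature(true_d=0.2, k=40, n_range=(20, 100),
                               pub_prob_nonsig=0.1):
    """Simulate k published two-group studies under publication bias:
    significant results are always published, nonsignificant ones only
    with probability pub_prob_nonsig (illustrative selection rule)."""
    effects, ses = [], []
    while len(effects) < k:
        n = int(rng.integers(*n_range))         # per-group sample size
        g1 = rng.normal(true_d, 1.0, n)         # "treatment" group
        g2 = rng.normal(0.0, 1.0, n)            # control group
        d = (g1.mean() - g2.mean()) / np.sqrt((g1.var(ddof=1) + g2.var(ddof=1)) / 2)
        se = np.sqrt(2 / n + d ** 2 / (4 * n))  # approximate SE of Cohen's d
        _, p = stats.ttest_ind(g1, g2)
        if p < 0.05 or rng.random() < pub_prob_nonsig:
            effects.append(d)
            ses.append(se)
    return np.array(effects), np.array(ses)

def egger_test(effects, ses):
    """Egger's regression test: regress the standardized effect (d / SE)
    on precision (1 / SE); an intercept far from zero signals
    small-study effects such as publication bias."""
    res = stats.linregress(1 / ses, effects / ses)
    return res.intercept, res.intercept_stderr

effects, ses = simulate_biased_literature()
b0, se0 = egger_test(effects, ses)
print(f"Egger intercept: {b0:.2f} (SE {se0:.2f})")

          Repeating this over many simulated literatures with no selection (pub_prob_nonsig = 1.0) estimates the test's Type I error rate; repeating it with selection applied estimates its power, which is how conditions like those in the abstract can be compared.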


          Most cited references (32)


          Measuring inconsistency in meta-analyses.
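          That reference introduces the I² statistic, which quantifies the share of variability in effect estimates that is due to between-study heterogeneity rather than sampling error; the heterogeneity conditions in the present article are typically described in these terms. A minimal sketch of the computation (my illustration, not code from either article):

import numpy as np

def i_squared(effects, ses):
    """I^2 = max(0, (Q - df) / Q) * 100, where Q is Cochran's Q:
    the weighted sum of squared deviations from the fixed-effect mean."""
    effects, ses = np.asarray(effects), np.asarray(ses)
    w = 1 / ses ** 2                          # inverse-variance weights
    pooled = np.sum(w * effects) / np.sum(w)  # fixed-effect pooled estimate
    q = np.sum(w * (effects - pooled) ** 2)   # Cochran's Q
    df = len(effects) - 1
    return 100 * max(0.0, (q - df) / q) if q > 0 else 0.0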


            “Positive” Results Increase Down the Hierarchy of the Sciences

            The hypothesis of a Hierarchy of the Sciences with physical sciences at the top, social sciences at the bottom, and biological sciences in-between is nearly 200 years old. This order is intuitive and reflected in many features of academic life, but whether it reflects the “hardness” of scientific research—i.e., the extent to which research questions and results are determined by data and theories as opposed to non-cognitive factors—is controversial. This study analysed 2434 papers, drawn from all disciplines, that declared to have tested a hypothesis. It was determined how many papers reported “positive” (full or partial) or “negative” support for the tested hypothesis. If the hierarchy hypothesis is correct, then researchers in “softer” sciences should have fewer constraints to their conscious and unconscious biases, and therefore report more positive outcomes. Results confirmed the predictions at all levels considered: discipline, domain and methodology broadly defined. Controlling for observed differences between pure and applied disciplines, and between papers testing one or several hypotheses, the odds of reporting a positive result were around 5 times higher among papers in the disciplines of Psychology and Psychiatry and Economics and Business compared to Space Science, 2.3 times higher in the domain of social sciences compared to the physical sciences, and 3.4 times higher in studies applying behavioural and social methodologies on people compared to physical and chemical studies on non-biological material. In all comparisons, biological studies had intermediate values. These results suggest that the nature of hypotheses tested and the logical and methodological rigour employed to test them vary systematically across disciplines and fields, depending on the complexity of the subject matter and possibly other factors (e.g., a field's level of historical and/or intellectual development). On the other hand, these results support the scientific status of the social sciences against claims that they are completely subjective, by showing that, when they adopt a scientific approach to discovery, they differ from the natural sciences only by a matter of degree.

              p-Curve and Effect Size: Correcting for Publication Bias Using Only Significant Results.

              Journals tend to publish only statistically significant evidence, creating a scientific record that markedly overstates the size of effects. We provide a new tool that corrects for this bias without requiring access to nonsignificant results. It capitalizes on the fact that the distribution of significant p values, p-curve, is a function of the true underlying effect. Researchers armed only with sample sizes and test results of the published findings can correct for publication bias. We validate the technique with simulations and by reanalyzing data from the Many-Labs Replication project. We demonstrate that p-curve can arrive at conclusions opposite those of existing tools by reanalyzing the meta-analysis of the "choice overload" literature.
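              As a rough illustration of the logic underlying p-curve, and not of the effect-size correction procedure the abstract describes: under the null of no effect, significant p values are uniform on (0, .05), so a pile-up of very small p values (right skew) indicates evidential value. A hedged sketch using Fisher's method against the uniform null (function name and example values are my own):

import numpy as np
from scipy import stats

def p_curve_right_skew(p_values, alpha=0.05):
    """Test whether significant p values are right-skewed relative to
    the uniform distribution expected under the null of no effect."""
    pp = np.asarray([p / alpha for p in p_values if p < alpha])
    if pp.size == 0:
        raise ValueError("no significant p values to analyze")
    chi2 = -2 * np.sum(np.log(pp))  # Fisher's method
    return stats.chi2.sf(chi2, df=2 * len(pp))

# Example: mostly very small significant p values suggest a true effect.
print(p_curve_right_skew([0.001, 0.012, 0.003, 0.049, 0.020]))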

                Author and article information

                Journal
                Zeitschrift für Psychologie (zfp)
                Hogrefe Publishing
                ISSN: 2190-8370, 2151-2604
                Published: December 20, 2019
                Volume 227, Issue 4 (Topical Issue: Open Science in Psychology: Progress and Yet Unsolved Problems), pp. 261-279
                Affiliations
                [1] Department of Psychology, University of Erfurt, Germany
                Author notes
                Frank Renkewitz, Department of Psychology, University of Erfurt, Nordhäuser Str. 63, 99089 Erfurt, Germany, frank.renkewitz@uni-erfurt.de
                Article
                Article ID: zfp_227_4_261
                DOI: 10.1027/2151-2604/a000386
                Distributed under the Hogrefe OpenMind License https://doi.org/10.1027/a000001
                History: December 1, 2018; July 9, 2019; August 8, 2019
                Categories
                Original Article

                Subjects: Psychology, General behavioral science
                Keywords: meta-analysis, heterogeneity, bias detection, optional stopping, publication bias
