
      Psychology, Science, and Knowledge Construction: Broadening Perspectives from the Replication Crisis

      Annual Review of Psychology
      Annual Reviews


          Abstract

          Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.
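The call for more sophisticated power analyses can be made concrete with a minimal a priori calculation. The sketch below is a hypothetical illustration, not taken from the article: it uses the standard normal-approximation formula for a two-sided, two-sample t-test to find the per-group sample size needed to detect a given standardized effect size d.

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-sided, two-sample t-test
    detecting standardized mean difference d (normal approximation)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for two-sided test
    z_beta = z.inv_cdf(power)           # quantile for the desired power
    return ceil(2 * (z_alpha + z_beta) ** 2 / d ** 2)

# A medium-small effect (d = 0.4) needs roughly 100 participants per group;
# halving the effect size roughly quadruples the required sample.
print(sample_size_per_group(0.4))  # 99 per group
print(sample_size_per_group(0.2))  # 393 per group
```

The quadratic dependence on d is the point: underestimating the true effect size only modestly can leave a study badly underpowered, which is one mechanism behind low replication rates.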

          Related collections

          Most cited references (64)


          One Hundred Years of Social Psychology Quantitatively Described.


            Puzzlingly High Correlations in fMRI Studies of Emotion, Personality, and Social Cognition.

            Functional magnetic resonance imaging (fMRI) studies of emotion, personality, and social cognition have drawn much attention in recent years, with high-profile studies frequently reporting extremely high (e.g., >.8) correlations between brain activation and personality measures. We show that these correlations are higher than should be expected given the (evidently limited) reliability of both fMRI and personality measures. The high correlations are all the more puzzling because method sections rarely contain much detail about how the correlations were obtained. We surveyed authors of 55 articles that reported findings of this kind to determine a few details on how these correlations were computed. More than half acknowledged using a strategy that computes separate correlations for individual voxels and reports means of only those voxels exceeding chosen thresholds. We show how this nonindependent analysis inflates correlations while yielding reassuring-looking scattergrams. This analysis technique was used to obtain the vast majority of the implausibly high correlations in our survey sample. In addition, we argue that, in some cases, other analysis problems likely created entirely spurious correlations. We outline how the data from these studies could be reanalyzed with unbiased methods to provide accurate estimates of the correlations in question and urge authors to perform such reanalyses. The underlying problems described here appear to be common in fMRI research of many kinds, not just in studies of emotion, personality, and social cognition.
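The nonindependence problem described above can be reproduced in a toy simulation; this is a hypothetical sketch, not the authors' analysis. With pure noise data, selecting "voxels" whose correlation with a behavioral score exceeds a threshold and then averaging the correlations of only those voxels yields an impressively large mean correlation by construction, while the same voxels show essentially no correlation in fresh, independent data.

```python
import random

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(1)
n_subjects, n_voxels, threshold = 16, 5000, 0.6

# Pure noise: no voxel truly correlates with the behavioral score.
behavior = [random.gauss(0, 1) for _ in range(n_subjects)]
voxels = [[random.gauss(0, 1) for _ in range(n_subjects)]
          for _ in range(n_voxels)]

rs = [pearson_r(v, behavior) for v in voxels]
selected = [i for i, r in enumerate(rs) if abs(r) > threshold]

# Nonindependent estimate: mean |r| of voxels chosen on the SAME data.
inflated = sum(abs(rs[i]) for i in selected) / len(selected)

# Independent check: the same voxel indices in fresh noise data.
behavior2 = [random.gauss(0, 1) for _ in range(n_subjects)]
voxels2 = [[random.gauss(0, 1) for _ in range(n_subjects)]
           for _ in range(n_voxels)]
unbiased = sum(pearson_r(voxels2[i], behavior2) for i in selected) / len(selected)

print(f"selected {len(selected)} voxels; "
      f"nonindependent mean |r| = {inflated:.2f}, independent mean r = {unbiased:.2f}")
```

Because every selected voxel passed the threshold on the same data used to compute the average, the reported correlation can never be smaller than the threshold itself, which is exactly the circularity the paper diagnoses.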

              An Agenda for Purely Confirmatory Research.

              The veracity of substantive research claims hinges on the way experimental data are collected and analyzed. In this article, we discuss an uncomfortable fact that threatens the core of psychology's academic enterprise: almost without exception, psychologists do not commit themselves to a method of data analysis before they see the actual data. It then becomes tempting to fine-tune the analysis to the data in order to obtain a desired result, a procedure that invalidates the interpretation of the common statistical tests. The extent of the fine-tuning varies widely across experiments and experimenters but is almost impossible for reviewers and readers to gauge. To remedy the situation, we propose that researchers preregister their studies and indicate in advance the analyses they intend to conduct. Only these analyses deserve the label "confirmatory," and only for these analyses are the common statistical tests valid. Other analyses can be carried out but these should be labeled "exploratory." We illustrate our proposal with a confirmatory replication attempt of a study on extrasensory perception.
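The inflation produced by unregistered analytic flexibility can be quantified with a small Monte Carlo sketch; this is a hypothetical illustration under simplified assumptions, not the paper's own demonstration. If a researcher can run k independent tests on null data and report whichever comes out significant, the effective false-positive rate for independent looks is 1 - 0.95^k, far above the nominal 5%.

```python
import random
from statistics import NormalDist

random.seed(7)
n_experiments, n_per_test, k_analyses = 2000, 30, 5
crit = NormalDist().inv_cdf(0.975)  # two-sided z critical value, alpha = .05

false_positives = 0
for _ in range(n_experiments):
    # Analytic flexibility modeled as k independent null outcomes,
    # any one of which may be reported if it reaches significance.
    for _ in range(k_analyses):
        sample = [random.gauss(0, 1) for _ in range(n_per_test)]
        mean = sum(sample) / n_per_test
        z = mean * n_per_test ** 0.5  # z-test against mu = 0, known sigma = 1
        if abs(z) > crit:
            false_positives += 1
            break

rate = false_positives / n_experiments
print(f"Nominal alpha: 0.05, observed familywise rate: {rate:.3f}")
# With 5 independent looks the rate approaches 1 - 0.95**5, about 0.226.
```

Real analytic choices (covariates, exclusions, outcome switching) are usually correlated rather than independent, so the inflation is smaller than this bound, but the direction is the same: only the preregistered analysis keeps the nominal error rate.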

                Author and article information

                Journal
                Annual Review of Psychology (Annu. Rev. Psychol.)
                Annual Reviews
                ISSN: 0066-4308 (print); 1545-2085 (electronic)
                Published: January 04 2018
                Volume: 69, Issue: 1, Pages: 487-510
                Affiliations
                [1 ]Department of Psychology, New York University, New York, New York 10003;
                [2 ]Department of Psychology and Human Development, Peabody College, Vanderbilt University, Nashville, Tennessee 37205;
                Article
                DOI: 10.1146/annurev-psych-122216-011845
                PMID: 29300688
                © 2018

                Sociology, Social policy & Welfare, Political science, Psychology, Development studies, Public health
