
          Abstract

          Progress in science relies in part on generating hypotheses with existing observations and testing hypotheses with new observations. This distinction between postdiction and prediction is appreciated conceptually but is not respected in practice. Mistaking generation of postdictions with testing of predictions reduces the credibility of research findings. However, ordinary biases in human reasoning, such as hindsight bias, make it hard to avoid this mistake. An effective solution is to define the research questions and analysis plan before observing the research outcomes—a process called preregistration. Preregistration distinguishes analyses and outcomes that result from predictions from those that result from postdictions. A variety of practical strategies are available to make the best possible use of preregistration in circumstances that fall short of the ideal application, such as when the data are preexisting. Services are now available for preregistration across all disciplines, facilitating a rapid increase in the practice. Widespread adoption of preregistration will increase distinctiveness between hypothesis generation and hypothesis testing and will improve the credibility of research findings.

          Related collections

Most cited references (35)


The ASA's Statement on p-Values: Context, Process, and Purpose


            Strong Inference: Certain systematic methods of scientific thinking may produce much more rapid progress than others.


              An Agenda for Purely Confirmatory Research.

The veracity of substantive research claims hinges on the way experimental data are collected and analyzed. In this article, we discuss an uncomfortable fact that threatens the core of psychology's academic enterprise: almost without exception, psychologists do not commit themselves to a method of data analysis before they see the actual data. It then becomes tempting to fine-tune the analysis to the data in order to obtain a desired result, a procedure that invalidates the interpretation of the common statistical tests. The extent of the fine-tuning varies widely across experiments and experimenters but is almost impossible for reviewers and readers to gauge. To remedy the situation, we propose that researchers preregister their studies and indicate in advance the analyses they intend to conduct. Only these analyses deserve the label "confirmatory," and only for these analyses are the common statistical tests valid. Other analyses can be carried out, but these should be labeled "exploratory." We illustrate our proposal with a confirmatory replication attempt of a study on extrasensory perception.

                Author and article information

Journal
Proceedings of the National Academy of Sciences of the United States of America (PNAS)
National Academy of Sciences
ISSN: 0027-8424 (print); 1091-6490 (electronic)
Published: 13 March 2018
Volume: 115
Issue: 11
Pages: 2600-2606
                Affiliations
[1] Center for Open Science, Charlottesville, VA 22903;
[2] Department of Psychology, University of Virginia, Charlottesville, VA 22904
                Author notes
1. To whom correspondence should be addressed. Email: nosek@virginia.edu.

                Edited by Richard M. Shiffrin, Indiana University, Bloomington, IN, and approved August 28, 2017 (received for review June 15, 2017)

                Author contributions: B.A.N. designed research; B.A.N. performed research; and B.A.N., C.R.E., A.C.D., and D.T.M. wrote the paper.

                Author information
                http://orcid.org/0000-0001-6797-5476
                http://orcid.org/0000-0002-8607-2579
                http://orcid.org/0000-0002-2241-7259
                http://orcid.org/0000-0002-3125-5888
Article
PMCID: PMC5856500
DOI: 10.1073/pnas.1708274114
PMID: 29531091
Copyright © 2018

                Published under the PNAS license.

                Page count
                Pages: 7
                Funding
                Funded by: Laura and John Arnold Foundation (LJAF) 100009827
                Award ID: none
Funded by: National Institute on Aging
                Award ID: R24AG048124
Categories
Sackler Colloquium on Improving the Reproducibility of Scientific Research
Colloquium Papers
Social Sciences
Psychological and Cognitive Sciences

preregistration, exploratory analysis, confirmatory analysis, open science, methodology
