
      The influence of journal submission guidelines on authors' reporting of statistics and use of open research practices


          Abstract

          From January 2014, Psychological Science introduced new submission guidelines that encouraged the use of effect sizes, estimation, and meta-analysis (the “new statistics”), required extra detail of methods, and offered badges for use of open science practices. We investigated the use of these practices in empirical articles published by Psychological Science and, for comparison, by the Journal of Experimental Psychology: General, during the period of January 2013 to December 2015. The use of null hypothesis significance testing (NHST) was extremely high at all times and in both journals. In Psychological Science, the use of confidence intervals increased markedly overall, from 28% of articles in 2013 to 70% in 2015, as did the availability of open data (3 to 39%) and open materials (7 to 31%). The other journal showed smaller or much smaller changes. Our findings suggest that journal-specific submission guidelines may encourage desirable changes in authors’ practices.


          Most cited references (21)


          jsPsych: a JavaScript library for creating behavioral experiments in a Web browser.

          Online experiments are growing in popularity, and the increasing sophistication of Web technology has made it possible to run complex behavioral experiments online using only a Web browser. Unlike with offline laboratory experiments, however, few tools exist to aid in the development of browser-based experiments. This makes the process of creating an experiment slow and challenging, particularly for researchers who lack a Web development background. This article introduces jsPsych, a JavaScript library for the development of Web-based experiments. jsPsych formalizes a way of describing experiments that is much simpler than writing the entire experiment from scratch. jsPsych then executes these descriptions automatically, handling the flow from one task to another. The jsPsych library is open-source and designed to be expanded by the research community. The project is available online at www.jspsych.org.
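
          As a purely illustrative sketch of the declarative style described above (not taken from the article), the snippet below builds a two-trial timeline. It assumes the jsPsych 6.x API (jsPsych.init) and the html-keyboard-response plugin are loaded via script tags; the trial contents are hypothetical.

          // Minimal sketch, assuming jsPsych 6.x and the html-keyboard-response
          // plugin are already loaded in the page via <script> tags.
          var welcome = {
            type: 'html-keyboard-response',
            stimulus: 'Press any key to begin.'
          };
          var decision = {
            type: 'html-keyboard-response',
            stimulus: '<p>Press F for blue, J for orange.</p>',
            choices: ['f', 'j']            // restrict responses to two keys
          };
          // jsPsych executes the declarative timeline, handling trial-to-trial flow.
          jsPsych.init({
            timeline: [welcome, decision],
            on_finish: function () {
              jsPsych.data.displayData();  // show the collected data in the browser
            }
          });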

            Badges to Acknowledge Open Practices: A Simple, Low-Cost, Effective Method for Increasing Transparency

            Beginning January 2014, Psychological Science gave authors the opportunity to signal open data and materials if they qualified for badges that accompanied published articles. Before badges, less than 3% of Psychological Science articles reported open data. After badges, 23% reported open data, with an accelerating trend; 39% reported open data in the first half of 2015, an increase of more than an order of magnitude from baseline. There was no change over time in the low rates of data sharing among comparison journals. Moreover, reporting openness does not guarantee openness. When badges were earned, reportedly available data were more likely to be actually available, correct, usable, and complete than when badges were not earned. Open materials also increased to a weaker degree, and there was more variability among comparison journals. Badges are simple, effective signals to promote open practices and improve preservation of data and materials by using independent repositories.

              A Vast Graveyard of Undead Theories: Publication Bias and Psychological Science's Aversion to the Null.

              Publication bias remains a controversial issue in psychological science. The tendency of psychological science to avoid publishing null results produces a situation that limits the replicability assumption of science, as replication cannot be meaningful without the potential acknowledgment of failed replications. We argue that the field often constructs arguments to block the publication and interpretation of null results and that null results may be further extinguished through questionable researcher practices. Given that science is dependent on the process of falsification, we argue that these problems reduce psychological science's capability to have a proper mechanism for theory falsification, thus resulting in the promulgation of numerous "undead" theories that are ideologically popular but have little basis in fact.

                Author and article information

                Contributors
                Role: Editor (Tilburg University, Netherlands)
                Journal
                PLoS ONE (ISSN 1932-6203)
                Public Library of Science, San Francisco, CA, USA
                17 April 2017; 12(4): e0175583
                Affiliations
                [1] School of Natural Sciences and Psychology, Liverpool John Moores University, Liverpool, United Kingdom
                [2] School of Psychology and Public Health, La Trobe University, Melbourne, Victoria, Australia
                [3] Department of General Psychology, University of Padova, Padova, Italy
                [4] Institute of Psychology, Health and Society, University of Liverpool, Liverpool, United Kingdom
                Author notes

                Competing Interests: The authors have declared that no competing interests exist.

                • Conceptualization: DG PT GC IB.

                • Data curation: DG PT LF.

                • Formal analysis: DG PT.

                • Investigation: DG PT LF.

                • Methodology: DG PT GC.

                • Project administration: DG PT GC IB.

                • Resources: DG PT.

                • Supervision: DG PT GC IB.

                • Validation: DG PT GC IB.

                • Visualization: DG PT GC IB.

                • Writing – original draft: DG PT GC IB.

                • Writing – review & editing: DG PT GC IB.

                Author information
                http://orcid.org/0000-0002-1992-6271
                Article
                Manuscript ID: PONE-D-16-50903
                DOI: 10.1371/journal.pone.0175583
                PMCID: PMC5393581
                PMID: 28414751
                Record ID: 7626c4ad-4642-4640-9fb4-b7121b57d60c
                © 2017 Giofrè et al

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

                History
                Received: 24 December 2016
                Accepted: 28 March 2017
                Page count
                Figures: 2, Tables: 2, Pages: 15
                Funding
                The author(s) received no specific funding for this work.
                Categories
                Research Article
                Subject areas:
                • Research and Analysis Methods > Mathematical and Statistical Techniques > Statistical Methods > Meta-Analysis
                • Physical Sciences > Mathematics > Statistics (Mathematics) > Statistical Methods > Meta-Analysis
                • Physical Sciences > Mathematics > Statistics (Mathematics) > Confidence Intervals
                • Science Policy > Open Science
                • Biology and Life Sciences > Psychology > Experimental Psychology
                • Social Sciences > Psychology > Experimental Psychology
                • Science Policy > Open Science > Open Data
                • Computer and Information Sciences > Data Management
                • Research and Analysis Methods > Research Assessment > Peer Review
                • Research and Analysis Methods > Research Assessment > Research Reporting Guidelines
                Custom metadata
                Data are from the https://figshare.com/s/dd5756ad2826360e2ebf study, whose authors may be contacted at david.giofre@gmail.com.

