
      Willingness to Share Research Data Is Related to the Strength of the Evidence and the Quality of Reporting of Statistical Results

      research-article
      PLoS ONE
      Public Library of Science


          Abstract

          Background

          The widespread reluctance to share published research data is often hypothesized to be due to the authors' fear that reanalysis may expose errors in their work or may produce conclusions that contradict their own. However, these hypotheses have not previously been studied systematically.

          Methods and Findings

          We related the reluctance to share research data for reanalysis to 1148 statistically significant results reported in 49 papers published in two major psychology journals. We found the reluctance to share data to be associated with weaker evidence (against the null hypothesis of no effect) and a higher prevalence of apparent errors in the reporting of statistical results. The unwillingness to share data was particularly clear when reporting errors had a bearing on statistical significance.

          Conclusions

          Our findings, based on papers in psychology, suggest that statistical results are particularly hard to verify when reanalysis is more likely to lead to contrasting conclusions. This highlights the importance of establishing mandatory data archiving policies.
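
          As a rough, hypothetical illustration of the kind of consistency check implied by the abstract (not the authors' actual procedure), the Python sketch below recomputes a two-tailed p-value from a reported t statistic and its degrees of freedom and flags disagreement with the reported p-value. The function name, tolerance, and example numbers are invented for illustration.

          # Toy check, not the authors' procedure: recompute a two-tailed p-value
          # from a reported t statistic and degrees of freedom, and flag any
          # disagreement with the reported p-value.
          from scipy import stats

          def check_t_result(t_value, df, reported_p, tol=0.01):
              """Return the recomputed two-tailed p-value and whether it matches the report."""
              recomputed_p = 2 * stats.t.sf(abs(t_value), df)  # two-tailed p from |t| and df
              return recomputed_p, abs(recomputed_p - reported_p) <= tol

          # Hypothetical reported result: "t(28) = 2.10, p = .04"
          p, consistent = check_t_result(t_value=2.10, df=28, reported_p=0.04)
          print(f"recomputed p = {p:.3f}; consistent with the reported value: {consistent}")

          In practice such checks must also handle rounding of reported values, one- versus two-tailed tests, and other test families (F, chi-square, r), so a tolerance-based comparison like this is only a starting point.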


          Most cited references (49)


          Sharing Detailed Research Data Is Associated with Increased Citation Rate

          Background: Sharing research data benefits the general scientific community, but the benefit is less obvious for the investigator who makes his or her data available.

          Principal Findings: We examined the citation history of 85 cancer microarray clinical trial publications with respect to the availability of their data. The 48% of trials with publicly available microarray data received 85% of the aggregate citations. In a linear regression, publicly available data was significantly (p = 0.006) associated with a 69% increase in citations, independently of journal impact factor, date of publication, and author country of origin.

          Significance: This correlation between publicly available data and increased literature impact may further motivate investigators to share their detailed research data.
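
          As a minimal sketch of the covariate-adjusted linear regression this abstract describes, using synthetic data and invented variable names (not the study's dataset or exact model specification):

          # Synthetic-data sketch of a covariate-adjusted linear regression:
          # log citation counts modeled on public data availability, journal
          # impact factor, and publication year. All numbers are made up.
          import numpy as np
          import pandas as pd
          import statsmodels.formula.api as smf

          rng = np.random.default_rng(0)
          n = 85  # the cited study examined 85 publications
          data = pd.DataFrame({
              "log_citations": rng.normal(3.0, 1.0, n),
              "data_public": rng.integers(0, 2, n),      # 1 = microarray data publicly available
              "impact_factor": rng.normal(10.0, 3.0, n),
              "pub_year": rng.integers(1999, 2004, n),
          })

          # With a log-transformed outcome, the coefficient on data_public is roughly
          # the proportional change in citations associated with public availability.
          model = smf.ols("log_citations ~ data_public + impact_factor + C(pub_year)",
                          data=data).fit()
          print(model.params["data_public"], model.pvalues["data_public"])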

            Null hypothesis significance testing: a review of an old and continuing controversy.

            Null hypothesis significance testing (NHST) is arguably the most widely used approach to hypothesis evaluation among behavioral and social scientists. It is also very controversial. A major concern expressed by critics is that such testing is misunderstood by many of those who use it. Several other objections to its use have also been raised. In this article the author reviews and comments on the claimed misunderstandings as well as on other criticisms of the approach, and he notes arguments that have been advanced in support of NHST. Alternatives and supplements to NHST are considered, as are several related recommendations regarding the interpretation of experimental data. The concluding opinion is that NHST is easily misunderstood and misused but that when applied with good judgment it can be an effective aid to the interpretation of experimental data.

              Publication prejudices: An experimental study of confirmatory bias in the peer review system


                Author and article information

                Contributors
                Role: Editor
                Journal
                PLoS ONE, Public Library of Science (San Francisco, USA)
                ISSN: 1932-6203
                Published: 2 November 2011; 6(11): e26828
                Affiliations
                [1] Psychology Department, Faculty of Social and Behavioral Sciences, University of Amsterdam, Amsterdam, The Netherlands
                Georgetown University Medical Center, United States of America
                Author notes

                Conceived and designed the experiments: JMW. Performed the experiments: JMW MB DM. Analyzed the data: JMW MB DM. Wrote the paper: JMW.

                Article
                Manuscript ID: PONE-D-11-09722
                DOI: 10.1371/journal.pone.0026828
                PMC: 3206853
                PMID: 22073203
                Copyright: Wicherts et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
                History
                Received: 20 May 2011
                Accepted: 4 October 2011
                Page count
                Pages: 7
                Categories
                Research Article
                Mathematics
                Statistics
                Statistical Methods
                Science Policy
                Research Assessment
                Research Integrity
                Publication Ethics

