
      Data availability, reusability, and analytic reproducibility: evaluating the impact of a mandatory open data policy at the journal Cognition


          Abstract

          Access to data is a critical feature of an efficient, progressive and ultimately self-correcting scientific ecosystem. But the extent to which in-principle benefits of data sharing are realized in practice is unclear. Crucially, it is largely unknown whether published findings can be reproduced by repeating reported analyses upon shared data (‘analytic reproducibility’). To investigate this, we conducted an observational evaluation of a mandatory open data policy introduced at the journal Cognition. Interrupted time-series analyses indicated a substantial post-policy increase in data available statements (104/417, 25% pre-policy to 136/174, 78% post-policy), although not all data appeared reusable (23/104, 22% pre-policy to 85/136, 62% post-policy). For 35 of the articles determined to have reusable data, we attempted to reproduce 1324 target values. Ultimately, 64 values could not be reproduced within a 10% margin of error. For 22 articles all target values were reproduced, but 11 of these required author assistance. For 13 articles at least one value could not be reproduced despite author assistance. Importantly, there were no clear indications that original conclusions were seriously impacted. Mandatory open data policies can increase the frequency and quality of data sharing. However, suboptimal data curation, unclear analysis specification and reporting errors can impede analytic reproducibility, undermining the utility of data sharing and the credibility of scientific findings.
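The reproducibility criterion described in the abstract (a target value counts as reproduced if it falls within a 10% margin of error of the reported value) can be sketched as a simple tolerance check. This is an illustrative reconstruction, not the authors' actual analysis code; the function name and example values below are invented.

```python
def reproduced_within_margin(reported: float, obtained: float,
                             margin: float = 0.10) -> bool:
    """Return True if `obtained` falls within a relative `margin` of `reported`.

    Hypothetical helper illustrating the 10% margin-of-error criterion
    described in the abstract; a reported value of exactly zero is treated
    as reproduced only by an exact match.
    """
    if reported == 0:
        return obtained == 0
    return abs(obtained - reported) / abs(reported) <= margin

# Example: a reported mean of 4.20 reproduced as 4.55 (~8.3% off) would
# count as reproduced; an obtained value of 4.70 (~11.9% off) would not.
print(reproduced_within_margin(4.20, 4.55))  # True
print(reproduced_within_margin(4.20, 4.70))  # False
```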

          Most cited references (33)


          Interrupted time series regression for the evaluation of public health interventions: a tutorial

          Abstract Interrupted time series (ITS) analysis is a valuable study design for evaluating the effectiveness of population-level health interventions that have been implemented at a clearly defined point in time. It is increasingly being used to evaluate the effectiveness of interventions ranging from clinical therapy to national public health legislation. Whereas the design shares many properties of regression-based approaches in other epidemiological studies, there are a range of unique features of time series data that require additional methodological considerations. In this tutorial we use a worked example to demonstrate a robust approach to ITS analysis using segmented regression. We begin by describing the design and considering when ITS is an appropriate design choice. We then discuss the essential, yet often omitted, step of proposing the impact model a priori. Subsequently, we demonstrate the approach to statistical analysis including the main segmented regression model. Finally we describe the main methodological issues associated with ITS analysis: over-dispersion of time series data, autocorrelation, adjusting for seasonal trends and controlling for time-varying confounders, and we also outline some of the more complex design adaptations that can be used to strengthen the basic ITS design.
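The segmented regression approach this tutorial describes can be sketched in a few lines: the outcome is modelled as a function of time, a level-change indicator at the interruption, and a slope-change term. The sketch below uses simulated data (all coefficients and the breakpoint are invented) and a plain least-squares fit; a real ITS analysis would use a regression package and address over-dispersion, autocorrelation and seasonality, as the tutorial discusses.

```python
import numpy as np

rng = np.random.default_rng(0)
n, breakpoint = 24, 12                 # 24 time points, intervention at t = 12
t = np.arange(n, dtype=float)
post = (t >= breakpoint).astype(float)  # level-change indicator (0 pre, 1 post)
t_post = post * (t - breakpoint)        # slope-change term (time since intervention)

# Simulated outcome: baseline 5, pre-trend 0.2/step, level jump 3, slope change 0.5
y = 5 + 0.2 * t + 3 * post + 0.5 * t_post + rng.normal(0, 0.5, n)

# Fit the segmented regression by ordinary least squares
X = np.column_stack([np.ones(n), t, post, t_post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, pre_slope, level_change, slope_change = beta
print(f"level change ≈ {level_change:.2f}, slope change ≈ {slope_change:.2f}")
```

With low noise the fitted level and slope changes recover the simulated values closely; in practice the same design matrix would be passed to a model that accounts for the serial dependence of time-series residuals.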

            Badges to Acknowledge Open Practices: A Simple, Low-Cost, Effective Method for Increasing Transparency

            Beginning January 2014, Psychological Science gave authors the opportunity to signal open data and materials if they qualified for badges that accompanied published articles. Before badges, less than 3% of Psychological Science articles reported open data. After badges, 23% reported open data, with an accelerating trend; 39% reported open data in the first half of 2015, an increase of more than an order of magnitude from baseline. There was no change over time in the low rates of data sharing among comparison journals. Moreover, reporting openness does not guarantee openness. When badges were earned, reportedly available data were more likely to be actually available, correct, usable, and complete than when badges were not earned. Open materials also increased to a weaker degree, and there was more variability among comparison journals. Badges are simple, effective signals to promote open practices and improve preservation of data and materials by using independent repositories.

              The poor availability of psychological research data for reanalysis.


                Author and article information

                Journal
                Royal Society Open Science (R Soc Open Sci)
                Publisher: The Royal Society
                ISSN: 2054-5703
                Published: 15 August 2018
                Volume 5, issue 8, article 180448
                Affiliations
                [1] Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Palo Alto, CA, USA
                [2] Quantitative Sciences Unit, Stanford University, Palo Alto, CA, USA
                [3] Department of Psychology, Stanford University, Palo Alto, CA, USA
                [4] Harvard Biostatistics, Harvard University, Cambridge, MA, USA
                [5] Stress Research Institute, Stockholm University, Stockholm, Sweden
                [6] Department of Clinical Neuroscience, Karolinska Institutet, Stockholm, Sweden
                [7] Belk College of Business, University of North Carolina at Charlotte, Charlotte, NC, USA
                [8] Department of Psychology, University of Utah, Salt Lake City, UT, USA
                [9] Liberal Arts Technology and Innovation Services (LATIS), University of Minnesota, Minneapolis, MN, USA
                [10] The Organizational Science Program, University of North Carolina at Charlotte, Charlotte, NC, USA
                [11] Department of Psychology, University of Minnesota, Minneapolis, MN, USA
                Author notes
                Author for correspondence: Tom E. Hardwicke, e-mail: tom.hardwicke@stanford.edu

                Electronic supplementary material is available online at https://dx.doi.org/10.6084/m9.figshare.c.4175039.

                Author information
                http://orcid.org/0000-0001-9485-4952
                http://orcid.org/0000-0001-5273-0150
                http://orcid.org/0000-0002-7895-6390
                Article
                Article ID: rsos180448
                DOI: 10.1098/rsos.180448
                PMCID: PMC6124055
                PMID: 30225032
                © 2018 The Authors.

                Published by the Royal Society under the terms of the Creative Commons Attribution License http://creativecommons.org/licenses/by/4.0/, which permits unrestricted use, provided the original author and source are credited.

                History
                Received: 19 March 2018
                Accepted: 25 June 2018
                Funding
                Funded by: Laura and John Arnold Foundation, http://dx.doi.org/10.13039/100009827;
                Funded by: National Science Foundation, http://dx.doi.org/10.13039/100000001;
                Award ID: 1714726
                Categories
                Psychology and Cognitive Neuroscience
                Research Article

                Keywords: open data, reproducibility, open science, meta-science, interrupted time series, journal policy
