
      Improving open and rigorous science: ten key future research opportunities related to rigor, reproducibility, and transparency in scientific research


          Abstract

          Background: As part of a coordinated effort to expand research activity around rigor, reproducibility, and transparency (RRT) across scientific disciplines, a team of investigators at the Indiana University School of Public Health-Bloomington hosted a workshop in October 2019 with international leaders to discuss key opportunities for RRT research.

          Objective: The workshop aimed to identify research priorities and opportunities related to RRT.

          Design: Over two days, workshop attendees gave presentations and participated in three working groups: (1) Improving Education & Training in RRT, (2) Reducing Statistical Errors and Increasing Analytic Transparency, and (3) Looking Outward: Increasing Truthfulness and Accuracy of Research Communications. Following small-group discussions, the working groups presented their findings, and participants discussed the research opportunities identified. The investigators compiled a list of research priorities, which was circulated to all participants for feedback.

          Results: Participants identified the following priority research questions: (1) Can RRT-focused statistics and mathematical modeling courses improve statistics practice?; (2) Can specialized training in scientific writing improve transparency?; (3) Does modality (e.g., face-to-face, online) affect the efficacy of RRT-related education?; (4) How can automated programs help identify errors more efficiently?; (5) What is the prevalence and impact of errors in scientific publications (e.g., analytic inconsistencies, statistical errors, and other objective errors)?; (6) Do error-prevention workflows reduce errors?; (7) How do we encourage post-publication error correction?; (8) How does ‘spin’ in research communication affect stakeholder understanding and use of research evidence?; (9) Do tools that aid the writing of research reports increase the comprehensiveness and clarity of those reports?; and (10) Is it possible to inculcate scientific values and norms related to truthful, rigorous, accurate, and comprehensive scientific reporting?
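          As one hedged illustration of the kind of automated error detection envisioned in question (4) — not a method described in the article — a statcheck-style consistency check recomputes a reported p-value from its reported test statistic and flags discrepancies. The function name and tolerance below are illustrative assumptions.

```python
import math

def check_reported_p(z, reported_p, tol=0.0005):
    """Recompute the two-sided p-value for a z statistic and flag
    inconsistency with the value reported in a manuscript.
    (Illustrative sketch; name and tolerance are assumptions.)"""
    # Two-sided p-value under the standard normal: P(|Z| >= |z|) = erfc(|z| / sqrt(2))
    recomputed = math.erfc(abs(z) / math.sqrt(2))
    consistent = abs(recomputed - reported_p) <= tol
    return recomputed, consistent

# Example: a paper reports z = 1.96, p = .05 — these agree.
p, ok = check_reported_p(1.96, 0.05)
print(f"recomputed p = {p:.4f}, consistent: {ok}")
```

          Tools of this kind can only catch objective, recomputable inconsistencies (question 5's "analytic inconsistencies"); judging their efficiency against manual checking is precisely the open question the workshop identified.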

          Conclusion: Participants identified important and relatively unexplored questions related to improving RRT. This list may be useful to the scientific community and to investigators seeking to advance meta-science (i.e., research on research).

                Author and article information

                Contributors
                Role: Conceptualization; Writing – Original Draft Preparation
                Role: Writing – Original Draft Preparation; Writing – Review & Editing
                Role: Funding Acquisition; Writing – Review & Editing
                Role: Conceptualization; Funding Acquisition; Writing – Review & Editing
                Role: Funding Acquisition; Project Administration
                Role: Writing – Review & Editing
                Role: Writing – Review & Editing
                Role: Writing – Review & Editing
                Role: Writing – Review & Editing
                Role: Conceptualization; Funding Acquisition; Writing – Review & Editing
                Journal
                F1000Res
                F1000Research
                F1000 Research Limited (London, UK )
                2046-1402
                14 October 2020
                2020, 9:1235
                Affiliations
                [1 ]Indiana University School of Public Health, Bloomington, IN, 47403, USA
                [2 ]Project TIER, Haverford College, Haverford, Pennsylvania, 19041, USA
                [3 ]Indiana University Purdue University Indianapolis Fairbanks School of Public Health, Indianapolis, IN, 46223, USA
                [4 ]Rachel Levy, Mathematical Association of America, 1529 18th St. NW, Washington, DC, 20036, USA
                [5 ]Indiana University School of Education, Bloomington, IN, 47401, USA
                [1 ]Department of Pathology, New York University (NYU) Langone Medical Center, New York, NY, USA
                [1 ]Idaho Water Science Center, US Geological Survey, Boise, Idaho, USA
                [1 ]Office of Biodefense, Research Resources and Translational Research, Division of Microbiology and Infectious Diseases, National Institute of Allergy and Infectious Diseases, National Institutes of Health, Bethesda, MD, USA
                Author notes

                Competing interests: No competing interests were disclosed.

                Author information
                https://orcid.org/0000-0002-2355-9881
                https://orcid.org/0000-0003-4225-372X
                https://orcid.org/0000-0002-1758-8205
                https://orcid.org/0000-0001-6126-2459
                https://orcid.org/0000-0003-3286-3540
                Article
                10.12688/f1000research.26594.1
                7898357
                Copyright: © 2020 Valdez D et al.

                This is an open access article distributed under the terms of the Creative Commons Attribution Licence, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

                History
                2 October 2020
                Funding
                Funded by: Alfred P. Sloan Foundation
                Award ID: G-2019-11438
                This work was funded by the Alfred P. Sloan Foundation (G-2019-11438) and awarded to David B. Allison.
                The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
                Categories
                Opinion Article
                Articles

                meta-science; science of science; rigor, reproducibility and transparency (RRT); workshop
