
      An analysis of key indicators of reproducibility in radiology


          Abstract

          Background

          Given the central role of radiology in patient care, it is important that radiological research is grounded in reproducible science. It is unclear whether there is a lack of reproducibility or transparency in radiologic research.

          Purpose

          To analyze published radiology literature for the presence or lack of key indicators of reproducibility.

          Methods

          This cross-sectional, retrospective study was performed by searching the National Library of Medicine (NLM) catalog for publications in radiology journals. Publications were included if they were MEDLINE indexed, written in English, and published between January 1, 2014, and December 31, 2018. We randomly sampled 300 publications for this study. A pilot-tested Google Form was used to record information from the publications regarding indicators of reproducibility. Following peer review, we extracted data from an additional 200 publications, drawn from the same initially randomized list, in an attempt to reproduce our initial results.

          Results

          Our initial search returned 295,543 records, from which 300 were randomly selected for analysis; 294 of these met inclusion criteria and 6 did not. Among the empirical publications, 5.6% (11/195; 95% CI 3.0–8.3%) contained a data availability statement, 0.51% (1/195) provided clearly documented raw data, 12.0% (23/191; 95% CI 8.4–15.7%) provided a materials availability statement, 0% provided analysis scripts, 4.1% (8/195; 95% CI 1.9–6.3%) provided a pre-registration statement, 2.1% (4/195; 95% CI 0.4–3.7%) provided a protocol statement, and 3.6% (7/195; 95% CI 1.5–5.7%) were pre-registered. In the validation study of the 5 key indicators of reproducibility (availability of data, materials, protocols, analysis scripts, and pre-registration), 2 indicators (availability of protocols and analysis scripts) were reproduced, as their proportions fell within the 95% confidence intervals from the original sample. However, the materials availability and pre-registration proportions in the validation sample were lower than those in the original sample.
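          The bracketed ranges above are 95% confidence intervals for the sampled proportions. The interval method is not stated in this excerpt; a normal-approximation (Wald) interval is one common choice, sketched below as an illustration only (it need not reproduce the published intervals exactly):

```python
import math

def wald_ci(successes: int, n: int, z: float = 1.96):
    """Normal-approximation (Wald) confidence interval for a proportion."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    # Clamp to [0, 1] so small samples near the boundary stay valid
    return max(0.0, p - margin), min(1.0, p + margin)

# Example: data availability statements in 11 of 195 empirical publications
low, high = wald_ci(11, 195)
print(f"{11/195:.1%} [{low:.1%}, {high:.1%}]")
```

For small counts such as these, a Wilson score interval would be a more robust choice than the Wald interval, which is another reason the published bounds may differ from this sketch.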

          Conclusion

          Our findings demonstrate that key indicators of reproducibility are missing from the radiology literature. The ability to reproduce studies published in radiology journals may therefore be limited, with potential clinical implications.


          Most cited references (16)


          Repeatability and Reproducibility of Radiomic Features: A Systematic Review

          Purpose: An ever-growing number of predictive models used to inform clinical decision making have included quantitative, computer-extracted imaging biomarkers, or “radiomic features.” Broadly generalizable validity of radiomics-assisted models may be impeded by concerns about reproducibility. We offer a qualitative synthesis of 41 studies that specifically investigated the repeatability and reproducibility of radiomic features, derived from a systematic review of published peer-reviewed literature.

          Methods and Materials: The PubMed electronic database was searched using combinations of the broad Haynes and Ingui filters along with a set of text words specific to cancer, radiomics (including texture analyses), reproducibility, and repeatability. This review has been reported in compliance with Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. From each full-text article, information was extracted regarding cancer type, class of radiomic feature examined, reporting quality of key processing steps, and statistical metric used to segregate stable features.

          Results: Among 624 unique records, 41 full-text articles were subjected to review. The studies primarily addressed non-small cell lung cancer and oropharyngeal cancer. Only 7 studies addressed in detail every methodologic aspect related to image acquisition, preprocessing, and feature extraction. The repeatability and reproducibility of radiomic features are sensitive at various degrees to processing details such as image acquisition settings, image reconstruction algorithm, digital image preprocessing, and software used to extract radiomic features. First-order features were overall more reproducible than shape metrics and textural features. Entropy was consistently reported as one of the most stable first-order features. There was no emergent consensus regarding either shape metrics or textural features; however, coarseness and contrast appeared among the least reproducible.

          Conclusions: Investigations of feature repeatability and reproducibility are currently limited to a small number of cancer types. Reporting quality could be improved regarding details of feature extraction software, digital image manipulation (preprocessing), and the cutoff value used to distinguish stable features.

            Guidelines for reporting meta-epidemiological methodology research

            Published research should be reported to evidence users with clarity and transparency that facilitate optimal appraisal and use of evidence and allow replication by other researchers. Guidelines for such reporting are available for several types of studies but not for meta-epidemiological methodology studies. Meta-epidemiological studies adopt a systematic review or meta-analysis approach to examine the impact of certain characteristics of clinical studies on the observed effect and provide empirical evidence for hypothesised associations. The unit of analysis in meta-epidemiological studies is a study, not a patient. The outcomes of meta-epidemiological studies are usually not clinical outcomes. In this guideline, we adapt items from the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) to fit the context of meta-epidemiological studies.

              On the Plurality of (Methodological) Worlds: Estimating the Analytic Flexibility of fMRI Experiments

              How likely are published findings in the functional neuroimaging literature to be false? According to a recent mathematical model, the potential for false positives increases with the flexibility of analysis methods. Functional MRI (fMRI) experiments can be analyzed using a large number of commonly used tools, with little consensus on how, when, or whether to apply each one. This situation may lead to substantial variability in analysis outcomes. Thus, the present study sought to estimate the flexibility of neuroimaging analysis by submitting a single event-related fMRI experiment to a large number of unique analysis procedures. Ten analysis steps for which multiple strategies appear in the literature were identified, and two to four strategies were enumerated for each step. Considering all possible combinations of these strategies yielded 6,912 unique analysis pipelines. Activation maps from each pipeline were corrected for multiple comparisons using five thresholding approaches, yielding 34,560 significance maps. While some outcomes were relatively consistent across pipelines, others showed substantial methods-related variability in activation strength, location, and extent. Some analysis decisions contributed to this variability more than others, and different decisions were associated with distinct patterns of variability across the brain. Qualitative outcomes also varied with analysis parameters: many contrasts yielded significant activation under some pipelines but not others. Altogether, these results reveal considerable flexibility in the analysis of fMRI experiments. This observation, when combined with mathematical simulations linking analytic flexibility with elevated false positive rates, suggests that false positive results may be more prevalent than expected in the literature. This risk of inflated false positive rates may be mitigated by constraining the flexibility of analytic choices or by abstaining from selective analysis reporting.
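               The pipeline count reported above is a simple product over per-step strategy counts. The abstract gives ten steps with two to four strategies each but not the per-step breakdown; the split used below (six steps with two options, three with three, one with four) is an assumption chosen only because it is consistent with the stated total of 6,912:

```python
from itertools import product

# Hypothetical per-step strategy counts: ten analysis steps with 2-4
# strategies each. This particular split is an assumption; the paper's
# actual breakdown is not given in the abstract.
strategies_per_step = [2] * 6 + [3] * 3 + [4]

# Enumerate every combination of one strategy choice per step
pipelines = list(product(*(range(k) for k in strategies_per_step)))

print(len(pipelines))      # unique analysis pipelines -> 6912
print(len(pipelines) * 5)  # after 5 thresholding approaches -> 34560
```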

                Author and article information

                Contributors
                bdwrigh@okstate.edu
                Journal
                Insights into Imaging (Insights Imaging)
                Springer Berlin Heidelberg (Berlin/Heidelberg)
                ISSN: 1869-4101
                Published online: 11 May 2020
                Issue date: December 2020
                Volume: 11, Article number: 65

                Affiliations
                [1] Oklahoma State University Center for Health Sciences, 1111 W 17th St, Tulsa, OK 74107, USA (GRID grid.261367.7; ISNI 0000 0004 0542 825X)
                [2] Kansas City University of Medicine and Biosciences, Joplin, MO, USA (GRID grid.258405.e; ISNI 0000 0004 0539 5056)
                [3] Department of Diagnostic and Interventional Imaging, The University of Texas Health Sciences Center at Houston, Houston, TX, USA (GRID grid.267308.8; ISNI 0000 0000 9206 2401)

                Article
                DOI: 10.1186/s13244-020-00870-x
                PMCID: 7214585
                PMID: 32394098
                © The Author(s) 2020

                Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

                History
                Received: 4 October 2019
                Accepted: 2 April 2020
                Funding
                This study was funded through the 2019 Presidential Research Fellowship Mentor-Mentee Program at Oklahoma State University Center for Health Sciences.
                Categories
                Original Article
                Subject: Radiology & Imaging
                Keywords: meta-analysis, reproducibility of results, radiology, transparency
