
      Conditional generative adversarial network driven radiomic prediction of mutation status based on magnetic resonance imaging of breast cancer

      research-article


          Abstract

          Background

          Breast cancer (BC) is a highly heterogeneous and complex disease. Personalized treatment options require the integration of multi-omic data and consideration of phenotypic variability. Radiogenomics aims to merge medical images with genomic measurements but encounters challenges due to unpaired data, i.e., cohorts with imaging, genomic, or clinical outcome data but not all three. In this study, we propose the utilization of a well-trained conditional generative adversarial network (cGAN) to address the unpaired-data issue in radiogenomic analysis of BC. The generated images are then used to predict the mutation status of key driver genes and BC subtypes.

          Methods

          We integrated the paired MRI and multi-omic (mRNA gene expression, DNA methylation, and copy number variation) profiles of 61 BC patients from The Cancer Imaging Archive (TCIA) and The Cancer Genome Atlas (TCGA). To facilitate this integration, we employed a Bayesian Tensor Factorization approach to factorize the multi-omic data into 17 latent features. Subsequently, a cGAN model was trained on the matched side-view patient MRIs and their corresponding latent features to predict MRIs for BC patients who lack them. Model performance was evaluated by calculating the distance between real and generated images using the Fréchet Inception Distance (FID) metric. BC subtype and mutation status of driver genes were obtained from the cBioPortal platform, where three genes were selected based on the number of mutated patients. A convolutional neural network (CNN) was constructed and trained on the generated MRIs for mutation status prediction. Receiver operating characteristic area under the curve (ROC-AUC) and precision-recall area under the curve (PR-AUC) were used to evaluate the performance of the CNN models for mutation status prediction. Precision, recall, and F1 score were used to evaluate the performance of the CNN model in subtype classification.
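          The FID score used above measures the distance between Gaussian fits to the embeddings of real and generated images. A minimal sketch of the metric itself, assuming NumPy/SciPy and using random placeholder features in place of the Inception embeddings of the paper's pipeline:

```python
import numpy as np
from scipy import linalg

def frechet_inception_distance(feats_real, feats_gen):
    """Fréchet distance between Gaussians fitted to two feature sets.

    feats_*: (n_samples, n_features) arrays of image embeddings.
    FID = ||mu1 - mu2||^2 + Tr(S1 + S2 - 2 * sqrtm(S1 @ S2))
    """
    mu1, mu2 = feats_real.mean(axis=0), feats_gen.mean(axis=0)
    sigma1 = np.cov(feats_real, rowvar=False)
    sigma2 = np.cov(feats_gen, rowvar=False)
    diff = mu1 - mu2
    # Matrix square root of the covariance product; numerical error can
    # introduce a tiny imaginary component, which we discard.
    covmean = linalg.sqrtm(sigma1 @ sigma2)
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    return diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean)

# Placeholder "embeddings" (not MRI features): a distribution compared
# against itself scores ~0; a shifted distribution scores much higher.
rng = np.random.default_rng(0)
feats_a = rng.normal(size=(500, 8))
feats_b = rng.normal(loc=3.0, size=(500, 8))
fid_same = frechet_inception_distance(feats_a, feats_a.copy())
fid_diff = frechet_inception_distance(feats_a, feats_b)
```

          Lower is better: the test-set FID of 1.31 reported below indicates the generated and real image distributions are close in embedding space.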

          Results

          The FID of the images generated by the well-trained cGAN model on the test set is 1.31. The CNNs for TP53, PIK3CA, and CDH1 mutation prediction yielded ROC-AUC values of 0.9508, 0.7515, and 0.8136 and PR-AUC values of 0.9009, 0.7184, and 0.5007, respectively. Multi-class subtype prediction achieved precision, recall, and F1 scores of 0.8444, 0.8435, and 0.8336, respectively. The source code and related data implementing the algorithms can be found in the project GitHub repository at https://github.com/mattthuang/BC_RadiogenomicGAN.
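          Metrics of this kind can be computed with standard scikit-learn calls; a sketch with made-up labels and scores (illustrative placeholders, not the study's data):

```python
import numpy as np
from sklearn.metrics import (average_precision_score,
                             precision_recall_fscore_support,
                             roc_auc_score)

# Binary mutation-status evaluation: true labels and predicted scores.
y_true = np.array([0, 1, 1, 0, 1, 0, 1, 0])
y_score = np.array([0.2, 0.9, 0.7, 0.4, 0.8, 0.1, 0.3, 0.2])
roc_auc = roc_auc_score(y_true, y_score)
# average_precision_score is the usual step-wise summary of the
# precision-recall curve (a standard PR-AUC estimate).
pr_auc = average_precision_score(y_true, y_score)

# Multi-class subtype evaluation: macro-averaged precision/recall/F1.
subtype_true = [0, 1, 2, 1, 0, 2]
subtype_pred = [0, 1, 2, 0, 0, 2]
prec, rec, f1, _ = precision_recall_fscore_support(
    subtype_true, subtype_pred, average="macro")
```

          Macro averaging weights each subtype equally regardless of class size, which matters when subtypes are imbalanced across the 61 patients.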

          Conclusion

          Our study establishes cGAN as a viable tool for generating synthetic BC MRIs for mutation status prediction and subtype classification to better characterize the heterogeneity of BC in patients. The synthetic images also have the potential to significantly augment existing MRI data and circumvent issues surrounding data sharing and patient privacy for future BC machine learning studies.

          Supplementary Information

          The online version contains supplementary material available at 10.1186/s12967-024-05018-9.


                Author and article information

                Contributors
                qi.liu@uwinnipeg.ca
                phu49@uwo.ca
                Journal
                J Transl Med
                Journal of Translational Medicine
                BioMed Central (London )
                1479-5876
                2 March 2024
                22: 226
                Affiliations
                [1] Department of Biochemistry, Schulich School of Medicine & Dentistry, Western University (https://ror.org/02grkyz14), London, ON, Canada
                [2] Department of Computer Science, Western University (https://ror.org/02grkyz14), London, ON, Canada
                [3] Department of Applied Computer Science, University of Winnipeg (https://ror.org/02gdzyx04), CH Room 3C08B, 515 Portage Avenue, Winnipeg, MB R3B 2E9, Canada
                [4] Department of Epidemiology and Biostatistics, Schulich School of Medicine & Dentistry, Western University (https://ror.org/02grkyz14), London, ON, Canada
                [5] Department of Oncology, Schulich School of Medicine & Dentistry, Western University (https://ror.org/02grkyz14), London, ON, Canada
                [6] The Children’s Health Research Institute, Lawson Health Research Institute (GRID grid.415847.b, ISNI 0000 0001 0556 2414), London, ON, Canada
                [7] Department of Biochemistry, Western University, Siebens Drake Research Institute (https://ror.org/02grkyz14), SDRI Room 201-203B, 1400 Western Road, London, ON N6G 2V4, Canada
                Author information
                http://orcid.org/0000-0002-9546-2245
                Article
                DOI: 10.1186/s12967-024-05018-9
                PMCID: 10908206
                PMID: 38429796
                © The Author(s) 2024

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

                History
                Received: 23 July 2023
                Accepted: 22 February 2024
                Funding
                Funded by: FundRef http://dx.doi.org/10.13039/501100001804, Canada Research Chairs;
                Award ID: CRC-2021-00482
                Award Recipient :
                Funded by: FundRef http://dx.doi.org/10.13039/501100000024, Canadian Institutes of Health Research;
                Award ID: PLL 185683
                Award Recipient :
                Funded by: FundRef http://dx.doi.org/10.13039/501100000038, Natural Sciences and Engineering Research Council of Canada;
                Award ID: RGPIN-2021-04072
                Award Recipient :
                Funded by: FundRef http://dx.doi.org/10.13039/100012172, Breast Cancer Canada;
                Categories
                Research
                Custom metadata
                © BioMed Central Ltd., part of Springer Nature 2024

                Medicine
                breast cancer, cGANs, radiogenomics, machine learning, magnetic resonance images
