      Deep learning algorithm for the automated detection and classification of nasal cavity mass in nasal endoscopic images

      research-article


          Abstract

          Nasal endoscopy is routinely performed to distinguish the pathological types of nasal masses. However, studies on deep learning algorithms for discriminating a wide range of endoscopic nasal cavity mass lesions are lacking. Therefore, we aimed to develop an endoscopic-examination-based deep learning model to detect and classify nasal cavity mass lesions, including nasal polyps (NPs), benign tumors, and malignant tumors. The clinical feasibility of the model was evaluated by comparing its results to those of manual assessment. Biopsy-confirmed nasal endoscopic images were obtained from 17 hospitals in South Korea. Of these, 400 images were used for the test set. The training and validation datasets consisted of 149,043 normal nasal cavity, 311,043 NP, 9,271 benign tumor, and 5,323 malignant tumor lesion images. The proposed Xception architecture achieved an overall accuracy of 0.792 with the following class accuracies on the test set: normal = 0.978 ± 0.016, NP = 0.790 ± 0.016, benign = 0.708 ± 0.100, and malignant = 0.698 ± 0.116. The average area under the receiver operating characteristic curve (AUC) was 0.947, and both the per-class AUC values and F1 scores decreased in the order of normal, NP, malignant tumor, and benign tumor. The classification performance of the proposed model was comparable with that of manual assessment in the normal and NP classes, and the model outperformed manual assessment in the benign and malignant tumor classes (sensitivities of 0.708 ± 0.100 vs. 0.549 ± 0.172 and 0.698 ± 0.116 vs. 0.518 ± 0.153, respectively). In the binary task of distinguishing urgent (malignant) from nonurgent lesions, the deep learning model also achieved higher diagnostic accuracy than manual assessment. The developed model, based on endoscopic images, achieved satisfactory performance in classifying nasal cavity mass lesions into four classes: normal, NP, benign tumor, and malignant tumor. It can therefore be used to screen nasal cavity lesions accurately and rapidly.
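          To make the approach concrete, the sketch below shows how a four-class Xception classifier of this kind could be assembled in Keras. It is an illustration only, not the authors' released code (their repository is linked in the Custom metadata section); the input size, optimizer, loss, and class names are assumptions.

# Illustrative sketch only: a four-class Xception image classifier in Keras.
# The input size, optimizer, loss, and class names below are assumptions,
# not the authors' published configuration.
import tensorflow as tf
from tensorflow.keras import layers, models

CLASS_NAMES = ["normal", "nasal_polyp", "benign_tumor", "malignant_tumor"]

def build_model(input_shape=(299, 299, 3)):
    # ImageNet-pretrained Xception backbone with its original classifier head
    # removed; global average pooling yields one feature vector per image.
    backbone = tf.keras.applications.Xception(
        include_top=False,
        weights="imagenet",
        input_shape=input_shape,
        pooling="avg",
    )
    # New softmax head over the four nasal cavity mass classes.
    outputs = layers.Dense(len(CLASS_NAMES), activation="softmax")(backbone.output)
    model = models.Model(inputs=backbone.input, outputs=outputs)
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
        loss="categorical_crossentropy",  # expects one-hot labels for the four classes
        metrics=["accuracy", tf.keras.metrics.AUC(name="auc", multi_label=True)],
    )
    return model

if __name__ == "__main__":
    model = build_model()
    model.summary()

          Per-class accuracy, sensitivity, F1 score, and AUC, as reported in the abstract, would then be computed on a held-out test set rather than on the training and validation data.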


                Author and article information

                Contributors
                Roles: Data curation, Formal analysis, Visualization, Writing – original draft
                Roles: Data curation, Formal analysis, Software
                Role: Resources
                Role: Resources
                Role: Resources
                Role: Resources
                Role: Resources
                Role: Resources
                Role: Resources
                Role: Resources
                Role: Resources
                Role: Resources
                Role: Resources
                Role: Resources
                Role: Resources
                Role: Resources
                Role: Resources
                Role: Resources
                Roles: Formal analysis, Supervision, Writing – review & editing
                Roles: Funding acquisition, Project administration
                Roles: Conceptualization, Project administration, Supervision, Writing – review & editing
                Role: Editor
                Journal
                PLOS ONE
                Public Library of Science (San Francisco, CA, USA)
                ISSN: 1932-6203
                Published: 13 March 2024
                Volume: 19
                Issue: 3
                Article number: e0297536
                Affiliations
                [1 ] Department of Otolaryngology, Samsung Changwon Hospital, Sungkyunkwan University School of Medicine, Changwon, Korea
                [2 ] Department of Biomedical Informatics, College of Medicine, Konyang University, Daejeon, Korea
                [3 ] Department of Otolaryngology-Head and Neck Surgery, Chonnam National University Medical School & Hwasun Hospital, Hwasun, Korea
                [4 ] Department of Otorhinolaryngology-Head and Neck Surgery, Seoul National University Hospital, Seoul, Korea
                [5 ] Department of Otorhinolaryngology-Head and Neck Surgery, Korea University Guro Hospital, Korea University College of Medicine, Seoul, Korea
                [6 ] Department of Otorhinolaryngology, Gyeongsang National University School of Medicine and Gyeongsang National University Hospital, Jinju, Korea
                [7 ] Department of Otorhinolaryngology-Head and Neck Surgery, College of Medicine, Jeonbuk National University, Jeonju, Korea
                [8 ] Department of Otolaryngology-Head and Neck Surgery, Kosin University College of Medicine, Busan, Korea
                [9 ] Department of Otorhinolaryngology–Head and Neck Surgery, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul, Korea
                [10 ] Department of Otolaryngology-Head and Neck Surgery, Soonchunhyang University Seoul Hospital, Soonchunhyang University College of Medicine, Seoul, Korea
                [11 ] Department of Otorhinolaryngology-Head and Neck Surgery, Dankook University College of Medicine, Cheonan, Korea
                [12 ] Department of Otorhinolaryngology-Head and Neck Surgery, Chungnam National University Sejong Hospital, College of Medicine, Sejong, Korea
                [13 ] Department of Otorhinolaryngology-Head and Neck Surgery, School of Medicine, Kyungpook National University Chilgok Hospital, Kyungpook National University, Daegu, Korea
                [14 ] Department of Otorhinolaryngology-Head and Neck Surgery, National Medical Center, Seoul, Korea
                [15 ] Department of Otorhinolaryngology‒Head and Neck Surgery, Seoul National University Bundang Hospital, Seongnam, Korea
                [16 ] Department of Otorhinolaryngology–Head and Neck Surgery, College of Medicine, Chungnam National University, Daejeon, Korea
                [17 ] Department of Otorhinolaryngology, Inje University Haeundae Paik Hospital, Busan, Korea
                [18 ] Department of Otorhinolaryngology-Head and Neck Surgery, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea
                University of Catania, Italy
                Author notes

                Competing Interests: The authors have declared that no competing interests exist.

                Author information
                https://orcid.org/0000-0002-5083-8720
                https://orcid.org/0000-0003-1230-9307
                https://orcid.org/0000-0002-4529-0254
                Article
                Manuscript number: PONE-D-23-00471
                DOI: 10.1371/journal.pone.0297536
                PMCID: 10936791
                PMID: 38478548
                273467d3-a4c9-46c3-9561-ba87250b27c6
                © 2024 Kwon et al

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

                History
                Received: 10 January 2023
                Accepted: 5 December 2023
                Page count
                Figures: 7, Tables: 2, Pages: 18
                Funding
                Funded by: Korean Rhinologic Society
                Award Recipient:
                This paper was supported by the fund of the Korean Rhinologic Society in 2022.
                Categories
                Research Article
                Biology and Life Sciences > Anatomy > Respiratory System > Nasal Cavity
                Medicine and Health Sciences > Anatomy > Respiratory System > Nasal Cavity
                Medicine and Health Sciences > Surgical and Invasive Medical Procedures > Endoscopy
                Medicine and Health Sciences > Oncology > Cancers and Neoplasms > Malignant Tumors
                Medicine and Health Sciences > Clinical Medicine > Signs and Symptoms > Lesions
                Computer and Information Sciences > Artificial Intelligence > Machine Learning > Deep Learning
                Medicine and Health Sciences > Diagnostic Medicine > Cancer Detection and Diagnosis
                Medicine and Health Sciences > Oncology > Cancer Detection and Diagnosis
                Medicine and Health Sciences > Oncology > Cancers and Neoplasms > Benign Tumors
                Medicine and Health Sciences > Oncology > Cancers and Neoplasms > Head and Neck Cancers
                Custom metadata
                All code and endoscopic image samples are accessible on GitHub ( https://github.com/shpark5779/SNT_Research/tree/main).

