
      Multi-Class Classification of Lung Diseases Using CNN Models

      Applied Sciences
      MDPI AG


          Abstract

In this study, we propose a multi-class classification method that learns lung disease images with a Convolutional Neural Network (CNN). For training, we used the U.S. National Institutes of Health (NIH) dataset, divided into Normal, Pneumonia, and Pneumothorax classes, together with a dataset from Soonchunhyang University Hospital in Cheonan that additionally includes Tuberculosis. To improve performance, the images were preprocessed with a center crop that maintains a 1:1 aspect ratio. An EfficientNet-B7 Noisy Student model was fine-tuned from weights pretrained on ImageNet, and a Multi-GAP structure was used to make maximal use of the features from each layer. In the experiments, the benchmark measured on the NIH dataset showed the highest accuracy among the tested models at 85.32%, and the four-class predictions measured on the Soonchunhyang University Hospital (Cheonan) data achieved an average accuracy of 96.1%, an average sensitivity of 92.2%, an average specificity of 97.4%, and an average inference time of 0.2 s.
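As a rough illustration of the pipeline described in the abstract, the sketch below applies a 1:1 center crop, loads an EfficientNet-B7 Noisy Student backbone pretrained on ImageNet, and builds a simple Multi-GAP head that global-average-pools several intermediate feature maps before a four-class classifier. It assumes PyTorch with the timm library; the checkpoint name tf_efficientnet_b7_ns, the pooled stages, and the crop and resize sizes are illustrative assumptions rather than values taken from the paper.

# Minimal sketch (not the authors' code): 1:1 center-crop preprocessing,
# EfficientNet-B7 Noisy Student fine-tuned from ImageNet weights, and a
# Multi-GAP head over several intermediate feature maps.
import torch
import torch.nn as nn
import timm
from torchvision import transforms


class MultiGapEfficientNet(nn.Module):
    def __init__(self, num_classes: int = 4):
        super().__init__()
        # Backbone in features_only mode returns feature maps from several stages;
        # pretrained=True pulls the ImageNet/Noisy-Student weights for fine-tuning.
        self.backbone = timm.create_model(
            "tf_efficientnet_b7_ns", pretrained=True,
            features_only=True, out_indices=(2, 3, 4),
        )
        channels = self.backbone.feature_info.channels()  # e.g. [80, 224, 640]
        self.pool = nn.AdaptiveAvgPool2d(1)               # one GAP per stage
        self.classifier = nn.Linear(sum(channels), num_classes)

    def forward(self, x):
        feats = self.backbone(x)                          # list of feature maps
        gaps = [self.pool(f).flatten(1) for f in feats]   # GAP vector per stage
        return self.classifier(torch.cat(gaps, dim=1))    # concat -> 4-class logits


# Center crop to a 1:1 square, then resize; the exact sizes are illustrative.
preprocess = transforms.Compose([
    transforms.CenterCrop(1024),
    transforms.Resize((600, 600)),
    transforms.ToTensor(),
])

model = MultiGapEfficientNet(num_classes=4)
logits = model(torch.randn(1, 3, 600, 600))  # shape [1, 4]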


Most cited references (15)


          ImageNet classification with deep convolutional neural networks


            A fast learning algorithm for deep belief nets.

            We show how to use "complementary priors" to eliminate the explaining-away effects that make inference difficult in densely connected belief nets that have many hidden layers. Using complementary priors, we derive a fast, greedy algorithm that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory. The fast, greedy algorithm is used to initialize a slower learning procedure that fine-tunes the weights using a contrastive version of the wake-sleep algorithm. After fine-tuning, a network with three hidden layers forms a very good generative model of the joint distribution of handwritten digit images and their labels. This generative model gives better digit classification than the best discriminative learning algorithms. The low-dimensional manifolds on which the digits lie are modeled by long ravines in the free-energy landscape of the top-level associative memory, and it is easy to explore these ravines by using the directed connections to display what the associative memory has in mind.
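As a rough sketch of the greedy, layer-by-layer procedure this reference describes (not code from the paper), the snippet below trains a stack of binary RBMs with one-step contrastive divergence (CD-1) in NumPy, feeding each trained layer's hidden probabilities to the next RBM; the layer sizes, learning rate, and epoch count are arbitrary assumptions, and the contrastive wake-sleep fine-tuning stage is omitted.

# Illustrative sketch of greedy layer-wise training of a deep belief net:
# each layer is a binary RBM trained with CD-1, and its hidden activations
# become the training data for the next RBM.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)
        self.b_h = np.zeros(n_hidden)
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_update(self, v0):
        # Positive phase: sample hidden units driven by the data.
        p_h0 = self.hidden_probs(v0)
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
        # Negative phase: one reconstruction step (CD-1).
        p_v1 = self.visible_probs(h0)
        p_h1 = self.hidden_probs(p_v1)
        # Approximate gradient: data statistics minus reconstruction statistics.
        self.W += self.lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / len(v0)
        self.b_v += self.lr * (v0 - p_v1).mean(axis=0)
        self.b_h += self.lr * (p_h0 - p_h1).mean(axis=0)

# Greedy stacking: train one RBM at a time, then feed its hidden probabilities up.
data = (rng.random((256, 784)) < 0.5).astype(float)   # stand-in for binarized digits
layer_sizes = [784, 500, 500]
layers, x = [], data
for n_v, n_h in zip(layer_sizes[:-1], layer_sizes[1:]):
    rbm = RBM(n_v, n_h)
    for _ in range(5):            # a few CD-1 passes per layer
        rbm.cd1_update(x)
    layers.append(rbm)
    x = rbm.hidden_probs(x)       # activations become input to the next layer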

              Backpropagation Applied to Handwritten Zip Code Recognition


                Author and article information

                Contributors
Journal: Applied Sciences (ASPCC7)
Publisher: MDPI AG
ISSN: 2076-3417
Volume 11, Issue 19 (October 2021), Article 9289
Published: 6 October 2021
DOI: 10.3390/app11199289
15a4da99-be9e-45d6-80a8-829b349f9827
© 2021
License: https://creativecommons.org/licenses/by/4.0/
Article page (self URI): https://www.mdpi.com/2076-3417/11/19/9289
