
      Effective Melanoma Recognition Using Deep Convolutional Neural Network with Covariance Discriminant Loss


          Abstract

Melanoma recognition is challenging due to data imbalance, high intra-class variation, and large inter-class similarity. To address these issues, we propose a melanoma recognition method for dermoscopy images that uses a deep convolutional neural network with a covariance discriminant loss. The network is trained under the joint supervision of a cross entropy loss and the covariance discriminant loss, rectifying the model outputs and the extracted features simultaneously. Specifically, we design an embedding loss, namely the covariance discriminant loss, which takes first- and second-order distances into account simultaneously to provide more constraints. By constraining the distance between hard samples and the minority-class center, the deep features of melanoma and non-melanoma can be separated effectively. We also design a corresponding algorithm to mine the hard samples. Further, we analyze the relationship between the proposed loss and other losses. On the International Symposium on Biomedical Imaging (ISBI) 2018 Skin Lesion Analysis dataset, the two schemes in the proposed method yield sensitivities of 0.942 and 0.917, respectively. These comprehensive results demonstrate the efficacy of the designed embedding loss and the proposed methodology.
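To make the joint objective concrete, the sketch below shows one plausible way to combine cross entropy with an embedding penalty that pulls hard minority-class (melanoma) features toward their class center and shrinks their spread around it. The label convention, the hard-sample rule (misclassified minority samples), and the weighting factor are illustrative assumptions, not the paper's exact covariance discriminant loss.

```python
import torch
import torch.nn.functional as F

def joint_loss(features, logits, labels, minority_center, lambda_embed=0.1):
    """Cross entropy plus an embedding penalty on hard minority-class samples.

    Rough sketch of the idea described in the abstract: label 1 = melanoma
    (minority class), "hard" = misclassified minority samples, and the
    first/second-order terms below are assumptions for illustration.
    """
    ce = F.cross_entropy(logits, labels)

    # Hard samples: minority-class examples the current model gets wrong.
    hard = (labels == 1) & (logits.argmax(dim=1) != labels)
    if hard.any():
        diff = features[hard] - minority_center   # deviations from the minority-class center
        first_order = diff.norm(dim=1).mean()     # mean distance to the center
        cov = diff.t() @ diff / hard.sum()        # covariance of the hard-sample features
        second_order = cov.diagonal().sum()       # its trace, i.e. total spread
        embed = first_order + second_order
    else:
        embed = features.new_zeros(())

    return ce + lambda_embed * embed
```

In such a setup the minority-class center would typically be recomputed or updated as a running mean of melanoma features during training; that bookkeeping is omitted here.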


Most cited references (48)


          Dermatologist-level classification of skin cancer with deep neural networks

          Skin cancer, the most common human malignancy, is primarily diagnosed visually, beginning with an initial clinical screening and followed potentially by dermoscopic analysis, a biopsy and histopathological examination. Automated classification of skin lesions using images is a challenging task owing to the fine-grained variability in the appearance of skin lesions. Deep convolutional neural networks (CNNs) show potential for general and highly variable tasks across many fine-grained object categories. Here we demonstrate classification of skin lesions using a single CNN, trained end-to-end from images directly, using only pixels and disease labels as inputs. We train a CNN using a dataset of 129,450 clinical images—two orders of magnitude larger than previous datasets—consisting of 2,032 different diseases. We test its performance against 21 board-certified dermatologists on biopsy-proven clinical images with two critical binary classification use cases: keratinocyte carcinomas versus benign seborrheic keratoses; and malignant melanomas versus benign nevi. The first case represents the identification of the most common cancers, the second represents the identification of the deadliest skin cancer. The CNN achieves performance on par with all tested experts across both tasks, demonstrating an artificial intelligence capable of classifying skin cancer with a level of competence comparable to dermatologists. Outfitted with deep neural networks, mobile devices can potentially extend the reach of dermatologists outside of the clinic. It is projected that 6.3 billion smartphone subscriptions will exist by the year 2021 (ref. 13) and can therefore potentially provide low-cost universal access to vital diagnostic care.
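As a concrete illustration of this end-to-end setup, the snippet below fine-tunes an ImageNet-pretrained torchvision backbone for a binary lesion task. ResNet-50 and the two-class head are stand-in assumptions, not the specific architecture used in that study.

```python
import torch.nn as nn
from torchvision import models

def build_binary_lesion_classifier(num_classes: int = 2) -> nn.Module:
    """End-to-end fine-tuning sketch: pretrained backbone plus a new disease-label head."""
    model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)  # ImageNet-pretrained features
    model.fc = nn.Linear(model.fc.in_features, num_classes)                 # e.g. melanoma vs. benign nevus
    return model  # trained with cross entropy on raw pixels and disease labels only
```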

            Representation learning: a review and new perspectives.

            The success of machine learning algorithms generally depends on data representation, and we hypothesize that this is because different representations can entangle and hide more or less the different explanatory factors of variation behind the data. Although specific domain knowledge can be used to help design representations, learning with generic priors can also be used, and the quest for AI is motivating the design of more powerful representation-learning algorithms implementing such priors. This paper reviews recent work in the area of unsupervised feature learning and deep learning, covering advances in probabilistic models, autoencoders, manifold learning, and deep networks. This motivates longer term unanswered questions about the appropriate objectives for learning good representations, for computing representations (i.e., inference), and the geometrical connections between representation learning, density estimation, and manifold learning.
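For readers unfamiliar with the families surveyed, a minimal autoencoder, one of the representation-learning approaches the review covers, is sketched below; the layer sizes are arbitrary illustrative choices.

```python
import torch.nn as nn

class AutoEncoder(nn.Module):
    """Tiny autoencoder: the bottleneck `code` is the learned representation."""
    def __init__(self, in_dim: int = 784, code_dim: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, code_dim))
        self.decoder = nn.Sequential(nn.Linear(code_dim, 128), nn.ReLU(), nn.Linear(128, in_dim))

    def forward(self, x):
        code = self.encoder(x)            # compressed representation
        return self.decoder(code), code   # reconstruction drives the training loss
```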

              Colorectal cancer statistics, 2017.

Colorectal cancer (CRC) is one of the most common malignancies in the United States. Every 3 years, the American Cancer Society provides an update of CRC incidence, survival, and mortality rates and trends. Incidence data through 2013 were provided by the Surveillance, Epidemiology, and End Results program, the National Program of Cancer Registries, and the North American Association of Central Cancer Registries. Mortality data through 2014 were provided by the National Center for Health Statistics. CRC incidence rates are highest in Alaska Natives and blacks and lowest in Asian/Pacific Islanders, and they are 30% to 40% higher in men than in women. Recent temporal patterns are generally similar by race and sex, but differ by age. Between 2000 and 2013, incidence rates in adults aged ≥50 years declined by 32%, with the drop largest for distal tumors in people aged ≥65 years (incidence rate ratio [IRR], 0.50; 95% confidence interval [95% CI], 0.48-0.52) and smallest for rectal tumors in ages 50 to 64 years (male IRR, 0.91; 95% CI, 0.85-0.96; female IRR, 1.00; 95% CI, 0.93-1.08). Overall CRC incidence in individuals ages ≥50 years declined from 2009 to 2013 in every state except Arkansas, with the decrease exceeding 5% annually in 7 states; however, rectal tumor incidence in those ages 50 to 64 years was stable in most states. Among adults aged <50 years, CRC incidence rates increased by 22% from 2000 to 2013, driven solely by tumors in the distal colon (IRR, 1.24; 95% CI, 1.13-1.35) and rectum (IRR, 1.22; 95% CI, 1.13-1.31). Similar to incidence patterns, CRC death rates decreased by 34% among individuals aged ≥50 years during 2000 through 2014, but increased by 13% in those aged <50 years. Progress against CRC can be accelerated by increasing initiation of screening at age 50 years (average risk) or earlier (eg, family history of CRC/advanced adenomas) and eliminating disparities in high-quality treatment. In addition, research is needed to elucidate causes for increasing CRC in young adults. CA Cancer J Clin 2017;67:177-193. © 2017 American Cancer Society.

                Author and article information

Journal
Sensors (Basel, Switzerland)
Publisher: MDPI
ISSN: 1424-8220
Published: 13 October 2020 (October 2020 issue)
Volume: 20, Issue: 20, Article: 5786
                Affiliations
[1] College of Information and Computer, Taiyuan University of Technology, Taiyuan 030024, China; guolei0036@link.tyut.edu.cn
[2] College of Electrical and Power Engineering, Taiyuan University of Technology, Taiyuan 030024, China; xuxinying@tyut.edu.cn
[3] Shanxi Key Laboratory of Advanced Control and Intelligent Information System, School of Electronic Information Engineering, Taiyuan University of Science and Technology, Taiyuan 030024, China
[4] Department of Electronic and Electrical Engineering, University of Strathclyde, Glasgow G1 1XW, UK
                Author information
                https://orcid.org/0000-0003-1559-6022
                https://orcid.org/0000-0001-6116-3194
                Article
Publisher ID: sensors-20-05786
DOI: 10.3390/s20205786
PMCID: 7601957
PMID: 33066123
                © 2020 by the authors.

Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

                History
Received: 20 August 2020
Accepted: 9 October 2020
                Categories
                Letter

                Biomedical engineering
Keywords: melanoma recognition, embedding loss, covariance discriminant loss, deep convolutional neural network, dermoscopy image
