      Detection of Intestinal Protozoa in Trichrome-Stained Stool Specimens by Use of a Deep Convolutional Neural Network


          ABSTRACT

          Intestinal protozoa are responsible for relatively few infections in the developed world, but the testing volume is disproportionately high. Manual light microscopy of stool remains the gold standard, but it can be insensitive and time-consuming, and competency is difficult to maintain. Artificial intelligence and digital slide scanning show promise for revolutionizing the clinical parasitology laboratory by augmenting the detection of parasites and slide interpretation using a convolutional neural network (CNN) model. The goal of this study was to develop a sensitive model that could screen out negative trichrome slides while flagging potential parasites for manual confirmation. Conventional protozoa were trained as “classes” in a deep CNN. Between 1,394 and 23,566 exemplars per class were used for training, based on specimen availability, from a minimum of 10 unique slides per class. Scanning was performed using an automated slide scanner with a 40× dry objective lens. Data labeling was performed using a proprietary Web interface. Clinical validation of the model was performed using 10 unique positive slides per class and 125 negative slides. Accuracy was calculated as slide-level agreement (i.e., parasite present or absent) with microscopy. Positive agreement was 98.88% (95% confidence interval [CI], 93.76% to 99.98%), and negative agreement was 98.11% (95% CI, 93.35% to 99.77%). The model showed excellent reproducibility using slides containing multiple classes, a single class, or no parasites. Using serially diluted stool, the limit of detection of the model and scanner was 5-fold more sensitive than manual examination by multiple parasitologists using 4 unique slide sets. Digital slide scanning and a CNN model are robust tools for augmenting the conventional detection of intestinal protozoa.
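          The positive and negative agreement values above are binomial proportions reported with exact 95% confidence intervals. As a rough illustration (not taken from the paper), the Python sketch below computes slide-level percent agreement with Clopper-Pearson intervals; the slide counts are hypothetical, chosen only to be of the same magnitude as the reported figures.

          # Sketch: slide-level percent agreement with exact (Clopper-Pearson) 95% CIs.
          # Assumption: the counts below are illustrative, not the study's raw data.
          from scipy.stats import beta

          def exact_ci(successes: int, trials: int, alpha: float = 0.05):
              # Two-sided Clopper-Pearson interval for a binomial proportion.
              lower = beta.ppf(alpha / 2, successes, trials - successes + 1) if successes > 0 else 0.0
              upper = beta.ppf(1 - alpha / 2, successes + 1, trials - successes) if successes < trials else 1.0
              return lower, upper

          def report(label: str, agree: int, total: int) -> None:
              point = agree / total
              lo, hi = exact_ci(agree, total)
              print(f"{label}: {point:.2%} (95% CI {lo:.2%} to {hi:.2%})")

          # Hypothetical counts, e.g., 88/89 positive and 104/106 negative slides in agreement.
          report("Positive agreement", 88, 89)
          report("Negative agreement", 104, 106)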


          Most cited references (16)


          Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis

          Pathologists face a substantial increase in workload and complexity of histopathologic cancer diagnosis due to the advent of personalized medicine. Therefore, diagnostic protocols have to focus equally on efficiency and accuracy. In this paper we introduce ‘deep learning’ as a technique to improve the objectivity and efficiency of histopathologic slide analysis. Through two examples, prostate cancer identification in biopsy specimens and breast cancer metastasis detection in sentinel lymph nodes, we show the potential of this new methodology to reduce the workload for pathologists, while at the same time increasing objectivity of diagnoses. We found that all slides containing prostate cancer and micro- and macro-metastases of breast cancer could be identified automatically while 30–40% of the slides containing benign and normal tissue could be excluded without the use of any additional immunohistochemical markers or human intervention. We conclude that ‘deep learning’ holds great promise to improve the efficacy of prostate cancer diagnosis and breast cancer staging.

            Deep neural networks are superior to dermatologists in melanoma image classification

            Melanoma is the most dangerous type of skin cancer but is curable if detected early. Recent publications have demonstrated that artificial intelligence is capable of classifying images of benign nevi and melanoma with dermatologist-level precision. However, a statistically significant improvement compared with dermatologist classification has not been reported to date.

              A semi-automatic method for quantification and classification of erythrocytes infected with malaria parasites in microscopic images.

               Visual quantification of parasitemia in thin blood films is a tedious, subjective, and time-consuming task. This study presents an original method for quantification and classification of erythrocytes in stained thin blood films infected with Plasmodium falciparum. The proposed approach is composed of three main phases: a preprocessing step that corrects luminance differences; a segmentation step that uses the normalized RGB color space to classify pixels as either erythrocyte or background, followed by an inclusion-tree representation that structures the pixel information into objects from which erythrocytes are found; and a final two-step classification process that identifies infected erythrocytes and differentiates the infection stage using a trained bank of classifiers. Additionally, user intervention is allowed when the approach cannot make a proper decision. Four hundred fifty malaria images were used for training and evaluating the method. Automatic identification of infected erythrocytes showed a specificity of 99.7% and a sensitivity of 94%. The infection stage was determined with an average sensitivity of 78.8% and an average specificity of 91.2%.
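               The segmentation phase hinges on the normalized RGB (chromaticity) color space, in which each channel is divided by the pixel's total intensity so that illumination differences are suppressed before pixels are labeled as erythrocyte or background. A minimal Python sketch of that idea follows; the function names and threshold values are illustrative assumptions, not the authors' implementation.

               import numpy as np

               def normalized_rgb(image: np.ndarray) -> np.ndarray:
                   # Convert an H x W x 3 RGB image to chromaticity coordinates (r, g, b),
                   # where each channel is divided by R + G + B for that pixel.
                   rgb = image.astype(np.float64)
                   total = rgb.sum(axis=2, keepdims=True) + 1e-9  # guard against division by zero
                   return rgb / total

               def erythrocyte_mask(image: np.ndarray, r_min: float = 0.40, g_max: float = 0.33) -> np.ndarray:
                   # Crude pixel rule: flag pixels whose chromaticity suggests a stained cell.
                   # The thresholds are placeholders; a real system would learn the decision
                   # boundary (or a bank of classifiers) from labeled pixels.
                   chroma = normalized_rgb(image)
                   return (chroma[..., 0] > r_min) & (chroma[..., 1] < g_max)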

                Author and article information

                Journal: Journal of Clinical Microbiology (J Clin Microbiol)
                Publisher: American Society for Microbiology
                ISSN: 0095-1137 (print); 1098-660X (electronic)
                Published: May 26, 2020 (online ahead of print April 15, 2020)
                Volume 58, Issue 6
                DOI: 10.1128/JCM.02053-19
                PMCID: PMC7269375
                PMID: 32295888
                © 2020
