
      An explainable deep machine vision framework for plant stress phenotyping



          Plant stress identification based on visual symptoms has predominantly remained a manual exercise performed by trained pathologists, primarily due to the occurrence of confounding symptoms. However, the manual rating process is tedious, time-consuming, and subject to inter- and intrarater variability. Our work resolves these issues via the concept of explainable deep machine learning to automate the process of plant stress identification, classification, and quantification. We construct a highly accurate model that not only delivers trained pathologist-level performance but also explains which visual symptoms are used to make predictions. We demonstrate that our method is applicable to a large variety of biotic and abiotic stresses and is transferable to other imaging conditions and plants.


          Current approaches for accurate identification, classification, and quantification of biotic and abiotic stresses in crop research and production are predominantly visual and require specialized training. However, such techniques are hindered by subjectivity resulting from inter- and intrarater cognitive variability. This translates to erroneous decisions and a significant waste of resources. Here, we demonstrate a machine learning framework’s ability to identify and classify a diverse set of foliar stresses in soybean [Glycine max (L.) Merr.] with remarkable accuracy. We also present an explanation mechanism, using the top-K high-resolution feature maps that isolate the visual symptoms used to make predictions. This unsupervised identification of visual symptoms provides a quantitative measure of stress severity, allowing for identification (type of foliar stress), classification (low, medium, or high stress), and quantification (stress severity) in a single framework without detailed symptom annotation by experts. We reliably identified and classified several biotic (bacterial and fungal diseases) and abiotic (chemical injury and nutrient deficiency) stresses by learning from over 25,000 images. The learned model is robust to input image perturbations, demonstrating viability for high-throughput deployment. We also noticed that the learned model appears to be agnostic to species, seemingly demonstrating an ability of transfer learning. The availability of an explainable model that can consistently, rapidly, and accurately identify and quantify foliar stresses would have significant implications in scientific research, plant breeding, and crop production. The trained model could be deployed in mobile platforms (e.g., unmanned air vehicles and automated ground scouts) for rapid, large-scale scouting or as a mobile application for real-time detection of stress by farmers and researchers.
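          The top-K feature-map explanation described in the abstract can be sketched in a few lines: select the K most important channels of a convolutional layer's output, average them into a saliency map, and threshold that map to localize symptom pixels and estimate severity. This is a minimal illustration under stated assumptions, not the authors' implementation; the function name `explain_top_k` and the parameters `channel_scores`, `k`, and `threshold` are hypothetical.

```python
import numpy as np

def explain_top_k(feature_maps, channel_scores, k=3, threshold=0.5):
    """Sketch of a top-K feature-map explanation.

    feature_maps  : array of shape (C, H, W) from a conv layer
    channel_scores: length-C importance score per channel
    Returns (saliency_map, severity), where severity is the fraction
    of pixels above the threshold -- a simple proxy for the paper's
    stress-severity quantification.
    """
    top = np.argsort(channel_scores)[-k:]        # indices of the k highest-scoring channels
    saliency = feature_maps[top].mean(axis=0)    # average the selected maps into one (H, W) map
    # Normalize to [0, 1] so the threshold is scale-free.
    saliency = (saliency - saliency.min()) / (np.ptp(saliency) + 1e-8)
    severity = float((saliency > threshold).mean())
    return saliency, severity
```

          In the paper the channel importance would come from the trained network; here `channel_scores` simply stands in for whatever ranking is used to pick the top-K maps.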

          Related collections

          Most cited references (9)


          Using Deep Learning for Image-Based Plant Disease Detection

          Crop diseases are a major threat to food security, but their rapid identification remains difficult in many parts of the world due to the lack of the necessary infrastructure. The combination of increasing global smartphone penetration and recent advances in computer vision made possible by deep learning has paved the way for smartphone-assisted disease diagnosis. Using a public dataset of 54,306 images of diseased and healthy plant leaves collected under controlled conditions, we train a deep convolutional neural network to identify 14 crop species and 26 diseases (or absence thereof). The trained model achieves an accuracy of 99.35% on a held-out test set, demonstrating the feasibility of this approach. Overall, the approach of training deep learning models on increasingly large and publicly available image datasets presents a clear path toward smartphone-assisted crop disease diagnosis on a massive global scale.
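            The reported 99.35% figure is top-1 accuracy on a held-out test set: for each image, the predicted class is the argmax over the network's per-class scores, compared against the ground-truth label. A minimal sketch of that evaluation step (a hypothetical helper, not the paper's code):

```python
import numpy as np

def top1_accuracy(logits, labels):
    """Top-1 accuracy on a held-out set.

    logits: (N, num_classes) array of per-class scores from the network
    labels: length-N array of ground-truth class indices
    Returns the fraction of images whose highest-scoring class
    matches the true label.
    """
    preds = np.argmax(np.asarray(logits), axis=1)  # predicted class per image
    return float(np.mean(preds == np.asarray(labels)))
```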

            Machine Learning for High-Throughput Stress Phenotyping in Plants.

            Advances in automated and high-throughput imaging technologies have resulted in a deluge of high-resolution images and sensor data of plants. However, extracting patterns and features from this large corpus of data requires the use of machine learning (ML) tools to enable data assimilation and feature identification for stress phenotyping. Four stages of the decision cycle in plant stress phenotyping and plant breeding activities where different ML approaches can be deployed are (i) identification, (ii) classification, (iii) quantification, and (iv) prediction (ICQP). We provide here a comprehensive overview and user-friendly taxonomy of ML tools to enable the plant community to correctly and easily apply the appropriate ML tools and best-practice guidelines for various biotic and abiotic stress traits.

              Plant Disease Severity Estimated Visually, by Digital Photography and Image Analysis, and by Hyperspectral Imaging

               C. Bock, G. Poole, P. Parker (2010)

                Author and article information

                Journal: Proceedings of the National Academy of Sciences of the United States of America (Proc. Natl. Acad. Sci. U.S.A.)
                Publisher: National Academy of Sciences
                Published online: 16 April 2018; issue date: 1 May 2018
                Volume 115, Issue 18, Pages 4613-4618
                aDepartment of Mechanical Engineering, Iowa State University , Ames, IA 50011;
                bDepartment of Agronomy, Iowa State University , Ames, IA 50011
                Author notes
                2 To whom correspondence may be addressed. Email: soumiks@iastate.edu or arti@iastate.edu.

                Edited by Sheng Yang He, Department of Energy, Plant Research Laboratory, Michigan State University, East Lansing, MI, and approved March 21, 2018 (received for review September 29, 2017)

                Author contributions: A.K.S., B.G., A.S., and S.S. designed research; S.G., D.B., A.K.S., B.G., A.S., and S.S. performed research; S.G., B.G., and S.S. contributed new reagents/analytic tools; S.G., D.B., A.K.S., B.G., A.S., and S.S. analyzed data; and S.G., D.B., A.K.S., B.G., A.S., and S.S. wrote the paper.

                1 S.G. and D.B. contributed equally to this work.

                Copyright © 2018 the Author(s). Published by PNAS.

                This open access article is distributed under Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND).

                Page count: 6
                Funded by: USDA NIFA (Award ID: 2017-67007-26151); USDA CRIS (Award ID: IOW04403)
                Subject categories: Biological Sciences (Agricultural Sciences); Physical Sciences (Computer Sciences)

