Open Access

      Winter Wheat Yield Prediction Using an LSTM Model from MODIS LAI Products

      Agriculture
      MDPI AG


          Abstract

Yield estimation using remote sensing data is a research priority in modern agriculture. The rapid and accurate estimation of winter wheat yields over large areas is an important prerequisite for the formulation and implementation of food security policy. Most county-level yield estimation processes draw on as many input datasets as possible; however, in some regions such data are difficult to obtain, so we used the leaf area index (LAI) as the single input to the yield prediction model. In this study, the effects of different time steps and of the LAI time series on the estimation results were analyzed with respect to the properties of long short-term memory (LSTM) networks, and multiple machine learning methods were compared with yield estimation models constructed with LSTM networks. The results show that the accuracy of the LSTM yield estimates did not increase with increasing step size and data volume, while the LSTM estimates were generally better than those of conventional machine learning methods, with best R2 and RMSE values of 0.87 and 522.3 kg/ha, respectively, in the comparison between predicted and actual yields. Although using LAI as a single input factor may introduce yield uncertainty in some extreme years, it is a reliable and promising approach to improving yield estimation, with important implications for crop yield forecasting, agricultural disaster monitoring, food trade policy, and food security early warning.
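The reported R2 of 0.87 and RMSE of 522.3 kg/ha are the standard regression metrics for comparing predicted and observed yields. A minimal sketch of how these two metrics are typically computed from county-level yield arrays (the function names and data are illustrative, not taken from the paper):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error, in the units of the yield data (e.g. kg/ha)."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def r2(y_true, y_pred):
    """Coefficient of determination: 1 minus residual variance over
    the variance of a mean-yield baseline."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)
```

A perfect prediction gives RMSE = 0 and R2 = 1, while predicting the mean yield everywhere gives R2 = 0; the paper's 0.87 / 522.3 kg/ha figures sit between these extremes.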

          Related collections

Most cited references: 40


          Deep learning.

          Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection and many other domains such as drug discovery and genomics. Deep learning discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer. Deep convolutional nets have brought about breakthroughs in processing images, video, speech and audio, whereas recurrent nets have shone light on sequential data such as text and speech.

            Long Short-Term Memory

Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). By truncating the gradient where this does no harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to open and close access to the constant error flow. LSTM is local in space and time; its computational complexity per time step and weight is O(1). Our experiments with artificial data involve local, distributed, real-valued, and noisy pattern representations. In comparisons with real-time recurrent learning, backpropagation through time, recurrent cascade correlation, Elman nets, and neural sequence chunking, LSTM leads to many more successful runs and learns much faster. LSTM also solves complex, artificial long-time-lag tasks that have never been solved by previous recurrent network algorithms.
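The multiplicative gates and constant error carousel described in this abstract can be sketched as a single LSTM time step. The following NumPy illustration uses common modern conventions; the gate ordering and stacked weight layout are assumptions for exposition, not details from the cited paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    x: input vector of size D; h_prev, c_prev: previous hidden and cell
    states of size H. W (4H x D), U (4H x H), and b (4H) stack the
    parameters of the four gates in the order: input, forget, output,
    candidate (an assumed layout for this sketch).
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])        # input gate: admits new information
    f = sigmoid(z[H:2*H])      # forget gate: retains old cell contents
    o = sigmoid(z[2*H:3*H])    # output gate: exposes the cell state
    g = np.tanh(z[3*H:4*H])    # candidate cell update
    c = f * c_prev + i * g     # additive update: the constant error carousel
    h = o * np.tanh(c)         # gated hidden state
    return h, c
```

The additive form of the cell update `c = f * c_prev + i * g` is what lets gradients flow across many time steps without the decay that plagues plain recurrent backpropagation.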

              ImageNet classification with deep convolutional neural networks


                Author and article information

Journal: Agriculture
Publisher: MDPI AG
ISSN: 2077-0472
Published: 17 October 2022 (October 2022 issue)
Volume: 12, Issue: 10, Article number: 1707
DOI: 10.3390/agriculture12101707
© 2022
License: https://creativecommons.org/licenses/by/4.0/
