Open Access

      Short-Term Traffic Flow Prediction Based on CNN-BILSTM with Multicomponent Information

      Applied Sciences
      MDPI AG


          Abstract

Problem definition: The intelligent transportation system (ITS) plays a vital role in the construction of smart cities. In recent years, traffic flow prediction has been a hot research topic in the field of transportation. With the rapid growth in the volume of traffic information, using dynamic traffic information to accurately predict traffic flow has become a challenge. Methodology: To address this issue, this study puts forward a multistep prediction model based on a convolutional neural network (CNN) and a bidirectional long short-term memory (BILSTM) model. The spatial characteristics of the traffic data were used as the input of the BILSTM model to extract the time-series characteristics of the traffic. Results: The experimental results validated that the BILSTM model improved the prediction accuracy in comparison with the support vector regression and gated recurrent unit models. Furthermore, the proposed model was comparatively analyzed in terms of mean absolute error, mean absolute percentage error, and root mean square error, which were reduced by 30.4%, 32.2%, and 39.6%, respectively. Managerial implications: Our study provides useful insights into predicting short-term traffic flow on highways and will improve the management of traffic flow optimization.
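The architecture summarized in the abstract (convolutional layers extracting spatial features that feed a bidirectional LSTM for temporal features) can be sketched as follows. This is a minimal illustration assuming TensorFlow/Keras and hypothetical dimensions (12 past time steps from 8 detector stations, a 3-step flow forecast); it is not the authors' released code or exact configuration.

```python
# Minimal sketch (assumption, not the paper's reference implementation) of a
# CNN + BiLSTM stack for multistep traffic flow prediction with TensorFlow/Keras.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

N_STEPS_IN, N_STATIONS, N_STEPS_OUT = 12, 8, 3  # hypothetical dimensions

def build_cnn_bilstm():
    inputs = layers.Input(shape=(N_STEPS_IN, N_STATIONS))
    # 1D convolutions over time extract local spatial/temporal patterns
    x = layers.Conv1D(64, kernel_size=3, padding="same", activation="relu")(inputs)
    x = layers.Conv1D(64, kernel_size=3, padding="same", activation="relu")(x)
    # A bidirectional LSTM models the sequence in both directions
    x = layers.Bidirectional(layers.LSTM(64))(x)
    x = layers.Dense(64, activation="relu")(x)
    outputs = layers.Dense(N_STEPS_OUT)(x)  # multistep flow forecast
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

if __name__ == "__main__":
    model = build_cnn_bilstm()
    # Random arrays stand in for detector measurements
    X = np.random.rand(256, N_STEPS_IN, N_STATIONS).astype("float32")
    y = np.random.rand(256, N_STEPS_OUT).astype("float32")
    model.fit(X, y, epochs=2, batch_size=32, verbose=0)
    print(model.predict(X[:1]))
```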


Most cited references (30)


          A fast learning algorithm for deep belief nets.

          We show how to use "complementary priors" to eliminate the explaining-away effects that make inference difficult in densely connected belief nets that have many hidden layers. Using complementary priors, we derive a fast, greedy algorithm that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory. The fast, greedy algorithm is used to initialize a slower learning procedure that fine-tunes the weights using a contrastive version of the wake-sleep algorithm. After fine-tuning, a network with three hidden layers forms a very good generative model of the joint distribution of handwritten digit images and their labels. This generative model gives better digit classification than the best discriminative learning algorithms. The low-dimensional manifolds on which the digits lie are modeled by long ravines in the free-energy landscape of the top-level associative memory, and it is easy to explore these ravines by using the directed connections to display what the associative memory has in mind.
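As a rough illustration of the greedy layer-wise procedure described above, the numpy sketch below trains a stack of RBMs, each with one step of contrastive divergence (CD-1), and passes the hidden activities upward as training data for the next layer; the wake-sleep fine-tuning stage is omitted, and all sizes and names are hypothetical rather than taken from the cited paper.

```python
# Minimal numpy sketch (our illustration, not the cited paper's code) of greedy
# layer-wise pretraining of a stack of RBMs with one-step contrastive divergence.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, lr=0.1, epochs=5):
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)
    for _ in range(epochs):
        # Positive phase: hidden probabilities and samples given the data
        h_prob = sigmoid(data @ W + b_h)
        h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
        # Negative phase: one reconstruction step (CD-1)
        v_prob = sigmoid(h_sample @ W.T + b_v)
        h_prob_neg = sigmoid(v_prob @ W + b_h)
        # Approximate gradient updates
        W += lr * (data.T @ h_prob - v_prob.T @ h_prob_neg) / len(data)
        b_v += lr * (data - v_prob).mean(axis=0)
        b_h += lr * (h_prob - h_prob_neg).mean(axis=0)
    return W, b_h

def pretrain_dbn(data, layer_sizes):
    """Greedily train one RBM per layer; hidden activities feed the next RBM."""
    weights, x = [], data
    for n_hidden in layer_sizes:
        W, b_h = train_rbm(x, n_hidden)
        weights.append((W, b_h))
        x = sigmoid(x @ W + b_h)
    return weights

if __name__ == "__main__":
    toy = (rng.random((200, 64)) > 0.5).astype(float)  # stand-in binary data
    stack = pretrain_dbn(toy, layer_sizes=[32, 16, 8])
    print([W.shape for W, _ in stack])
```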

            Modeling and Forecasting Vehicular Traffic Flow as a Seasonal ARIMA Process: Theoretical Basis and Empirical Results


              Deep Architecture for Traffic Flow Prediction: Deep Belief Networks With Multitask Learning


                Author and article information

Journal: Applied Sciences (ASPCC7)
Publisher: MDPI AG
ISSN: 2076-3417
Publication date: August 30 2022 (September 2022 issue)
Volume: 12
Issue: 17
Article number: 8714
DOI: 10.3390/app12178714
Record ID: b1301e23-7e78-4783-8a46-d9f23d2ccf0e
Copyright: © 2022
License: https://creativecommons.org/licenses/by/4.0/
