
      Exploring the Predictive Power of News and Neural Machine Learning Models for Economic Forecasting

      chapter-article


          Abstract

          Forecasting economic and financial variables is a challenging task for several reasons, such as the low signal-to-noise ratio, regime changes, and volatility effects, among others. A recent trend is to extract information from news as an additional source for forecasting economic activity and financial variables. The goal is to evaluate whether news can improve forecasts from standard methods, which are often mis-specified and have poor out-of-sample performance. In an ongoing project, we aim to combine a richer information set that includes news with a state-of-the-art machine learning model. In particular, we leverage two recent advances in data science, namely word embeddings and deep learning models, which have attracted extensive attention in many scientific fields. We believe that by combining the two methodologies, effective solutions can be built to improve prediction accuracy for economic and financial time series. In this preliminary contribution, we provide an overview of the methodology under development and some initial empirical findings. The forecasting model is based on DeepAR, an autoregressive probabilistic recurrent neural network model, combined with GloVe word embeddings extracted from economic news. The target variable is the spread between the US 10-Year Treasury Constant Maturity and the 3-Month Treasury Constant Maturity (T10Y3M). The DeepAR model is trained on a large number of related GloVe word embedding time series and is employed to produce point and density forecasts.
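As a rough illustration of the pipeline the abstract describes (news → averaged word embeddings → probabilistic forecaster → point and density forecasts), the sketch below substitutes a linear autoregression with Gaussian residuals for the DeepAR network, and synthetic data for the real T10Y3M and GloVe series; all names and numbers are hypothetical stand-ins, not the chapter's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical inputs (stand-ins for real data) ---
# Daily T10Y3M spread over 400 business days (synthetic AR(1) here).
T = 400
spread = np.empty(T)
spread[0] = 1.5
for t in range(1, T):
    spread[t] = 0.02 + 0.97 * spread[t - 1] + 0.05 * rng.standard_normal()

# One averaged 50-dim GloVe-style embedding of the day's news per day
# (in the chapter these come from pretrained GloVe vectors; here random).
news_emb = rng.standard_normal((T, 50))

# --- Lagged design matrix: past spreads + same-day news covariates ---
p = 5  # autoregressive order
X = np.hstack([
    np.column_stack([spread[p - k - 1:T - k - 1] for k in range(p)]),
    news_emb[p:T],            # news features aligned with each target day
    np.ones((T - p, 1)),      # intercept
])
y = spread[p:T]

# --- Least-squares fit (a linear stand-in for the DeepAR RNN) ---
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# A Gaussian fit to the residuals gives a crude density forecast,
# echoing DeepAR's probabilistic (point + density) output.
resid = y - X @ beta
sigma = resid.std()
point_forecast = X[-1] @ beta
print(f"1-step point forecast: {point_forecast:.3f} +/- {1.96 * sigma:.3f}")
```

The design choice mirrored here is the key one in the chapter: news enters the forecaster as numeric covariate series derived from word embeddings, alongside the target's own lags.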


          Most cited references (8)


          Deep learning.

          Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection and many other domains such as drug discovery and genomics. Deep learning discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer. Deep convolutional nets have brought about breakthroughs in processing images, video, speech and audio, whereas recurrent nets have shone light on sequential data such as text and speech.
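The layer-by-layer representation learning and backpropagation described above can be made concrete with a minimal two-layer network trained on XOR, a task a single layer cannot solve. This is an illustrative sketch, not code from the cited paper; architecture and learning rate are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# XOR: linearly inseparable, so it needs at least one hidden layer.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Two processing layers with randomly initialised weights.
W1 = rng.standard_normal((2, 8)); b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)); b2 = np.zeros(1)

losses = []
for _ in range(3000):
    # Forward pass: each layer computes its representation
    # from the representation in the previous layer.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: backpropagation indicates how each internal
    # parameter should change to reduce the loss.
    d_out = 2 * (out - y) * out * (1 - out)   # error at output layer
    d_h = (d_out @ W2.T) * h * (1 - h)        # error propagated to hidden layer
    W2 -= 0.5 * h.T @ d_out;  b2 -= 0.5 * d_out.sum(0)
    W1 -= 0.5 * X.T @ d_h;    b1 -= 0.5 * d_h.sum(0)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```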

            Long Short-Term Memory

            Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to open and close access to the constant error flow. LSTM is local in space and time; its computational complexity per time step and weight is O(1). Our experiments with artificial data involve local, distributed, real-valued, and noisy pattern representations. In comparisons with real-time recurrent learning, backpropagation through time, recurrent cascade correlation, Elman nets, and neural sequence chunking, LSTM leads to many more successful runs, and learns much faster. LSTM also solves complex, artificial long-time-lag tasks that have never been solved by previous recurrent network algorithms.
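The multiplicative gates and the additively updated cell state (the "constant error carousel") can be sketched as a single LSTM forward step. Note that this uses the now-standard variant with a forget gate, a later addition by Gers et al.; the 1997 formulation kept the cell-state coefficient fixed at 1. All sizes and weights here are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(2)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

d_in, d_h = 4, 3   # input and hidden sizes (arbitrary)

# One weight matrix per gate plus the candidate update; biases omitted.
Wf, Wi, Wo, Wc = (rng.standard_normal((d_in + d_h, d_h)) * 0.1 for _ in range(4))

def lstm_step(x, h, c):
    """One forward step of a forget-gate LSTM cell."""
    z = np.concatenate([x, h])
    f = sigmoid(z @ Wf)    # forget gate: closes/opens access to the old cell state
    i = sigmoid(z @ Wi)    # input gate: gates the new candidate in
    o = sigmoid(z @ Wo)    # output gate: gates what the cell exposes
    c_new = f * c + i * np.tanh(z @ Wc)   # additive carousel update, not a squashing map
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Run a long sequence through the cell: the state stays bounded and finite,
# unlike a plain recurrent unit whose error flow decays or explodes.
h, c = np.zeros(d_h), np.zeros(d_h)
for t in range(1000):
    h, c = lstm_step(rng.standard_normal(d_in), h, c)
print("hidden state after 1000 steps:", np.round(h, 3))
```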

              Forecasting with artificial neural networks:


                Author and article information

                Contributors
                valerio.bitetta@unicredit.eu
                ilaria.bordino@unicredit.eu
                andrea.ferretti2@unicredit.eu
                francesco.gullo@unicredit.eu
                giovanni.ponti@enea.it
                lorenzo.severini@unicredit.eu
                sergio.consoli@ec.europa.eu
                Journal
                Mining Data for Financial Applications: 5th ECML PKDD Workshop, MIDAS 2020, Ghent, Belgium, September 18, 2020, Revised Selected Papers
                ISBN: 978-3-030-66980-5, 978-3-030-66981-2
                DOI: 10.1007/978-3-030-66981-2
                Published: 15 January 2021
                Volume: 12591
                Pages: 135-149
                Affiliations
                [8] UniCredit, Milan, Italy (GRID grid.436156.3, ISNI 0000 0004 1775 9187)
                [9] UniCredit, Rome, Italy
                [10] UniCredit, Milan, Italy
                [11] UniCredit, Rome, Italy
                [12] ENEA Portici Research Center, Portici, Italy
                [13] UniCredit, Rome, Italy
                European Commission, Joint Research Centre, Directorate A-Strategy, Work Programme and Resources, Scientific Development Unit, Via E. Fermi 2749, 21027 Ispra, VA, Italy (GRID grid.434554.7, ISNI 0000 0004 1758 4137)
                Article
                Chapter 11
                DOI: 10.1007/978-3-030-66981-2_11
                7808169
                4ff0046d-38f6-4d4a-996e-5faa071fb17a
                © The Author(s) 2021

                Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

                The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

                History
                Categories
                Article
                Custom metadata
                © Springer Nature Switzerland AG 2021

                economic and financial forecasting, neural time series forecasting, deep learning, recurrent neural networks, long short-term memory networks, word embedding, news analysis
