DeepBAN: A Temporal Convolution-Based Communication Framework for Dynamic WBANs

There is no author summary for this article yet.


Most cited references (40)


Long Short-Term Memory

Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to open and close access to the constant error flow. LSTM is local in space and time; its computational complexity per time step and weight is O(1). Our experiments with artificial data involve local, distributed, real-valued, and noisy pattern representations. In comparisons with real-time recurrent learning, backpropagation through time, recurrent cascade correlation, Elman nets, and neural sequence chunking, LSTM leads to many more successful runs and learns much faster. LSTM also solves complex, artificial long-time-lag tasks that have never been solved by previous recurrent network algorithms.
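The gating mechanism this abstract describes is compact enough to sketch directly. The following is a minimal, illustrative NumPy implementation of one step of an LSTM cell in its modern form; note that it includes the forget gate, a refinement that postdates the 1997 paper. All function names and dimensions here are hypothetical choices for illustration, not the authors' code.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    # Stack the input and previous hidden state, then compute all four
    # gate pre-activations with a single matrix multiply.
    z = W @ np.concatenate([x, h_prev]) + b
    i, f, g, o = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # input, forget, output gates
    g = np.tanh(g)                                # candidate cell update
    c = f * c_prev + i * g                        # additive "constant error carousel"
    h = o * np.tanh(c)                            # gated exposure of the cell state
    return h, c

# Illustrative dimensions (hypothetical): 8-dim input, 16-dim state.
n_in, n_hid = 8, 16
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in + n_hid))
b = np.zeros(4 * n_hid)
h = c = np.zeros(n_hid)
for x in rng.normal(size=(1000, n_in)):  # the abstract's >1000-step regime
    h, c = lstm_step(x, h, c, W, b)

The additive cell update c = f * c_prev + i * g is the point of the design: because the cell state is modified by addition rather than repeated squashing, error can flow back through it without the decay the abstract describes.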

Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling

In this paper we compare different types of recurrent units in recurrent neural networks (RNNs). In particular, we focus on more sophisticated units that implement a gating mechanism, such as the long short-term memory (LSTM) unit and the recently proposed gated recurrent unit (GRU). We evaluate these recurrent units on the tasks of polyphonic music modeling and speech signal modeling. Our experiments reveal that these advanced recurrent units are indeed better than more traditional recurrent units such as tanh units, and that the GRU is comparable to the LSTM. (Presented at the NIPS 2014 Deep Learning and Representation Learning Workshop.)
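For comparison with the LSTM sketch above, here is the corresponding one-step GRU update in the same illustrative NumPy style; the names and dimensions are again hypothetical, not the paper's implementation. The GRU drops the separate cell state and uses two gates, an update gate z and a reset gate r.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, W_g, b_g, W_h, b_h):
    # Update gate z and reset gate r from one stacked matrix multiply.
    z, r = np.split(sigmoid(W_g @ np.concatenate([x, h_prev]) + b_g), 2)
    # The candidate state sees only the reset-gated previous state.
    h_tilde = np.tanh(W_h @ np.concatenate([x, r * h_prev]) + b_h)
    # New state: interpolate between the old state and the candidate.
    return (1.0 - z) * h_prev + z * h_tilde

# Illustrative dimensions (hypothetical): 8-dim input, 16-dim state.
n_in, n_hid = 8, 16
rng = np.random.default_rng(1)
W_g = rng.normal(scale=0.1, size=(2 * n_hid, n_in + n_hid))
W_h = rng.normal(scale=0.1, size=(n_hid, n_in + n_hid))
b_g, b_h = np.zeros(2 * n_hid), np.zeros(n_hid)
h = np.zeros(n_hid)
for x in rng.normal(size=(100, n_in)):
    h = gru_step(x, h, W_g, b_g, W_h, b_h)

A GRU of the same width has three weight blocks where the LSTM has four, a modest saving that is consistent with the abstract's finding that the two units perform comparably.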

Wireless Body Area Networks: A Survey


Author and article information

Journal
IEEE Transactions on Communications (IEEE Trans. Commun.)
Institute of Electrical and Electronics Engineers (IEEE)
ISSN: 0090-6778 (print); 1558-0857 (electronic)
Publication date: October 2021
Volume 69, Issue 10, pp. 6675-6690
DOI: 10.1109/TCOMM.2021.3094581
© 2021

License information:
https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
