
      HADLN: Hybrid Attention-Based Deep Learning Network for Automated Arrhythmia Classification

Research Article


          Abstract

          In recent years, with the development of artificial intelligence, deep learning models have achieved initial success in ECG data analysis, especially in the detection of atrial fibrillation. To address two weaknesses of traditional deep convolutional neural network models, namely, that they ignore correlations between contexts and suffer from gradient dispersion, the hybrid attention-based deep learning network (HADLN) is proposed for arrhythmia classification. HADLN exploits the complementary strengths of the residual network (ResNet) and bidirectional long short-term memory (Bi-LSTM) architectures to obtain fused features containing both local and global information, and it improves the interpretability of the model through an attention mechanism. The method is trained and validated on the PhysioNet 2017 challenge dataset. Without loss of generality, each ECG signal is classified into one of four categories: atrial fibrillation, noise, other, and normal. By combining the fused features with the attention mechanism, the learned model achieves a marked improvement in classification performance along with a degree of interpretability. The experimental results show that the proposed HADLN method achieves a precision of 0.866, a recall of 0.859, an accuracy of 0.867, and an F1-score of 0.880 under 10-fold cross-validation.
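          The abstract describes the pipeline only at a high level, so the following minimal PyTorch sketch illustrates the general HADLN idea: 1-D residual convolution blocks extract local morphology features, a Bi-LSTM captures global rhythm context, and attention pooling weights the time steps before a four-class output. All layer sizes, depths, and the specific additive-attention pooling here are illustrative assumptions, not the authors' published configuration.

```python
# Illustrative sketch only: the paper does not publish this code; layer sizes,
# depths, and the exact attention formulation below are assumptions. It shows
# the general HADLN idea: 1-D residual conv blocks (local features) ->
# bidirectional LSTM (global context) -> attention pooling -> 4-class output.
import torch
import torch.nn as nn


class ResBlock1d(nn.Module):
    """A basic 1-D residual block (ResNet-style identity shortcut)."""

    def __init__(self, channels: int, kernel_size: int = 7):
        super().__init__()
        pad = kernel_size // 2
        self.conv1 = nn.Conv1d(channels, channels, kernel_size, padding=pad)
        self.bn1 = nn.BatchNorm1d(channels)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size, padding=pad)
        self.bn2 = nn.BatchNorm1d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # identity shortcut eases gradient flow


class HADLNSketch(nn.Module):
    def __init__(self, n_classes: int = 4, channels: int = 64, hidden: int = 128):
        super().__init__()
        self.stem = nn.Conv1d(1, channels, kernel_size=15, stride=2, padding=7)
        self.res = nn.Sequential(ResBlock1d(channels), ResBlock1d(channels))
        self.bilstm = nn.LSTM(channels, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)    # additive attention scores
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                       # x: (batch, 1, samples)
        h = self.res(self.stem(x))              # local morphology features
        h = h.transpose(1, 2)                   # (batch, time, channels)
        h, _ = self.bilstm(h)                   # global rhythm context
        w = torch.softmax(self.attn(h), dim=1)  # (batch, time, 1) weights
        ctx = (w * h).sum(dim=1)                # attention-weighted sum over time
        return self.fc(ctx)                     # logits: AF / noise / other / normal


# Example: a batch of 2 single-lead ECG segments, 3000 samples each.
model = HADLNSketch()
logits = model(torch.randn(2, 1, 3000))
print(logits.shape)  # torch.Size([2, 4])
```

          Inspecting the attention weights w for a given recording indicates which segments of the signal the model relied on, which is the sense in which an attention mechanism adds interpretability.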


          Most cited references (38)


          Deep Residual Learning for Image Recognition


            Long Short-Term Memory

            Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to open and close access to the constant error flow. LSTM is local in space and time; its computational complexity per time step and weight is O(1). Our experiments with artificial data involve local, distributed, real-valued, and noisy pattern representations. In comparisons with real-time recurrent learning, backpropagation through time, recurrent cascade-correlation, Elman nets, and neural sequence chunking, LSTM leads to many more successful runs and learns much faster. LSTM also solves complex, artificial long-time-lag tasks that have never been solved by previous recurrent network algorithms.
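            To make the mechanism this abstract describes concrete, here is a minimal NumPy sketch of a single step of a 1997-style LSTM cell; the weight shapes, initialization, and tanh squashing functions are illustrative assumptions. The cell state's fixed self-connection of weight 1.0 is the "constant error carousel", and the multiplicative input and output gates learn when to write to and read from it (later variants add a forget gate on this recurrence).

```python
# Minimal, illustrative sketch of one step of the original (1997) LSTM cell.
# The cell state s has a fixed self-connection of weight 1.0 (the "constant
# error carousel"); multiplicative input/output gates open and close access
# to it. Weight shapes and tanh squashing here are assumptions for clarity.
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def lstm_step(x, y_prev, s_prev, W_in, W_i, W_o):
    """One time step. x: input, y_prev: previous output, s_prev: cell state."""
    z = np.concatenate([x, y_prev])  # gates see the input and the last output
    g = np.tanh(W_in @ z)            # candidate value to write into the cell
    i = sigmoid(W_i @ z)             # input gate: opens/closes writing
    o = sigmoid(W_o @ z)             # output gate: opens/closes reading
    s = s_prev + i * g               # CEC: self-connection of fixed weight 1.0
    y = o * np.tanh(s)               # gated cell output
    return y, s


# Example: 4 inputs, 3 memory cells, run over a short random sequence.
rng = np.random.default_rng(0)
n_in, n_cell = 4, 3
W_in = rng.normal(scale=0.1, size=(n_cell, n_in + n_cell))
W_i = rng.normal(scale=0.1, size=(n_cell, n_in + n_cell))
W_o = rng.normal(scale=0.1, size=(n_cell, n_in + n_cell))
y, s = np.zeros(n_cell), np.zeros(n_cell)
for t in range(10):
    y, s = lstm_step(rng.normal(size=n_in), y, s, W_in, W_i, W_o)
print(y)
```

            Because the recurrence s = s_prev + i * g has derivative 1 with respect to s_prev, error signals flow backward through the carousel without decaying, which is how the cell bridges long time lags.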

              Deep learning in neural networks: An overview

              In recent years, deep artificial neural networks (including recurrent ones) have won numerous contests in pattern recognition and machine learning. This historical survey compactly summarizes relevant work, much of it from the previous millennium. Shallow and Deep Learners are distinguished by the depth of their credit assignment paths, which are chains of possibly learnable, causal links between actions and effects. I review deep supervised learning (also recapitulating the history of backpropagation), unsupervised learning, reinforcement learning & evolutionary computation, and indirect search for short programs encoding deep and large networks.

                Author and article information

                Journal
                Frontiers in Physiology (Front. Physiol.)
                Publisher: Frontiers Media S.A.
                ISSN: 1664-042X
                Published: 05 July 2021
                Volume: 12
                Article: 683025
                Affiliations
                1. School of Information Science and Technology, Zhejiang Sci-Tech University, Hangzhou, China
                2. Department of Clinical Engineering, The Second Affiliated Hospital, School of Medicine, Zhejiang University, Hangzhou, China
                3. Department of Biomedical Engineering, Zhejiang University, Hangzhou, China
                Author notes

                Edited by: Linwei Wang, Rochester Institute of Technology, United States

                Reviewed by: Saman Parvaneh, Edwards Lifesciences, United States; Heye Zhang, Sun Yat-sen University, China

                *Correspondence: Zhikang Wang, 2192009@zju.edu.cn

                This article was submitted to Computational Physiology and Medicine, a section of the journal Frontiers in Physiology

                Article
                DOI: 10.3389/fphys.2021.683025
                PMCID: PMC8289344
                PMID: 34290619
                Copyright © 2021 Jiang, Gu, Li, Wei, Zhang, Wang and Xia.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

                History
                Received: 19 March 2021
                Accepted: 27 May 2021
                Page count
                Figures: 7, Tables: 5, Equations: 17, References: 38, Pages: 10
                Funding
                Funded by: Natural Science Foundation of Zhejiang Province (funder ID: 10.13039/501100004731)
                Categories
                Physiology
                Original Research

                Anatomy & Physiology
                Keywords: arrhythmia classification, deep learning, bidirectional LSTM, ResNet, attention mechanism
