      Is Open Access

Cavitation recognition of axial piston pumps in noisy environment based on Grad-CAM visualization technique



Most cited references (47)

Gradient-based learning applied to document recognition

Grad-CAM: Visual Explanations from Deep Networks via Gradient-Based Localization

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

Training Deep Neural Networks is complicated by the fact that the distribution of each layer's inputs changes during training, as the parameters of the previous layers change. This slows down the training by requiring lower learning rates and careful parameter initialization, and makes it notoriously hard to train models with saturating nonlinearities. We refer to this phenomenon as internal covariate shift, and address the problem by normalizing layer inputs. Our method draws its strength from making normalization a part of the model architecture and performing the normalization for each training mini-batch. Batch Normalization allows us to use much higher learning rates and be less careful about initialization. It also acts as a regularizer, in some cases eliminating the need for Dropout. Applied to a state-of-the-art image classification model, Batch Normalization achieves the same accuracy with 14 times fewer training steps, and beats the original model by a significant margin. Using an ensemble of batch-normalized networks, we improve upon the best published result on ImageNet classification: reaching 4.9% top-5 validation error (and 4.8% test error), exceeding the accuracy of human raters.
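The per-mini-batch normalization the abstract describes can be sketched in a few lines of NumPy. This is a minimal illustration of the forward pass only, not the authors' implementation; the function and variable names are chosen for this example.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the mini-batch, then scale and shift.

    x: (batch, features) activations; gamma/beta: learned scale and shift.
    Illustrative sketch of the batch-norm forward pass, not a full layer
    (no running statistics for inference, no backward pass).
    """
    mu = x.mean(axis=0)                     # per-feature mini-batch mean
    var = x.var(axis=0)                     # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero-mean, unit-variance inputs
    return gamma * x_hat + beta             # learned scale/shift restores capacity

# Example: after normalization, each feature column has near-zero mean
# and near-unit variance regardless of the input's original statistics.
x = np.random.randn(64, 8) * 3.0 + 5.0
y = batch_norm_forward(x, gamma=np.ones(8), beta=np.zeros(8))
```

Because `gamma` and `beta` are learned, the network can undo the normalization where that is optimal, which is what lets normalization live inside the model architecture rather than as a preprocessing step.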

                Author and article information

Journal
CAAI Transactions on Intelligence Technology (CAAI Trans on Intel Tech)
Institution of Engineering and Technology (IET)
ISSN: 2468-2322 (print and electronic)
April 28, 2022
Affiliations
[1] State Key Laboratory of Mechanical System and Vibration, Shanghai Jiao Tong University, Shanghai, China
[2] State Key Laboratory of Fluid Power and Mechatronic Systems, Zhejiang University, Hangzhou, China
[3] MoE Key Laboratory of Artificial Intelligence, AI Institute, Shanghai Jiao Tong University, Shanghai, China
[4] China Electronic Product Reliability and Environmental Testing Research Institute, Guangzhou, China
[5] Guangdong Provincial Key Laboratory of Electronic Information Products Reliability Technology, Guangzhou, China
                Article
DOI: 10.1049/cit2.12101
                © 2022

License: http://creativecommons.org/licenses/by/4.0/

Text and data mining license: http://doi.wiley.com/10.1002/tdm_license_1.1
