      Real-time jellyfish classification and detection algorithm based on improved YOLOv4-tiny and improved underwater image enhancement algorithm

Research article


          Abstract

          The outbreak of jellyfish blooms poses a serious threat to human life and marine ecology, so jellyfish detection techniques have attracted great interest. This paper investigates a jellyfish detection and classification algorithm based on optical images and deep learning. First, we create a dataset comprising 11,926 images. Second, an MSRCR underwater image enhancement algorithm with fusion is proposed. Finally, an improved YOLOv4-tiny algorithm is proposed by incorporating a CBAM (Convolutional Block Attention Module) and optimizing the training method. The results demonstrate that the detection accuracy of the improved algorithm reaches 95.01% at a detection speed of 223 FPS, both better than compared algorithms such as YOLOv4. In summary, our method can detect jellyfish accurately and quickly. This work lays the foundation for the development of an underwater real-time jellyfish monitoring system.
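
          The MSRCR (Multi-Scale Retinex with Color Restoration) enhancement step named in the abstract can be sketched as follows. This is a minimal illustration of the classical MSRCR algorithm only, not the paper's fusion variant; the scale and gain parameters (`sigmas`, `alpha`, `beta`) are common defaults from the Retinex literature, and the percentile stretch at the end is a simplifying assumption.

```python
# Minimal MSRCR (Multi-Scale Retinex with Color Restoration) sketch.
# NOTE: illustrative only -- the paper's enhancement adds a fusion step
# not reproduced here; parameter values are common literature defaults.
import numpy as np
from scipy.ndimage import gaussian_filter

def msrcr(img, sigmas=(15, 80, 250), alpha=125.0, beta=46.0):
    """Enhance a uint8 H x W x 3 image; returns a uint8 image of the same shape."""
    x = img.astype(np.float64) + 1.0  # +1 avoids log(0)
    # Multi-scale retinex: mean of log-ratios against several Gaussian surrounds
    retinex = np.zeros_like(x)
    for sigma in sigmas:
        surround = gaussian_filter(x, sigma=(sigma, sigma, 0))  # blur spatial dims only
        retinex += np.log(x) - np.log(surround)
    retinex /= len(sigmas)
    # Color restoration: re-weights channels to undo the desaturation of plain MSR
    restoration = beta * (np.log(alpha * x) - np.log(x.sum(axis=2, keepdims=True)))
    out = retinex * restoration
    # Map to a displayable range with a simple 1st/99th percentile stretch
    lo, hi = np.percentile(out, (1, 99))
    out = np.clip((out - lo) / (hi - lo + 1e-12), 0.0, 1.0) * 255.0
    return out.astype(np.uint8)
```

          In a pipeline like the one described, each underwater frame would pass through an enhancement of this kind before being fed to the detector.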


          Most cited references (27)

          • An Underwater Color Image Quality Evaluation Metric
          • Color constancy
            A quarter of a century ago, the first systematic behavioral experiments were performed to clarify the nature of color constancy: the effect whereby the perceived color of a surface remains constant despite changes in the spectrum of the illumination. At about the same time, new models of color constancy appeared, along with physiological data on cortical mechanisms and photographic colorimetric measurements of natural scenes. Since then, as this review shows, there have been many advances. The theoretical requirements for constancy have been better delineated and the range of experimental techniques has been greatly expanded; novel invariant properties of images and a variety of neural mechanisms have been identified; and increasing recognition has been given to the relevance of natural surfaces and scenes as laboratory stimuli. Even so, there remain many theoretical and experimental challenges, not least to develop an account of color constancy that goes beyond deterministic and relatively simple laboratory stimuli and instead deals with the intrinsically variable nature of surfaces and illuminations present in the natural world. Copyright © 2010 Elsevier Ltd. All rights reserved.
          • Human-Visual-System-Inspired Underwater Image Quality Measures

                Author and article information

                Contributors
                gaomeijing@126.com

                Journal
                Sci Rep (Scientific Reports), Nature Publishing Group UK, London. ISSN 2045-2322.
                Published: 10 August 2023. Volume 13, article number 12989.

                Affiliations
                [1] School of Integrated Circuits and Electronics, Beijing Institute of Technology, Beijing 100081, China
                [2] The Key Laboratory for Special Fiber and Fiber Sensor of Hebei Province, College of Information Science and Engineering, Yanshan University, Qinhuangdao 066004, Hebei, China
                [3] The Key Laboratory of Dynamics and Control of Flight Vehicle, Ministry of Education, School of Aerospace Engineering, Beijing Institute of Technology, Beijing 100081, China

                Article
                DOI: 10.1038/s41598-023-39851-7
                PMCID: PMC10415266
                PMID: 37563193
                © Springer Nature Limited 2023

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

                History
                Received: 10 February 2023
                Accepted: 1 August 2023
                Funding
                Funded by: Science and Technology Support Projects of R&D Plans of Qinhuangdao City (Award No. 202004A001)

                Keywords
                optical imaging, marine biology, computer science
