
      High-resolution behavioral mapping of electric fishes in Amazonian habitats

      research-article


          Abstract

          The study of animal behavior has been revolutionized by sophisticated methodologies that identify and track individuals in video recordings. Video recording of behavior, however, is challenging for many species and habitats including fishes that live in turbid water. Here we present a methodology for identifying and localizing weakly electric fishes on the centimeter scale with subsecond temporal resolution based solely on the electric signals generated by each individual. These signals are recorded with a grid of electrodes and analyzed using a two-part algorithm that identifies the signals from each individual fish and then estimates the position and orientation of each fish using Bayesian inference. Interestingly, because this system involves eavesdropping on electrocommunication signals, it permits monitoring of complex social and physical interactions in the wild. This approach has potential for large-scale non-invasive monitoring of aquatic habitats in the Amazon basin and other tropical freshwater systems.
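The second stage of the two-part algorithm described above can be illustrated with a toy sketch. This is not the authors' implementation: it assumes a hypothetical 3×3 electrode grid, an isotropic inverse-square amplitude-decay model in place of a true electric-dipole field, and a flat prior over the arena, and it estimates position (orientation is omitted) as the maximum of a gridded posterior.

```python
import numpy as np

# Hypothetical setup: a 3x3 grid of electrodes covering a 1 m x 1 m area.
electrodes = np.array([(x, y) for x in (0.0, 0.5, 1.0) for y in (0.0, 0.5, 1.0)])

def expected_amplitude(pos, electrodes, source_strength=1.0):
    """Toy forward model: amplitude falls off as 1/r^2 with distance
    (a stand-in for the dipole field model a real system would use)."""
    r = np.linalg.norm(electrodes - pos, axis=1)
    return source_strength / (r**2 + 1e-3)  # small constant avoids div-by-zero

def posterior_over_grid(measured, electrodes, resolution=50, noise_sd=0.05):
    """Evaluate a Gaussian log-likelihood of the measured amplitudes at
    every candidate position, then normalize to a posterior (flat prior)."""
    xs = np.linspace(0, 1, resolution)
    ys = np.linspace(0, 1, resolution)
    log_post = np.empty((resolution, resolution))
    for i, x in enumerate(xs):
        for j, y in enumerate(ys):
            pred = expected_amplitude(np.array([x, y]), electrodes)
            log_post[i, j] = -np.sum((measured - pred) ** 2) / (2 * noise_sd**2)
    post = np.exp(log_post - log_post.max())
    return xs, ys, post / post.sum()

# Simulate a fish at (0.3, 0.7) and recover the MAP position estimate.
true_pos = np.array([0.3, 0.7])
rng = np.random.default_rng(0)
measured = expected_amplitude(true_pos, electrodes) + rng.normal(0, 0.05, len(electrodes))
xs, ys, post = posterior_over_grid(measured, electrodes)
i, j = np.unravel_index(post.argmax(), post.shape)
print(f"MAP estimate: ({xs[i]:.2f}, {ys[j]:.2f})")
```

With a realistic dipole model, more electrodes, and a motion prior linking successive time steps, the same grid-posterior structure extends naturally to the centimeter-scale, subsecond tracking the paper reports.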

          Related collections

Most cited references (33)


          High-throughput Ethomics in Large Groups of Drosophila

          We present a camera-based method for automatically quantifying the individual and social behaviors of fruit flies, Drosophila melanogaster, interacting within a planar arena. Our system includes machine vision algorithms that accurately track many individuals without swapping identities and classification algorithms that detect behaviors. The data may be represented as an ethogram that plots the time course of behaviors exhibited by each fly, or as a vector that concisely captures the statistical properties of all behaviors displayed within a given period. We found that behavioral differences between individuals are consistent over time and are sufficient to accurately predict gender and genotype. In addition, we show that the relative positions of flies during social interactions vary according to gender, genotype, and social environment. We expect that our software, which permits high-throughput screening, will complement existing molecular methods available in Drosophila, facilitating new investigations into the genetic and cellular basis of behavior.
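The "vector that concisely captures the statistical properties of all behaviors" can be illustrated in miniature. The labels and clip below are invented, not from the paper's software; the sketch just collapses per-frame behavior annotations into a vector of time fractions.

```python
import numpy as np

# Hypothetical per-frame behavior labels for one fly over a short clip.
BEHAVIORS = ["walk", "groom", "chase"]
frames = ["walk", "walk", "groom", "chase", "chase", "chase", "walk", "groom"]

def behavior_vector(frames, behaviors):
    """Summarize a label sequence as the fraction of frames spent in each
    behavior; an ethogram would instead plot the labels against time."""
    counts = np.array([frames.count(b) for b in behaviors], dtype=float)
    return counts / len(frames)

print(behavior_vector(frames, BEHAVIORS))  # fractions for walk, groom, chase
```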

            idTracker: tracking individuals in a group by automatic identification of unmarked animals.

            Animals in groups touch each other, move in paths that cross, and interact in complex ways. Current video tracking methods sometimes switch identities of unmarked individuals during these interactions. These errors propagate and result in random assignments after a few minutes unless manually corrected. We present idTracker, a multitracking algorithm that extracts a characteristic fingerprint from each animal in a video recording of a group. It then uses these fingerprints to identify every individual throughout the video. Tracking by identification prevents propagation of errors, and the correct identities can be maintained indefinitely. idTracker distinguishes animals even when humans cannot, such as for size-matched siblings, and reidentifies animals after they temporarily disappear from view or across different videos. It is robust, easy to use and general. We tested it on fish (Danio rerio and Oryzias latipes), flies (Drosophila melanogaster), ants (Messor structor) and mice (Mus musculus).
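The fingerprint-then-match idea can be sketched in a few lines. This is not idTracker's algorithm — its fingerprints are 2-D intensity-contrast maps, not the toy histograms used here — but the sketch shows how stored reference fingerprints let each new detection be assigned to a known individual rather than tracked frame-to-frame.

```python
import numpy as np

def fingerprint(image):
    """Toy 'fingerprint': a normalized intensity histogram of an animal's
    pixels. Any identity-bearing, pose-invariant feature vector works here."""
    hist, _ = np.histogram(image, bins=16, range=(0, 1))
    return hist / hist.sum()

def identify(blob, references):
    """Assign a detected blob to the reference individual whose stored
    fingerprint is closest (smallest L1 distance)."""
    fp = fingerprint(blob)
    dists = [np.abs(fp - ref).sum() for ref in references]
    return int(np.argmin(dists))

rng = np.random.default_rng(1)
# Two simulated animals with distinct pixel-intensity profiles.
animal_a = rng.beta(2, 5, size=(20, 20))   # darker on average
animal_b = rng.beta(5, 2, size=(20, 20))   # brighter on average
refs = [fingerprint(animal_a), fingerprint(animal_b)]

# Fresh frames of each animal are matched back to the right identity,
# so an occlusion or crossing cannot permanently swap labels.
print(identify(rng.beta(2, 5, size=(20, 20)), refs))  # expected 0
print(identify(rng.beta(5, 2, size=(20, 20)), refs))  # expected 1
```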

              JAABA: interactive machine learning for automatic annotation of animal behavior.

              We present a machine learning-based system for automatically computing interpretable, quantitative measures of animal behavior. Through our interactive system, users encode their intuition about behavior by annotating a small set of video frames. These manual labels are converted into classifiers that can automatically annotate behaviors in screen-scale data sets. Our general-purpose system can create a variety of accurate individual and social behavior classifiers for different organisms, including mice and adult and larval Drosophila.
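The label-a-few-frames workflow can be reduced to a drastic simplification: here, invented motion features and a nearest-centroid classifier stand in for JAABA's per-frame features and boosted classifiers, but the shape of the loop — a handful of manual labels becomes an automatic annotator — is the same.

```python
import numpy as np

# Hypothetical per-frame features (speed, turn rate) for a few frames the
# user has labeled interactively, standing in for JAABA's annotations.
rng = np.random.default_rng(2)
walking = rng.normal([5.0, 0.2], 0.5, size=(4, 2))   # frames labeled "walk"
resting = rng.normal([0.5, 0.1], 0.2, size=(4, 2))   # frames labeled "rest"

# A nearest-centroid classifier is a minimal stand-in for the boosted
# classifiers JAABA actually trains from such labels.
centroids = {"walk": walking.mean(axis=0), "rest": resting.mean(axis=0)}

def annotate(frame_features):
    """Label an unlabeled frame by its nearest behavior centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(frame_features - centroids[c]))

print(annotate(np.array([4.5, 0.3])))  # near the walking centroid
print(annotate(np.array([0.4, 0.0])))  # near the resting centroid
```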

                Author and article information

                Contributors
                manusmad@jhu.edu
                Journal
Sci Rep
Scientific Reports
Nature Publishing Group UK (London)
                2045-2322
11 April 2018
Volume 8, Article 5830 (2018)
                Affiliations
[1] ISNI 0000 0001 2171 9311, GRID grid.21107.35, Mind/Brain Institute, Johns Hopkins University, Baltimore, Maryland, USA
[2] ISNI 0000 0001 2171 9311, GRID grid.21107.35, Mechanical Engineering Department, Johns Hopkins University, Baltimore, Maryland, USA
[3] ISNI 0000 0001 2166 4955, GRID grid.260896.3, Federated Department of Biological Sciences, New Jersey Institute of Technology, Newark, New Jersey, USA
                Author information
                http://orcid.org/0000-0001-5317-8492
                http://orcid.org/0000-0003-4321-6816
                http://orcid.org/0000-0003-2502-3770
                Article
                24035
                10.1038/s41598-018-24035-5
                5895713
                29643472
                © The Author(s) 2018

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

                History
                : 17 May 2017
                : 22 March 2018
                Categories
                Article
