
      Chimpanzee face recognition from videos in the wild using deep learning


Abstract

Wild ape face recognition using artificial intelligence opens the way for fully automated analysis of large-scale video datasets.

          Video recording is now ubiquitous in the study of animal behavior, but its analysis on a large scale is prohibited by the time and resources needed to manually process large volumes of data. We present a deep convolutional neural network (CNN) approach that provides a fully automated pipeline for face detection, tracking, and recognition of wild chimpanzees from long-term video records. In a 14-year dataset yielding 10 million face images from 23 individuals over 50 hours of footage, we obtained an overall accuracy of 92.5% for identity recognition and 96.2% for sex recognition. Using the identified faces, we generated co-occurrence matrices to trace changes in the social network structure of an aging population. The tools we developed enable easy processing and annotation of video datasets, including those from other species. Such automated analysis unveils the future potential of large-scale longitudinal video archives to address fundamental questions in behavior and conservation.
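The co-occurrence step described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' pipeline: it assumes face recognition has already produced a set of identity labels for each video frame, and the function and individual names are hypothetical.

```python
from itertools import combinations

def cooccurrence_matrix(frame_identities, individuals):
    """Count how often each pair of individuals is recognised in the same frame.

    frame_identities: list of sets, one per frame, of recognised identity labels.
    individuals: ordered list of all identity labels (row/column order).
    """
    index = {name: i for i, name in enumerate(individuals)}
    n = len(individuals)
    matrix = [[0] * n for _ in range(n)]
    for frame in frame_identities:
        for a, b in combinations(sorted(frame), 2):
            i, j = index[a], index[b]
            matrix[i][j] += 1
            matrix[j][i] += 1  # co-occurrence is undirected, so keep it symmetric
    return matrix

# Toy example: recognised faces in three frames (names illustrative)
frames = [{"Ai", "Ayumu"}, {"Ai", "Ayumu", "Pal"}, {"Pal"}]
m = cooccurrence_matrix(frames, ["Ai", "Ayumu", "Pal"])
```

Aggregating such matrices over time windows is one way a longitudinal video archive could be turned into a sequence of social networks.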

Most cited references (23)


          Constructing, conducting and interpreting animal social network analysis

Summary

Animal social networks are descriptions of social structure which, aside from their intrinsic interest for understanding sociality, can have significant bearing across many fields of biology. Network analysis provides a flexible toolbox for testing a broad range of hypotheses, and for describing the social system of species or populations in a quantitative and comparable manner. However, it requires careful consideration of underlying assumptions, in particular differentiating real from observed networks and controlling for inherent biases that are common in social data.

We provide a practical guide for using this framework to analyse animal social systems and test hypotheses. First, we discuss key considerations when defining nodes and edges, and when designing methods for collecting data. We discuss different approaches for inferring social networks from these data and displaying them. We then provide an overview of methods for quantifying properties of nodes and networks, as well as for testing hypotheses concerning network structure and network processes. Finally, we provide information about assessing the power and accuracy of an observed network.

Alongside this manuscript, we provide appendices containing background information on common programming routines and worked examples of how to perform network analysis using the R programming language. We conclude by discussing some of the major current challenges in social network analysis and interesting future directions. In particular, we highlight the under-exploited potential of experimental manipulations on social networks to address research questions.
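The reference's worked examples are in R; purely as a language-neutral illustration of one of the simplest node properties it covers, the Python sketch below computes weighted degree ("strength") from an undirected weighted edge list, where weights might come from a co-occurrence count or an association index. All names and values are illustrative.

```python
def node_strength(edges):
    """Weighted degree (strength) of each node in an undirected weighted network.

    edges: iterable of (node_a, node_b, weight) tuples; each edge contributes
    its weight to both endpoints.
    """
    strength = {}
    for a, b, w in edges:
        strength[a] = strength.get(a, 0) + w
        strength[b] = strength.get(b, 0) + w
    return strength

# Toy edge list: weights could be co-occurrence counts between individuals
edges = [("A", "B", 3), ("A", "C", 1), ("B", "C", 2)]
s = node_strength(edges)
```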

            idTracker: tracking individuals in a group by automatic identification of unmarked animals.

            Animals in groups touch each other, move in paths that cross, and interact in complex ways. Current video tracking methods sometimes switch identities of unmarked individuals during these interactions. These errors propagate and result in random assignments after a few minutes unless manually corrected. We present idTracker, a multitracking algorithm that extracts a characteristic fingerprint from each animal in a video recording of a group. It then uses these fingerprints to identify every individual throughout the video. Tracking by identification prevents propagation of errors, and the correct identities can be maintained indefinitely. idTracker distinguishes animals even when humans cannot, such as for size-matched siblings, and reidentifies animals after they temporarily disappear from view or across different videos. It is robust, easy to use and general. We tested it on fish (Danio rerio and Oryzias latipes), flies (Drosophila melanogaster), ants (Messor structor) and mice (Mus musculus).
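idTracker extracts its own characteristic fingerprints; the sketch below illustrates only the general tracking-by-identification idea the abstract describes, assigning each new detection to the nearest stored reference fingerprint. The names and toy 2-D feature vectors are hypothetical, not idTracker's actual representation.

```python
import math

def identify(fingerprint, references):
    """Assign an identity by nearest reference fingerprint (Euclidean distance).

    references: dict mapping identity label -> feature vector of the same
    length as `fingerprint`.
    """
    def dist(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    return min(references, key=lambda name: dist(fingerprint, references[name]))

# Toy reference fingerprints for two individuals
refs = {"fish1": [0.9, 0.1], "fish2": [0.2, 0.8]}
label = identify([0.85, 0.2], refs)
```

Because every detection is matched against stored references rather than chained frame to frame, an identity switch in one frame does not propagate to later frames.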

              Return of the Devil in the Details: Delving Deep into Convolutional Nets


                Author and article information

Journal
Science Advances (Sci Adv)
American Association for the Advancement of Science
ISSN: 2375-2548
Published: 04 September 2019
Volume 5, Issue 9, eaaw0736
                Affiliations
                [1 ]Primate Models for Behavioural Evolution Lab, Institute of Cognitive and Evolutionary Anthropology, University of Oxford, Oxford, UK.
                [2 ]Visual Geometry Group, Department of Engineering Science, University of Oxford, Oxford, UK.
                [3 ]Primate Research Institute, Kyoto University, Inuyama, Japan.
                [4 ]Department of Zoology, University of Oxford, Oxford, UK.
                [5 ]Gorongosa National Park, Sofala, Mozambique.
                [6 ]Interdisciplinary Center for Archaeology and Evolution of Human Behaviour (ICArEHB), Universidade do Algarve, Faro, Portugal.
                [7 ]Centre for Functional Ecology–Science for People & the Planet, Universidade de Coimbra, Coimbra, Portugal.
                Author notes
[* ]Corresponding author. Email: daniel.schofield@anthro.ox.ac.uk (D.S.); arsha@robots.ox.ac.uk (A.N.)
[†] These authors contributed equally to this work.

                Author information
                http://orcid.org/0000-0002-3308-0209
                http://orcid.org/0000-0003-2190-9013
                http://orcid.org/0000-0002-8945-8573
                http://orcid.org/0000-0001-7289-6414
                http://orcid.org/0000-0003-4542-3720
Article
aaw0736
DOI: 10.1126/sciadv.aaw0736
PMCID: 6726454
PMID: 31517043
                Copyright © 2019 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works. Distributed under a Creative Commons Attribution NonCommercial License 4.0 (CC BY-NC).

                This is an open-access article distributed under the terms of the Creative Commons Attribution-NonCommercial license, which permits use, distribution, and reproduction in any medium, so long as the resultant use is not for commercial advantage and provided the original work is properly cited.

History
Received: 15 November 2018
Accepted: 02 August 2019
                Funding
Funded by: Leverhulme Trust, http://dx.doi.org/10.13039/501100000275
Award IDs: PLP-2016-114; MEXT-JSPS #16H06283; LGP-U04
Funded by: Japan Society for the Promotion of Science, http://dx.doi.org/10.13039/501100001691
                Funded by: EPSRC;
                Award ID: EP/M013774/1
                Funded by: Google PhD fellowship;
                Funded by: Clarendon Fund;
                Funded by: Boise Trust Fund;
Categories
Research Article
Research Methods
                Custom metadata
                Judith Urtula
