
      Initial validation of an intelligent video surveillance system for automatic detection of dairy cattle lameness

      research-article


          Abstract

          Introduction

Lameness is a major welfare challenge facing the dairy industry worldwide. Monitoring herd lameness prevalence, early detection, and therapeutic intervention are important aspects of lameness control in dairy herds. The objective of this study was to evaluate the performance of a commercially available video surveillance system for automatic detection of dairy cattle lameness (CattleEye Ltd).

          Methods

This was achieved by first measuring mobility score agreement between CattleEye and two veterinarians (Assessor 1 and Assessor 2), and second, by investigating the ability of the CattleEye system to detect cows with potentially painful foot lesions. We analysed 6,040 mobility scores collected from three dairy farms. Inter-rater agreement was estimated by calculating percentage agreement (PA), Cohen's kappa (κ) and Gwet's agreement coefficient (AC). Data regarding the presence of foot lesions were also available for a subset of this dataset. The ability of the system to predict the presence of potentially painful foot lesions was tested against that of Assessor 1 by calculating measures of accuracy, using lesion records from the foot trimming sessions as reference.
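The agreement statistics named above follow standard definitions, and a minimal sketch may help readers unfamiliar with them. The mobility scores below are illustrative only, not data from this study:

```python
from collections import Counter

def percentage_agreement(scores_a, scores_b):
    """Fraction of subjects on which the two raters assign the same score."""
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

def cohens_kappa(scores_a, scores_b):
    """Cohen's kappa: agreement corrected for chance, where chance
    agreement is estimated from each rater's own marginal frequencies."""
    n = len(scores_a)
    po = percentage_agreement(scores_a, scores_b)   # observed agreement
    freq_a, freq_b = Counter(scores_a), Counter(scores_b)
    categories = set(scores_a) | set(scores_b)
    # Expected agreement if the two raters scored independently
    pe = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (po - pe) / (1 - pe)

# Illustrative mobility scores on the 0-3 AHDB scale (not study data)
a = [0, 0, 1, 2, 0, 3, 1, 0, 2, 0]
b = [0, 1, 1, 2, 0, 3, 0, 0, 2, 0]
print(percentage_agreement(a, b))        # 0.8
print(round(cohens_kappa(a, b), 3))      # 0.697
```

Note how kappa is well below the raw 80% agreement: the chance-correction penalises agreement that could arise from both raters favouring the same common category, which is exactly the prevalence effect discussed in the cited references below.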

          Results

In general, inter-rater agreement between CattleEye and either human assessor was strong and similar to that between the human assessors, with PA and AC being consistently above 80% and 0.80, respectively. Kappa agreement between CattleEye and the human scorers was in line with previous studies (investigating agreement between human assessors) and within the fair to moderate agreement range. The system was more sensitive than Assessor 1 in identifying cows with potentially painful lesions, with 0.52 sensitivity and 0.81 specificity compared to Assessor 1's 0.29 and 0.89, respectively.
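The accuracy measures reported above can be sketched as follows; the lesion labels are illustrative, not study data. Sensitivity is the proportion of truly lesioned cows flagged, specificity the proportion of lesion-free cows correctly passed:

```python
def sensitivity_specificity(predicted, actual):
    """predicted/actual: booleans per cow, True = painful lesion present
    (actual taken from the foot-trimming records used as reference)."""
    tp = sum(p and a for p, a in zip(predicted, actual))         # true positives
    tn = sum(not p and not a for p, a in zip(predicted, actual)) # true negatives
    fn = sum(not p and a for p, a in zip(predicted, actual))     # missed lesions
    fp = sum(p and not a for p, a in zip(predicted, actual))     # false alarms
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative toy data: 4 cows with lesions, 6 without
actual    = [True, True, True, True, False, False, False, False, False, False]
predicted = [True, True, False, False, False, False, False, False, True, False]
sens, spec = sensitivity_specificity(predicted, actual)
print(sens)   # 0.5
```

On this toy data the detector catches half the lesioned cows (sensitivity 0.5) while raising one false alarm among six healthy cows (specificity 5/6).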

          Discussion

This pilot study showed that the CattleEye system achieved scores comparable to those of two experienced veterinarians and was more sensitive than a trained veterinarian in detecting painful foot lesions.

          Related collections

Most cited references (26)


          The Measurement of Observer Agreement for Categorical Data


            Computing inter-rater reliability and its variance in the presence of high agreement.

            Pi (pi) and kappa (kappa) statistics are widely used in the areas of psychiatry and psychological testing to compute the extent of agreement between raters on nominally scaled data. It is a fact that these coefficients occasionally yield unexpected results in situations known as the paradoxes of kappa. This paper explores the origin of these limitations, and introduces an alternative and more stable agreement coefficient referred to as the AC1 coefficient. Also proposed are new variance estimators for the multiple-rater generalized pi and AC1 statistics, whose validity does not depend upon the hypothesis of independence between raters. This is an improvement over existing alternative variances, which depend on the independence assumption. A Monte-Carlo simulation study demonstrates the validity of these variance estimators for confidence interval construction, and confirms the value of AC1 as an improved alternative to existing inter-rater reliability statistics.
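The AC1 coefficient described in this abstract can be sketched for two raters; the scores below are illustrative, and the chance-agreement term follows Gwet's published definition (based on the average of the two raters' marginal proportions):

```python
def gwets_ac1(scores_a, scores_b):
    """Gwet's AC1: chance-corrected agreement that stays stable
    when one category dominates (the 'paradox of kappa' situations)."""
    n = len(scores_a)
    categories = sorted(set(scores_a) | set(scores_b))
    k = len(categories)
    po = sum(a == b for a, b in zip(scores_a, scores_b)) / n
    # pi_c: mean of the two raters' marginal proportions for category c
    pi = [(scores_a.count(c) + scores_b.count(c)) / (2 * n) for c in categories]
    # Gwet's chance-agreement probability
    pe = sum(p * (1 - p) for p in pi) / (k - 1)
    return (po - pe) / (1 - pe)

# Same illustrative 0-3 mobility scores as above (not study data)
a = [0, 0, 1, 2, 0, 3, 1, 0, 2, 0]
b = [0, 1, 1, 2, 0, 3, 0, 0, 2, 0]
print(round(gwets_ac1(a, b), 3))   # 0.744
```

On the same toy scores AC1 (≈0.74) sits above Cohen's kappa (≈0.70), consistent with the study's finding that AC values ran higher than kappa for the same rater pairs.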

              Bias, prevalence and kappa

              Since the introduction of Cohen's kappa as a chance-adjusted measure of agreement between two observers, several "paradoxes" in its interpretation have been pointed out. The difficulties occur because kappa not only measures agreement but is also affected in complex ways by the presence of bias between observers and by the distributions of data across the categories that are used ("prevalence"). In this paper, new indices that provide independent measures of bias and prevalence, as well as of observed agreement, are defined and a simple formula is derived that expresses kappa in terms of these three indices. When comparisons are made between agreement studies it can be misleading to report kappa values alone, and it is recommended that researchers also include quantitative indicators of bias and prevalence.
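The bias and prevalence indices proposed in this paper, together with the prevalence- and bias-adjusted kappa (PABAK) built on them, can be sketched for a 2×2 agreement table; the cell counts below are illustrative:

```python
def bias_prevalence_indices(a, b, c, d):
    """2x2 agreement table for two raters on a binary trait:
    a = both rate positive, d = both rate negative,
    b, c = the two discordant cells (each rater positive alone)."""
    n = a + b + c + d
    po = (a + d) / n                 # observed agreement
    bias_index = (b - c) / n         # difference in the raters' positive rates
    prevalence_index = (a - d) / n   # imbalance between the two categories
    pabak = 2 * po - 1               # prevalence- and bias-adjusted kappa
    return bias_index, prevalence_index, pabak

# Illustrative table: 40 agree-positive, 45 agree-negative, 10 + 5 discordant
bi, pi, pabak = bias_prevalence_indices(40, 10, 5, 45)
print(bi, pi, pabak)
```

Reporting these three numbers alongside kappa, as the paper recommends, shows whether a low kappa reflects genuine disagreement or merely skewed prevalence or rater bias.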

                Author and article information

Journal
Frontiers in Veterinary Science (Front. Vet. Sci.)
Frontiers Media S.A.
ISSN: 2297-1769
Published: 13 June 2023
Volume: 10
Article: 1111057
                Affiliations
Department of Livestock and One Health, Institute of Infection, Veterinary and Ecological Sciences, University of Liverpool, Leahurst Campus, Chester, United Kingdom
                Author notes

                Edited by: Alasdair James Charles Cook, University of Surrey, United Kingdom

                Reviewed by: Yi-Chun Lin, National Chung Hsing University, Taiwan; Andre Desrochers, Montreal University, Canada

*Correspondence: Georgios Oikonomou, goikon@liv.ac.uk
Article
DOI: 10.3389/fvets.2023.1111057
PMCID: PMC10299827
PMID: 37383350
                Copyright © 2023 Anagnostopoulos, Griffiths, Siachos, Neary, Smith and Oikonomou.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 23 December 2022
Accepted: 25 May 2023
                Page count
                Figures: 1, Tables: 4, Equations: 0, References: 28, Pages: 8, Words: 5468
                Categories
                Veterinary Science
                Original Research
                Custom metadata
                Comparative and Clinical Medicine

cattle lameness, automated system, foot lesions, mobility scoring, artificial intelligence
