
      Two-dimensional video-based analysis of human gait using pose estimation

Research article


          Abstract

Human gait analysis is often conducted in clinical and basic research, but many common approaches (e.g., three-dimensional motion capture, wearables) are expensive, immobile, data-limited, and require expertise. Recent advances in video-based pose estimation suggest potential for gait analysis using two-dimensional video collected from readily accessible devices (e.g., smartphones). To date, several studies have extracted features of human gait using markerless pose estimation. However, we currently lack evaluation of video-based approaches using a dataset of human gait for a wide range of gait parameters on a stride-by-stride basis and a workflow for performing gait analysis from video. Here, we compared spatiotemporal and sagittal kinematic gait parameters measured with OpenPose (open-source video-based human pose estimation) against simultaneously recorded three-dimensional motion capture from overground walking of healthy adults. When assessing all individual steps in the walking bouts, we observed mean absolute errors between motion capture and OpenPose of 0.02 s for temporal gait parameters (i.e., step time, stance time, swing time and double support time) and 0.049 m for step lengths. Accuracy improved when spatiotemporal gait parameters were calculated as individual participant mean values: mean absolute error was 0.01 s for temporal gait parameters and 0.018 m for step lengths. The greatest difference in gait speed between motion capture and OpenPose was less than 0.10 m s⁻¹. Mean absolute errors of sagittal plane hip, knee and ankle angles between motion capture and OpenPose were 4.0°, 5.6° and 7.4°, respectively. Our analysis workflow is freely available, involves minimal user input, and does not require prior gait analysis expertise. Finally, we offer suggestions and considerations for future applications of pose estimation for human gait analysis.
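As a concrete illustration of how spatiotemporal parameters such as step time and step length can be derived from 2D keypoint trajectories, the sketch below detects heel strikes as peaks of the ankle's forward excursion relative to the hip and converts frame counts and pixel distances into seconds and metres. This is a minimal sketch, not the authors' released workflow; the event-detection heuristic, frame rate, and pixel-to-metre scale factor are assumptions chosen for illustration.

```python
# Minimal sketch: spatiotemporal gait parameters from 2D keypoint trajectories.
# Assumptions (not the authors' exact method): heel strikes are local maxima of
# the ankle position relative to the hip in the walking direction, the video is
# sampled at FPS frames per second, and PX_PER_M converts pixels to metres.
import numpy as np
from scipy.signal import find_peaks

FPS = 30.0         # assumed video frame rate
PX_PER_M = 500.0   # assumed scale, e.g., calibrated from a known distance in the scene

def heel_strikes(ankle_x, hip_x):
    """Frame indices of heel strikes: peaks of ankle position ahead of the hip."""
    rel = ankle_x - hip_x                                 # forward excursion of the foot
    peaks, _ = find_peaks(rel, distance=int(0.5 * FPS))   # events at least 0.5 s apart
    return peaks

def step_parameters(left_ankle_x, right_ankle_x, hip_x):
    """Step times (s) and step lengths (m) from alternating left/right heel strikes."""
    events = sorted(
        [(f, "L") for f in heel_strikes(left_ankle_x, hip_x)]
        + [(f, "R") for f in heel_strikes(right_ankle_x, hip_x)]
    )
    step_times, step_lengths = [], []
    for (f0, s0), (f1, s1) in zip(events, events[1:]):
        if s0 == s1:                                      # skip missed alternations
            continue
        step_times.append((f1 - f0) / FPS)
        lead = left_ankle_x if s1 == "L" else right_ankle_x
        trail = right_ankle_x if s1 == "L" else left_ankle_x
        step_lengths.append(abs(lead[f1] - trail[f1]) / PX_PER_M)
    return np.array(step_times), np.array(step_lengths)
```

Given matching step sequences from motion capture, the per-step mean absolute error reported above would then simply be np.mean(np.abs(openpose_values - mocap_values)).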

          Author summary

There is a growing interest among clinicians and researchers in using novel pose estimation algorithms that automatically track human movement to analyze human gait. Gait analysis is routinely conducted in designated laboratories with specialized equipment. On the other hand, pose estimation relies on digital videos that can be recorded with household devices such as a smartphone. As a result, these new techniques make it possible to move beyond the laboratory and perform gait analysis in other settings such as the home or clinic. Before such techniques are adopted, we identify a critical need to compare outcome parameters against three-dimensional motion capture and to evaluate how camera viewpoint affects outcome parameters. We used simultaneous motion capture and left- and right-side video recordings of healthy human gait and calculated spatiotemporal gait parameters and lower-limb joint angles. We find that our provided workflow estimates spatiotemporal gait parameters together with hip and knee angles with the accuracy and precision needed to detect changes in the gait pattern. We demonstrate that the position of the participant relative to the camera affects spatial measures such as step length and discuss the limitations posed by the current approach.

          Related collections

Most cited references (37)

          DeepLabCut: markerless pose estimation of user-defined body parts with deep learning

          Quantifying behavior is crucial for many applications in neuroscience. Videography provides easy methods for the observation and recording of animal behavior in diverse settings, yet extracting particular aspects of a behavior for further analysis can be highly time consuming. In motor control studies, humans or other animals are often marked with reflective markers to assist with computer-based tracking, but markers are intrusive, and the number and location of the markers must be determined a priori. Here we present an efficient method for markerless pose estimation based on transfer learning with deep neural networks that achieves excellent results with minimal training data. We demonstrate the versatility of this framework by tracking various body parts in multiple species across a broad collection of behaviors. Remarkably, even when only a small number of frames are labeled (~200), the algorithm achieves excellent tracking performance on test frames that is comparable to human accuracy.
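For orientation, a typical DeepLabCut session follows the steps below. The function names come from the deeplabcut Python package, but exact arguments vary between versions, and the project name and video path are placeholders; treat this as an outline rather than a recipe.

```python
# Sketch of the typical DeepLabCut workflow: transfer learning from a small set
# of manually labeled frames, then markerless tracking of new videos.
import deeplabcut

config = deeplabcut.create_new_project(
    "gait-demo", "experimenter", ["videos/walk_trial.mp4"], copy_videos=True
)

deeplabcut.extract_frames(config)             # select ~100-200 frames to label
deeplabcut.label_frames(config)               # manually label body parts (GUI)
deeplabcut.create_training_dataset(config)
deeplabcut.train_network(config)              # fine-tune a pretrained network
deeplabcut.evaluate_network(config)
deeplabcut.analyze_videos(config, ["videos/walk_trial.mp4"])  # write tracked keypoints
```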

            Using DeepLabCut for 3D markerless pose estimation across species and behaviors

              OpenPose: Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields

              Realtime multi-person 2D pose estimation is a key component in enabling machines to have an understanding of people in images and videos. In this work, we present a realtime approach to detect the 2D pose of multiple people in an image. The proposed method uses a nonparametric representation, which we refer to as Part Affinity Fields (PAFs), to learn to associate body parts with individuals in the image. This bottom-up system achieves high accuracy and realtime performance, regardless of the number of people in the image. In previous work, PAFs and body part location estimation were refined simultaneously across training stages. We demonstrate that a PAF-only refinement rather than both PAF and body part location refinement results in a substantial increase in both runtime performance and accuracy. We also present the first combined body and foot keypoint detector, based on an internal annotated foot dataset that we have publicly released. We show that the combined detector not only reduces the inference time compared to running them sequentially, but also maintains the accuracy of each component individually. This work has culminated in the release of OpenPose, the first open-source realtime system for multi-person 2D pose detection, including body, foot, hand, and facial keypoints.
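To show what working with this output looks like in practice, the sketch below runs the OpenPose demo on a video and loads the per-frame JSON keypoints (BODY_25 model: 25 keypoints flattened as x, y, confidence) into a NumPy array. Paths and file names are illustrative, and only the first detected person in each frame is read.

```python
# Sketch: read 2D keypoints from OpenPose JSON output (one file per frame).
#
# Example run (shell), using OpenPose's documented --write_json output:
#   ./build/examples/openpose/openpose.bin --video walk.mp4 \
#       --write_json keypoints/ --display 0 --render_pose 0
import json
from pathlib import Path
import numpy as np

def load_keypoints(json_dir):
    """Return an array of shape (n_frames, 25, 3) with x, y, confidence per keypoint."""
    frames = []
    for path in sorted(Path(json_dir).glob("*_keypoints.json")):
        data = json.loads(path.read_text())
        if data["people"]:
            kp = np.array(data["people"][0]["pose_keypoints_2d"]).reshape(-1, 3)
        else:
            kp = np.full((25, 3), np.nan)   # no person detected in this frame
        frames.append(kp)
    return np.stack(frames)
```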

                Author and article information

                Contributors
Roles: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Resources, Software, Validation, Visualization, Writing – original draft, Writing – review & editing
Roles: Conceptualization, Investigation, Methodology, Writing – review & editing
Roles: Conceptualization, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Visualization, Writing – original draft, Writing – review & editing
                Role: Editor
Journal
PLoS Computational Biology (PLoS Comput Biol)
Publisher: Public Library of Science (San Francisco, CA, USA)
ISSN: 1553-734X, 1553-7358
Published: 23 April 2021 (April 2021)
Volume: 17; Issue: 4; Article number: e1008935
                Affiliations
[1] Center for Movement Studies, Kennedy Krieger Institute, Baltimore, Maryland, United States of America
[2] Department of Physical Medicine and Rehabilitation, Johns Hopkins University School of Medicine, Baltimore, Maryland, United States of America
[3] Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland, United States of America
Hebrew University of Jerusalem, Israel
                Author notes

                The authors have declared that no competing interests exist.

                Author information
                https://orcid.org/0000-0002-0088-8703
                https://orcid.org/0000-0001-7883-1945
                https://orcid.org/0000-0003-0797-6455
Article
Manuscript: PCOMPBIOL-D-20-01556
DOI: 10.1371/journal.pcbi.1008935
PMC: 8099131
PMID: 33891585
                © 2021 Stenum et al

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

History
Received: 27 August 2020
Accepted: 1 April 2021
                Page count
                Figures: 8, Tables: 7, Pages: 26
                Funding
Funded by: National Institutes of Health (funder ID: http://dx.doi.org/10.13039/100000002)
Award ID: R21AG059184
                This study was funded by NIH grant R21AG059184 to RTR. The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Categories
Research Article
Subject areas: Gait Analysis; Walking; Biological Locomotion; Video Recording; Cameras; Legs; Ankles; Hip; Knees; Knee Joints; Skeletal Joints
Data availability
All data files are available from http://bytom.pja.edu.pl/projekty/hm-gpjatk/. Software is available from https://github.com/janstenum/GaitAnalysis-PoseEstimation.

Quantitative & Systems biology
