Quantifying normal and parkinsonian gait features from home movies: Practical application of a deep learning-based 2D pose estimator.

  • Additional Information
    • Source:
      Publisher: Public Library of Science
      Country of Publication: United States
      NLM ID: 101285081
      Publication Model: eCollection
      Cited Medium: Internet
      ISSN: 1932-6203 (Electronic)
      Linking ISSN: 1932-6203
      NLM ISO Abbreviation: PLoS One
      Subsets: MEDLINE
    • Publication Information:
      Original Publication: San Francisco, CA: Public Library of Science
    • Abstract:
      Objective: Gait movies recorded in daily clinical practice are usually not filmed with dedicated devices, which prevents neurologists from leveraging gait analysis technologies. Here we propose a novel unsupervised approach to quantifying gait features and extracting cadence from normal and parkinsonian gait movies recorded with a home video camera, by applying OpenPose, a deep learning-based 2D pose estimator that can obtain joint coordinates from pictures or videos recorded with a monocular camera.
      Methods: Our proposed method consisted of two distinct phases: obtaining sequential gait features from movies by extracting body-joint coordinates with OpenPose, and estimating the cadence of periodic gait steps from those sequential features using a short-time pitch detection approach.
      Results: Cadence estimation of gait viewed in the coronal plane (frontally viewed gait), as frequently filmed in the daily clinical setting, was successfully performed on normal gait movies using the short-time autocorrelation function (ST-ACF). In cases of parkinsonian gait with prominent freezing of gait and involuntary oscillations, we quantified the periodicity of each gait sequence using ACF-based statistical distance metrics; this metric corresponded clearly with the subjects' baseline disease status (illustrative sketches of both ideas follow the abstract).
      Conclusion: The proposed method allows us to analyze, in a completely data-driven manner, gait movies that have been underutilized to date, and may broaden the range of movies for which gait analysis can be conducted.
      Competing Interests: The authors have declared that no competing interests exist.
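    • Code Sketch (ST-ACF cadence estimation):
      A minimal sketch, in Python, of the windowed-autocorrelation cadence estimation described in the Methods and Results: window a keypoint-derived gait signal, compute each window's autocorrelation, and convert the dominant lag into steps per minute. The choice of signal (horizontal inter-ankle distance), the 30 fps frame rate, and the 0.5-3 Hz step-rate bounds are illustrative assumptions, not details taken from the article.

```python
# A minimal sketch of short-time-ACF cadence estimation from a pose signal.
# Assumptions (illustrative, not from the article): the input signal is the
# horizontal inter-ankle distance extracted per frame (e.g. from OpenPose
# keypoints), the video runs at 30 fps, and plausible step rates lie
# between 0.5 and 3 steps per second.
import numpy as np

FPS = 30  # assumed home-video frame rate


def short_time_cadence(signal, fps=FPS, win_sec=3.0, hop_sec=0.5,
                       min_hz=0.5, max_hz=3.0):
    """Estimate cadence (steps/min) per window from autocorrelation peaks."""
    signal = np.asarray(signal, dtype=float)
    win, hop = int(win_sec * fps), int(hop_sec * fps)
    cadences = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = signal[start:start + win]
        seg = seg - seg.mean()                                # remove DC offset
        acf = np.correlate(seg, seg, mode="full")[win - 1:]   # lags 0..win-1
        acf /= acf[0] + 1e-12                                 # normalize: acf[0] == 1
        lo, hi = int(fps / max_hz), int(fps / min_hz)         # plausible step-period lags
        lag = lo + int(np.argmax(acf[lo:hi]))                 # dominant periodic lag
        cadences.append(60.0 * fps / lag)                     # convert to steps/min
    return np.array(cadences)


# Synthetic check: a 2 Hz oscillation should yield roughly 120 steps/min.
t = np.arange(0, 10, 1 / FPS)
toy_signal = np.sin(2 * np.pi * 2.0 * t) + 0.05 * np.random.randn(t.size)
print(short_time_cadence(toy_signal).round(1))
```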
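    • Code Sketch (ACF-based periodicity distance):
      The Results mention quantifying periodicity with ACF-based statistical distance metrics, but the exact metric is not given in this record. One plausible reading, sketched below under that assumption, is a Euclidean distance between the normalized ACF of each analysis window and that of a reference window taken from regular gait, so that freezing episodes or irregular oscillations appear as large distances.

```python
# One plausible (assumed) ACF-based distance for gait-periodicity scoring:
# Euclidean distance between the normalized ACF of a test window and that of
# a reference window taken from regular gait. Larger distances indicate a
# departure from the reference periodicity, e.g. during freezing of gait.
import numpy as np


def normalized_acf(seg, max_lag):
    """Autocorrelation for lags 0..max_lag-1, scaled so acf[0] == 1.

    Windows should be at least max_lag samples long so all ACFs have equal length.
    """
    seg = np.asarray(seg, dtype=float)
    seg = seg - seg.mean()
    full = np.correlate(seg, seg, mode="full")[len(seg) - 1:]  # non-negative lags
    acf = full[:max_lag]
    return acf / (acf[0] + 1e-12)


def acf_distance(test_window, reference_window, max_lag=60):
    """Euclidean distance between the ACFs of two equally long windows."""
    return float(np.linalg.norm(normalized_acf(test_window, max_lag)
                                - normalized_acf(reference_window, max_lag)))
```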
    • Publication Date:
      Date Created: 20191115
      Date Completed: 20200319
      Latest Revision: 20200319
      20240105
    • PMCID:
      PMC6855634
    • DOI:
      10.1371/journal.pone.0223549
    • PMID:
      31725754