
      Wearable Technology to Detect Motor Fluctuations in Parkinson’s Disease Patients: Current State and Challenges

      Review article


          Abstract

          Monitoring of motor symptom fluctuations in Parkinson’s disease (PD) patients is currently performed through the subjective self-assessment of patients. Clinicians require reliable information about a fluctuation’s occurrence to enable a precise treatment rescheduling and dosing adjustment. In this review, we analyzed the utilization of sensors for identifying motor fluctuations in PD patients and the application of machine learning techniques to detect fluctuations. The review process followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Ten studies were included between January 2010 and March 2021, and their main characteristics and results were assessed and documented. Five studies utilized daily activities to collect the data, four used concrete scenarios executing specific activities to gather the data, and only one utilized a combination of both situations. The accuracy for classification was 83.56–96.77%. In the studies evaluated, it was not possible to find a standard cleaning protocol for the signal captured, and there is significant heterogeneity in the models utilized and in the different features introduced in the models (using spatiotemporal characteristics, frequential characteristics, or both). The two most influential factors in the good performance of the classification problem are the type of features utilized and the type of model.
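The feature families named in the abstract (spatiotemporal versus frequential) can be made concrete with a minimal sketch. Everything below is an illustrative assumption — the window length, sampling rate, and exact feature set are not taken from any of the reviewed studies, which is precisely the heterogeneity the review documents.

```python
import math

def extract_features(window, fs=50.0):
    """Illustrative features for one wearable-accelerometer window.

    Hypothetical example: the sampling rate (fs) and the feature set
    are assumptions for illustration, not from any reviewed study.
    """
    n = len(window)
    mean = sum(window) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in window) / n)
    rms = math.sqrt(sum(x * x for x in window) / n)
    # Zero-crossing rate of the detrended signal: a crude frequential
    # proxy (each full oscillation cycle crosses zero twice).
    centred = [x - mean for x in window]
    crossings = sum(1 for a, b in zip(centred, centred[1:]) if a * b < 0)
    return {
        "mean": mean,                              # spatiotemporal
        "std": std,                                # spatiotemporal
        "rms": rms,                                # spatiotemporal
        "est_freq_hz": crossings * fs / (2 * n),   # frequential
    }
```

One such feature vector per window would feed the classifier; the studies in the review differ mainly in which of these feature families they use and which model consumes them.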

          Related collections

          Most cited references (41)


          Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement

          Systematic reviews should build on a protocol that describes the rationale, hypothesis, and planned methods of the review; few reviews report whether a protocol exists. Detailed, well-described protocols can facilitate the understanding and appraisal of the review methods, as well as the detection of modifications to methods and selective reporting in completed reviews. We describe the development of a reporting guideline, the Preferred Reporting Items for Systematic reviews and Meta-Analyses for Protocols 2015 (PRISMA-P 2015). PRISMA-P consists of a 17-item checklist intended to facilitate the preparation and reporting of a robust protocol for the systematic review. Funders and those commissioning reviews might consider mandating the use of the checklist to facilitate the submission of relevant protocol information in funding applications. Similarly, peer reviewers and editors can use the guidance to gauge the completeness and transparency of a systematic review protocol submitted for publication in a journal or other medium.

            Interrater reliability: the kappa statistic

            The kappa statistic is frequently used to test interrater reliability. The importance of rater reliability lies in the fact that it represents the extent to which the data collected in the study are correct representations of the variables measured. Measurement of the extent to which data collectors (raters) assign the same score to the same variable is called interrater reliability. While there have been a variety of methods to measure interrater reliability, traditionally it was measured as percent agreement, calculated as the number of agreement scores divided by the total number of scores. In 1960, Jacob Cohen critiqued use of percent agreement due to its inability to account for chance agreement. He introduced Cohen’s kappa, developed to account for the possibility that raters actually guess on at least some variables due to uncertainty. Like most correlation statistics, the kappa can range from −1 to +1. While the kappa is one of the most commonly used statistics to test interrater reliability, it has limitations. Judgments about what level of kappa should be acceptable for health research are questioned. Cohen’s suggested interpretation may be too lenient for health-related studies because it implies that a score as low as 0.41 might be acceptable. Kappa and percent agreement are compared, and levels for both kappa and percent agreement that should be demanded in healthcare studies are suggested.
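The two quantities contrasted above — raw percent agreement and chance-corrected kappa — can be computed directly. The rater labels in this sketch are invented for illustration only:

```python
def percent_agreement(rater_a, rater_b):
    """Fraction of items on which both raters give the same score."""
    return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)
    # Expected chance agreement: for each category, the probability that
    # both raters pick it independently, summed over all categories.
    p_e = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n)
        for c in set(rater_a) | set(rater_b)
    )
    return (p_o - p_e) / (1 - p_e)

# Invented example labels (e.g. 1 = "fluctuation present", 0 = absent).
a = [1, 1, 0, 1, 0, 1, 1, 0]
b = [1, 0, 0, 1, 0, 1, 1, 1]
```

For these invented labels, percent agreement is 0.75 while kappa is only about 0.47, illustrating the point above: kappa discounts the agreements the two raters would reach by chance alone.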

              Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation


                Author and article information

                Contributors
                Role: Academic Editor
                Role: Academic Editor
                Journal
                Sensors (Basel, Switzerland), MDPI
                ISSN: 1424-8220
                Published: 18 June 2021 (June 2021 issue)
                Volume 21, Issue 12, Article 4188
                Affiliations
                [1 ]Programa en Ingeniería Biomédica (PhD), ETSI Telecomunicación, Universidad Politécnica de Madrid (UPM), Avenida Complutense, 30, 28040 Madrid, Spain; m.barrachina@alumnos.upm.es
                [2 ]Centro de Estudios e Innovación en Gestión del Conocimiento (CEIEC), Universidad Francisco de Vitoria, 28223 Pozuelo de Alarcón, Spain; a.maitin@ceiec.es
                [3 ]Departamento de Matemática Aplicada a las TICs, ETSI Telecomunicación, Universidad Politécnica de Madrid (UPM), Avenida Complutense, 30, 28040 Madrid, Spain
                [4 ]Facultad de Ciencias Experimentales, Universidad Francisco de Vitoria, 28223 Pozuelo de Alarcón, Spain
                [5 ]Brain Damage Unit, Hospital Beata María Ana, 28007 Madrid, Spain
                Author notes
                [* ]Correspondence: carmen.sanchez.avila@upm.es (C.S.-Á.); p.romero.prof@ufv.es (J.P.R.); Tel.: +34-910672283 (C.S.-Á.); +34-917091400 (J.P.R.)
                Author information
                https://orcid.org/0000-0002-7690-1011
                https://orcid.org/0000-0002-3190-1296
                Article
                Publisher ID: sensors-21-04188
                DOI: 10.3390/s21124188
                PMCID: 8234127
                PMID: 34207198
                © 2021 by the authors.

                Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license ( https://creativecommons.org/licenses/by/4.0/).

                History
                Received: 20 May 2021
                Accepted: 16 June 2021
                Categories
                Review

                Biomedical engineering
                Keywords: Parkinson's disease, motor fluctuations, sensors, motor symptoms, treatment
