
Reliability and validity of angle measurements using radiograph and smartphone applications: experimental research on protractor

      [Purpose] The present study aimed to demonstrate the inter-rater reliability and validity of radiograph measurements and smartphone application measurements, using definite angles provided by a digital protractor as the reference. [Subjects and Methods] The subject angles were 26 angles between 15° and 180°, selected randomly by computer. Three examiners measured the angles using the radiograph and the smartphone application. The radiograph was obtained at a position 250 cm from the chest shooting cassette holder. The smartphone photographs were obtained at positions 50, 100, 150, 200, and 250 cm from the holder. [Results] Under all conditions, the intra-class correlation coefficient was 0.999. The correlation coefficient was 0.999 for all conditions. The mean absolute difference from the protractor was ≤0.28° for all conditions. [Conclusion] Compared with the protractor, radiograph measurements and smartphone application measurements showed high inter-rater reliability, high validity, and small error. The results indicate that radiograph and smartphone application measurements can be used as criteria of validity in angle measurements, supporting the legitimacy of high-quality previous studies that used radiograph measurements as a criterion for validity.
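The two statistics reported above can be illustrated with a short sketch. The code below computes an intra-class correlation coefficient (ICC(2,1), two-way random effects, absolute agreement, single rater — a common choice for inter-rater reliability, though the abstract does not state which ICC form was used) and the mean absolute difference from reference angles. The angle values are invented for illustration and are not the study's data.

```python
# Hedged sketch: ICC(2,1) and mean absolute difference from reference
# protractor angles. Data below are hypothetical, not from the study.

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater."""
    n, k = len(ratings), len(ratings[0])  # n targets (angles) x k raters
    grand = sum(map(sum, ratings)) / (n * k)
    row_means = [sum(r) / k for r in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]
    # Two-way ANOVA decomposition: SS_total = SS_rows + SS_cols + SS_error
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

def mean_abs_diff(ratings, reference):
    """Mean |measurement - reference angle| over all raters and angles."""
    diffs = [abs(x, ) if False else abs(x - t)
             for row, t in zip(ratings, reference) for x in row]
    return sum(diffs) / len(diffs)

# Hypothetical measurements: 5 reference angles, 3 raters.
reference = [15.0, 40.0, 90.0, 135.0, 180.0]
ratings = [
    [15.1, 14.9, 15.0],
    [40.0, 40.2, 39.9],
    [90.1, 89.9, 90.0],
    [135.2, 134.8, 135.1],
    [179.9, 180.1, 180.0],
]
print(round(icc_2_1(ratings), 4))
print(round(mean_abs_diff(ratings, reference), 3))
```

With closely agreeing raters, as here, the ICC approaches 1 and the mean absolute difference stays well under a degree, mirroring the pattern of results reported in the abstract.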


            Author and article information

            1) Faculty of Sports Science, Kyushu Kyoritsu University: 1-8 Jiyuugaoka, Yahatanishi-ku, Kitakyushu, Fukuoka 807-8585, Japan
            2) Graduate School of Sport and Exercise Sciences, Osaka University of Health and Sport Sciences, Japan
            3) Faculty of Education, Osaka Kyoiku University, Japan
            4) Faculty of Physical Education, Osaka University of Health and Sport Sciences, Japan
            Author notes
            [* ]Corresponding author. Takenori Awatani (E-mail: awtn9831@ )
            J Phys Ther Sci
            Journal of Physical Therapy Science
            The Society of Physical Therapy Science
            21 October 2017
            October 2017
            29(10): 1869-1873
            ©2017 by the Society of Physical Therapy Science. Published by IPEC Inc.

            This is an open-access article distributed under the terms of the Creative Commons Attribution Non-Commercial No Derivatives (CC BY-NC-ND 4.0) License.

            Original Article
