      Interrater Reliability of mHealth App Rating Measures: Analysis of Top Depression and Smoking Cessation Apps

Research Article


          Abstract

          Background

          There are over 165,000 mHealth apps currently available to patients, but few have undergone an external quality review. Furthermore, no standardized review method exists, and little has been done to examine the consistency of the evaluation systems themselves.

          Objective

          We sought to determine which measures for evaluating the quality of mHealth apps have the greatest interrater reliability.

          Methods

          We identified 22 measures for evaluating the quality of apps from the literature. A panel of 6 reviewers reviewed the top 10 depression apps and 10 smoking cessation apps from the Apple iTunes App Store on these measures. Krippendorff’s alpha was calculated for each of the measures and reported by app category and in aggregate.
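
For readers who want to reproduce this kind of analysis, the sketch below estimates Krippendorff's alpha for a single binary measure using the open-source Python krippendorff package. The 6x20 ratings matrix is hypothetical placeholder data (6 reviewers rating 10 depression plus 10 smoking cessation apps), not the study's ratings; only the panel size, app counts, and the per-category versus aggregate reporting mirror the design described above.

# A minimal sketch of the interrater-reliability computation, assuming the
# open-source `krippendorff` package (pip install krippendorff). The ratings
# below are hypothetical placeholders, not the study's data.
import numpy as np
import krippendorff

# Rows are the 6 reviewers; columns are the 20 apps (first 10 depression,
# last 10 smoking cessation). Each cell is one reviewer's binary rating of a
# single measure, e.g. "has password protection" (1 = yes, 0 = no);
# np.nan marks a missing rating.
ratings = np.array([
    [1, 0, 1, 1, np.nan, 0, 1, 1, 0, 1,  1, 1, 0, 0, 1, 1, 0, 1, 1, 0],
    [1, 0, 1, 1, 0,      0, 1, 1, 0, 1,  1, 1, 0, 0, 1, 1, 0, 1, 1, 0],
    [1, 0, 0, 1, 0,      0, 1, 1, 0, 1,  1, 0, 0, 0, 1, 1, 0, 1, 1, 1],
    [1, 1, 1, 1, 0,      0, 1, 1, 0, 1,  1, 1, 0, 1, 1, 1, 0, 1, 1, 0],
    [1, 0, 1, 1, 0,      1, 1, 1, 0, 1,  1, 1, 0, 0, 1, 0, 0, 1, 1, 0],
    [1, 0, 1, 0, 0,      0, 1, 1, 0, 1,  1, 1, 0, 0, 1, 1, 1, 1, 1, 0],
])

# Alpha in aggregate across all 20 apps, then by app category, mirroring how
# the paper reports each measure.
overall    = krippendorff.alpha(reliability_data=ratings,         level_of_measurement="nominal")
depression = krippendorff.alpha(reliability_data=ratings[:, :10], level_of_measurement="nominal")
smoking    = krippendorff.alpha(reliability_data=ratings[:, 10:], level_of_measurement="nominal")
print(f"alpha overall={overall:.2f}, depression={depression:.2f}, smoking={smoking:.2f}")

The nominal level of measurement fits yes/no ratings such as these; ordinal or interval would be the appropriate choice for measures rated on a scale.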

          Results

          The measure for interactiveness and feedback was found to have the greatest overall interrater reliability (alpha=.69). Presence of password protection (alpha=.65), whether the app was uploaded by a health care agency (alpha=.63), the number of consumer ratings (alpha=.59), and several other measures had moderate interrater reliability (alphas>.5). There was the least agreement over whether apps had errors or performance issues (alpha=.15), stated advertising policies (alpha=.16), and were easy to use (alpha=.18). There were substantial differences in the interrater reliabilities of a number of measures when they were applied to depression versus smoking cessation apps.
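
As a quick restatement of these bands, the snippet below sorts the alphas quoted in this paragraph by the authors' >.5 cutoff for moderate agreement; the dictionary contains only values already reported above.

# Groups the measures quoted above by the paper's own "moderate" cutoff
# (alpha > .5). Values are the alphas reported in this abstract.
alphas = {
    "interactiveness and feedback": 0.69,
    "password protection": 0.65,
    "uploaded by a health care agency": 0.63,
    "number of consumer ratings": 0.59,
    "ease of use": 0.18,
    "stated advertising policies": 0.16,
    "errors or performance issues": 0.15,
}
moderate = {m: a for m, a in alphas.items() if a > 0.5}
poor = {m: a for m, a in alphas.items() if a <= 0.5}
print("moderate or better:", ", ".join(sorted(moderate)))
print("poor agreement:", ", ".join(sorted(poor)))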

          Conclusions

          We found wide variation in the interrater reliability of measures used to evaluate apps, and some measures are more robust across categories of apps than others. The measures with the highest degree of interrater reliability tended to be those that involved the least rater discretion. Clinical quality measures such as effectiveness, ease of use, and performance had relatively poor interrater reliability. Subsequent research is needed to determine consistent means for evaluating the performance of apps. Patients and clinicians should consider conducting their own assessments of apps, in conjunction with evaluating information from reviews.


                Author and article information

                Journal
                JMIR mHealth and uHealth (JMIR Mhealth Uhealth; JMU)
                JMIR Publications Inc. (Toronto, Canada)
                ISSN: 2291-5222
                Published: 10 February 2016 (Jan-Mar 2016 issue)
                Volume 4, Issue 1: e15
                Affiliations
                [1] Payer+Provider Syndicate, Boston, MA, United States
                [2] Harvard Longwood Psychiatry Residency Training Program, Boston, MA, United States
                [3] Department of Psychiatry, Brigham and Women’s Hospital, Harvard Medical School, Boston, MA, United States
                [4] Department of Psychiatry, University of California Davis Medical School, Sacramento, CA, United States
                [5] Brigham and Women’s Hospital, Harvard Medical School, Boston, MA, United States
                Author notes
                Corresponding Author: Adam C Powell, powell@payerprovider.com
                Author information
                http://orcid.org/0000-0001-6519-3120
                http://orcid.org/0000-0002-5362-7937
                http://orcid.org/0000-0002-1696-3122
                http://orcid.org/0000-0001-5407-8725
                http://orcid.org/0000-0001-9026-1623
                http://orcid.org/0000-0003-2986-3937
                http://orcid.org/0000-0002-2166-0521
                Article
                Publisher ID: v4i1e15
                DOI: 10.2196/mhealth.5176
                PMCID: PMC4766362
                PMID: 26863986
                Record ID: 577c0558-86d5-41d4-a0d0-52e3747e6694
                ©Adam C Powell, John Torous, Steven Chan, Geoffrey Stephen Raynor, Erik Shwarts, Meghan Shanahan, Adam B Landman. Originally published in JMIR mHealth and uHealth (http://mhealth.jmir.org), 10.02.2016.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR mHealth and uHealth, is properly cited. The complete bibliographic information, a link to the original publication on http://mhealth.jmir.org/, as well as this copyright and license information must be included.

                History
                Received: 28 September 2015
                Revision requested: 17 October 2015
                Revised: 5 November 2015
                Accepted: 29 November 2015
                Categories
                Original Paper

                Keywords: mobile applications, mental health, evaluation studies, health apps, ratings
