      Leveraging Implementation Science to Understand Factors Influencing Sustained Use of Mental Health Apps: a Narrative Review

Review article


          Abstract

          Mental health (MH) smartphone applications (apps), which can aid in self-management of conditions such as depression and anxiety, have demonstrated dramatic growth over the past decade. However, their effectiveness and potential for sustained use remain uncertain. This narrative review leverages implementation science theory to explore factors influencing MH app uptake. The review is guided by the integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework and discusses the role of the innovation, its recipients, context, and facilitation in influencing successful implementation of MH apps. The review highlights critical literature published between 2015 and 2020 with a focus on depression and anxiety apps. Sources were identified via PubMed, Google Scholar, and Twitter using a range of keywords pertaining to MH apps. Findings suggest that for apps to be successful, they must be advantageous over alternative tools, relatively easy to navigate, and aligned with users’ needs, skills, and resources. Significantly more attention must be paid to the complex contexts in which MH app implementation is occurring in order to refine facilitation strategies. The evidence base is still uncertain regarding the effectiveness and usability of MH apps, and much can be learned from the apps we use daily; namely, simpler is better and plans to integrate full behavioral treatments into smartphone form may be misguided. Non-traditional funding mechanisms that are nimble, responsive, and encouraging of industry partnerships will be necessary to move the course of MH app development in the right direction.


Most cited references (83)


          A typology of reviews: an analysis of 14 review types and associated methodologies.

The expansion of evidence-based practice across sectors has led to an increasing variety of review types. However, the diversity of terminology used means that the full potential of these review types may be lost amongst a confusion of indistinct and misapplied terms. The objective of this study is to provide descriptive insight into the most common types of reviews, with illustrative examples from health and health information domains. Following scoping searches, an examination was made of the vocabulary associated with the literature of review and synthesis (literary warrant). A simple analytical framework -- Search, AppraisaL, Synthesis and Analysis (SALSA) -- was used to examine the main review types. Fourteen review types and associated methodologies were analysed against the SALSA framework, illustrating the inputs and processes of each review type. A description of the key characteristics is given, together with perceived strengths and weaknesses. A limited number of review types are currently utilized within the health information domain. Few review types possess prescribed and explicit methodologies and many fall short of being mutually exclusive. Notwithstanding such limitations, this typology provides a valuable reference point for those commissioning, conducting, supporting or interpreting reviews, both within health information and the wider health care domain.

            A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project

Background: Identifying, developing, and testing implementation strategies are important goals of implementation science. However, these efforts have been complicated by the use of inconsistent language and inadequate descriptions of implementation strategies in the literature. The Expert Recommendations for Implementing Change (ERIC) study aimed to refine a published compilation of implementation strategy terms and definitions by systematically gathering input from a wide range of stakeholders with expertise in implementation science and clinical practice.
Methods: Purposive sampling was used to recruit a panel of experts in implementation and clinical practice who engaged in three rounds of a modified Delphi process to generate consensus on implementation strategies and definitions. The first and second rounds involved Web-based surveys soliciting comments on implementation strategy terms and definitions. After each round, iterative refinements were made based upon participant feedback. The third round involved a live polling and consensus process via a Web-based platform and conference call.
Results: Participants identified substantial concerns with 31% of the terms and/or definitions and suggested five additional strategies. Seventy-five percent of definitions from the originally published compilation of strategies were retained after voting. Ultimately, the expert panel reached consensus on a final compilation of 73 implementation strategies.
Conclusions: This research advances the field by improving the conceptual clarity, relevance, and comprehensiveness of implementation strategies that can be used in isolation or combination in implementation research and practice. Future phases of ERIC will focus on developing conceptually distinct categories of strategies as well as ratings for each strategy's importance and feasibility. Next, the expert panel will recommend multifaceted strategies for hypothetical yet real-world scenarios that vary by sites' endorsement of evidence-based programs and practices and the strength of contextual supports that surround the effort.
Electronic supplementary material: The online version of this article (doi:10.1186/s13012-015-0209-1) contains supplementary material, which is available to authorized users.

              Beyond Adoption: A New Framework for Theorizing and Evaluating Nonadoption, Abandonment, and Challenges to the Scale-Up, Spread, and Sustainability of Health and Care Technologies

Background: Many promising technological innovations in health and social care are characterized by nonadoption or abandonment by individuals or by failed attempts to scale up locally, spread distantly, or sustain the innovation long term at the organization or system level.
Objective: Our objective was to produce an evidence-based, theory-informed, and pragmatic framework to help predict and evaluate the success of a technology-supported health or social care program.
Methods: The study had 2 parallel components: (1) secondary research (hermeneutic systematic review) to identify key domains, and (2) empirical case studies of technology implementation to explore, test, and refine these domains. We studied 6 technology-supported programs (video outpatient consultations, global positioning system tracking for cognitive impairment, pendant alarm services, remote biomarker monitoring for heart failure, care organizing software, and integrated case management via data sharing) using longitudinal ethnography and action research for up to 3 years across more than 20 organizations. Data were collected at micro level (individual technology users), meso level (organizational processes and systems), and macro level (national policy and wider context). Analysis and synthesis were aided by sociotechnically informed theories of individual, organizational, and system change. The draft framework was shared with colleagues who were introducing or evaluating other technology-supported health or care programs and refined in response to feedback.
Results: The literature review identified 28 previous technology implementation frameworks, of which 14 had taken a dynamic systems approach (including 2 integrative reviews of previous work). Our empirical dataset consisted of over 400 hours of ethnographic observation, 165 semistructured interviews, and 200 documents. The final nonadoption, abandonment, scale-up, spread, and sustainability (NASSS) framework included questions in 7 domains: the condition or illness, the technology, the value proposition, the adopter system (comprising professional staff, patients, and lay caregivers), the organization(s), the wider (institutional and societal) context, and the interaction and mutual adaptation between all these domains over time. Our empirical case studies raised a variety of challenges across all 7 domains, each classified as simple (straightforward, predictable, few components), complicated (multiple interacting components or issues), or complex (dynamic, unpredictable, not easily disaggregated into constituent components). Programs characterized by complicatedness proved difficult but not impossible to implement. Those characterized by complexity in multiple NASSS domains rarely, if ever, became mainstreamed. The framework showed promise when applied (both prospectively and retrospectively) to other programs.
Conclusions: Subject to further empirical testing, NASSS could be applied across a range of technological innovations in health and social care. It has several potential uses: (1) to inform the design of a new technology; (2) to identify technological solutions that (perhaps despite policy or industry enthusiasm) have a limited chance of achieving large-scale, sustained adoption; (3) to plan the implementation, scale-up, or rollout of a technology program; and (4) to explain and learn from program failures.

                Author and article information

                Contributors
                samantha.connolly@va.gov
Journal
Journal of Technology in Behavioral Science (J Technol Behav Sci)
Springer International Publishing (Cham)
ISSN: 2366-5963
Published online: 7 September 2020
Pages: 1-13
Affiliations
[1] Center for Healthcare Organization and Implementation Research, VA Boston Healthcare System, Boston, MA, USA
[2] Department of Psychiatry, Harvard Medical School, Boston, MA, USA
[3] Center for Healthcare Organization and Implementation Research, Edith Nourse Rogers Memorial Veterans Hospital, Bedford, MA, USA
[4] Department of Population and Data Sciences, University of Texas Southwestern Medical Center, Dallas, TX, USA
[5] Division of Health Informatics and Implementation Science, Department of Population and Quantitative Health Sciences, University of Massachusetts Medical School, Worcester, MA, USA
[6] Department of Health Law, Policy, and Management, Boston University School of Public Health, Boston, MA, USA
                Author information
                http://orcid.org/0000-0002-1007-5626
Article
Article number: 165
DOI: 10.1007/s41347-020-00165-4
PMCID: PMC7476675
PMID: 32923580
© 2020. This is a U.S. government work and not under copyright protection in the U.S.; foreign copyright protection may apply.

                This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic.

History
Received: 30 June 2020
Revised: 10 August 2020
Accepted: 31 August 2020
                Funding
Funded by: U.S. Department of Veterans Affairs (FundRef: http://dx.doi.org/10.13039/100000738)
                Award ID: VISN 1 Career Development Award
                Categories
                Article

Keywords: smartphone, app, mental health, implementation science
