
      What we know about designing an effective improvement intervention (but too often fail to put into practice)


          Abstract

Intervening to change health system performance for the better

It is temptingly easy to treat improvement interventions as if they are drugs—technical, stable and uninfluenced by the environment in which they work. Doing so makes life so much easier for everyone. It allows improvement practitioners to plan their work with a high degree of certainty, funders to be confident that they know what they are buying and evaluators to focus on what really matters—whether or not ‘it’ works. But of course most people know that life is not as simple as that. Experienced improvers have long recognised that interventions—the specific tools and activities introduced into a healthcare system with the aim of changing its performance for the better1—flex and morph. Clever improvers watch and describe how this happens. Even cleverer improvers plan and actively manage the process in a way that optimises the impact of the improvement initiative. The challenge is that while most improvers (the authors included) appreciate the importance of carefully designing an improvement intervention, they (we) rarely do so in a sufficiently clever way.

In this article, we describe our attempts as an experienced team of practitioners, improvers, commissioners and evaluators to design an effective intervention to improve the safety of people living in care homes in England. We highlight how the design of the intervention, as described in the original grant proposal, changed significantly throughout the initiative. We outline how the changes that were made resulted in a more effective intervention, but also how our failure to design a better intervention from the start reduced the overall impact of the project. Drawing on the rapidly expanding literature in the field and our own experience, we reflect on what we would do differently if we could have our time again.

A practical case study—an initiative to improve the safety of people living in care homes

A growing number of vulnerable older people are living in care homes and are at increased risk of preventable harm. We carried out a safety improvement programme with a linked participatory multimethod evaluation2 in care homes in the south east of England. Ninety homes were recruited in four separate cohorts over a 2-year period. Our aim was to reduce the prevalence of three of the most common safety events in the sector—falls, pressure ulcers and urinary tract infections—and thereby to reduce unnecessary attendances at emergency departments and admissions to hospital.

In the original proposal submitted to the funding body, we described a multifaceted intervention comprising three main elements:

1. The measurement and benchmarking of (i) the prevalence of the target safety incidents, using a nationally designed tool called the NHS Safety Thermometer,3 and (ii) rates of emergency department attendances and hospital admissions, using routinely collected data.

2. Training in quality improvement methods, provided initially by a team of NHS improvement advisors and then, using a ‘train the trainer’ model, by practitioners working with or in the care homes.

3. The use of a specially adapted version of the Manchester Patient Safety Framework4 (Marshall M, de Silva D, Cruickshank L, et al. Understanding the safety culture of care homes: insights from the adaptation of a health service safety culture assessment tool for use in the care home sector (submitted to BMJ Qual Saf, August 2016)), a formative assessment tool which provides frontline teams with insights into safety culture.
The intervention was underpinned by a strong emphasis on support and shared learning, using communities of practice and online resources facilitated by the improvement team. The programme theory hypothesised that the three main elements of the intervention (benchmarking, learning improvement skills and cultural awareness) would reduce the prevalence of safety events, that this would lead to a reduction in emergency department attendances and hospital admissions, and that both outcomes would reduce system costs as well as improving the quality of care for residents.

The intervention was co-designed by improvement researchers in the evaluation team, the improvement team in the local government body responsible for commissioning care home services and a senior manager of one of the participating care homes. The design was influenced by a combination of theory, published empirical evidence and the personal knowledge and experience of the commissioners and care home manager. We built in a 6-month preparatory period at the start of the programme, prior to implementing the intervention with the first cohort of care homes. This period was used to recruit staff, establish the project infrastructure and build relationships between the care homes and the improvement and evaluation teams. Only when the programme formally started did we begin to expose some of the deficiencies in the planned intervention. Table 1 describes the different components of the intervention, whether each was part of the original plan or introduced at a later stage, and, based on our participatory evaluation, how each was implemented and the extent to which it was used.

Table 1: The original intervention and how it evolved

1. NHS Safety Thermometer (an NHS-designed and owned online tool for collecting process and outcomes data)
Original or added later: Original
How implemented: Implemented with first cohort and offered to all of second cohort, then replaced by the Safety Cross and Monthly Mapping tools (see below).
Extent of use: 66% of first cohort homes tried the Safety Thermometer; about one-third input data.

2. Active involvement of staff, residents and relatives in sharing data and co-creating improvement solutions
Original or added later: Original
How implemented: Staff were initially slow to share data but became enthusiastic as the project progressed. Residents and relatives were hardly actively involved at all, but project details and data were displayed on public notice boards in most homes.
Extent of use: Fewer than 10% of first cohort homes shared Safety Thermometer data. Eighty per cent of homes used the Safety Cross and displayed it for staff, residents and families to see. Sixty per cent displayed graphs from the Monthly Mapping tool.

3. Training for care home staff in improvement methodologies
Original or added later: Original
How implemented: Quality improvement training was provided initially by NHS staff, then adapted and provided by the improvement team.
Extent of use: All homes took part in training. In the first cohort this was chiefly home managers, but in subsequent cohorts some senior carers also attended.

4. Participants able to deliver the training to peers (train-the-trainer)
Original or added later: Original
How implemented: A formal train-the-trainer model was not implemented, though local advocates (‘champions’) were encouraged to roll out learning to others.
Extent of use: Champions were found to work well in spreading learning informally.

5. Intervention toolkit containing a compendium of evidence-based interventions for each of the domains of the Safety Thermometer
Original or added later: Original
How implemented: A toolkit with worksheets and information sheets was developed.
Extent of use: All homes received a hard copy and an online version. It is unclear how much these were used by the first cohort; the toolkit was then dropped as the Safety Thermometer was replaced by the Safety Cross.

6. Safety culture assessed using the MaPSaF tool at three time points (before, during and after PROSPER), using the tool to understand and address barriers to change
Original or added later: Original
How implemented: MaPSaF was revised and tested in different ways with various cohorts.
Extent of use: Use was not prioritised by the improvement team or by the homes; a small number of homes actively used it. Progressively more significant changes were made to the tool for each cohort to make it more relevant.

7. Communities of practice
Original or added later: Original
How implemented: Three community of practice events were held throughout the project.
Extent of use: Between a half and two-thirds of homes attended the events.

8. Improvement tools and case studies uploaded to a resource tool for peer learning
Original or added later: Original
How implemented: A knowledge hub was set up and documents were uploaded periodically, mainly copies of material sent by email.
Extent of use: 10% of homes signed up and none of them posted information.

9. Ongoing support from the improvement team, including meetings, visits and telephone conferences
Original or added later: Original
How implemented: Facilitators visited homes with varied frequency. During the intensive phase, some homes were visited monthly and others every 3–4 months. Group telephone conferences were not used.
Extent of use: Some homes received regular support and others did not. Some homes reported that they had no contact with their allocated improvement adviser for 6 months.

10. ‘Safety Cross’ for displaying information about monthly incidents (replaced the Safety Thermometer, see above)
Original or added later: Addition
How implemented: Used from cohort two homes onwards, then also rolled out to cohort one.
Extent of use: About 80% of homes reported using it.

11. ‘Monthly Mapping tool’ using graphs with monthly data to track changes over time and compare averages
Original or added later: Addition
How implemented: All homes were invited to provide data about the monthly incidence of harms. From cohort three onwards, homes were given access to an online tool.
Extent of use: About 60% of homes provided some data. One-quarter used the tool regularly without prompting.

12. Provision of resources such as information posters, certificates of training, mirrors to view pressure ulcers and other tangible resources
Original or added later: Addition
How implemented: Resources were developed ad hoc; homes were offered tools during community of practice events and visits.
Extent of use: Variable uptake depending on focus. Resources appeared to be highly appreciated.

13. Provision of additional training beyond improvement methods courses, such as training in infection control and pressure ulcers
Original or added later: Addition
How implemented: Twenty-six training sessions were run.
Extent of use: About 50% of homes participated.

14. Coordination with partner organisations in the NHS
Original or added later: Addition
How implemented: Varied by geographical area.
Extent of use: Varied by geographical area.

15. Monthly newsletter
Original or added later: Addition
How implemented: Sent monthly to participating homes.
Extent of use: Sixty per cent of home managers reported reading it.

Key: Green = implemented as planned; Amber = partly implemented as planned; Red = not implemented as planned. MaPSaF, Manchester Patient Safety Framework.
The evaluation found that four of the nine original components of the intervention were not implemented as planned and two were only partially implemented as planned. Only three of the nine were implemented in line with the original proposal. Five of the six new intervention components, designed and implemented while the initiative was taking place, were fully implemented. Qualitative evaluative data, collected using interviews, surveys and observations, demonstrated changes in the attitudes of frontline staff to safety and changes in their working practices. However, quantitative data suggested only small and variable changes of questionable statistical significance in the prevalence of safety incidents, and no impact on the background rising rates of emergency department attendances and hospital admissions.

Success or failure?

Perhaps we should not be too hard on ourselves. On the surface at least, our intervention was more sophisticated than that seen in most improvement projects.5 The multifaceted intervention had complementary measurement, educational and culture-change elements and was co-designed by a wide group of stakeholders, including a practitioner and experienced improvement science academics. We based the design on a reasonable programme theory and an explicit logic model. We recognised the need to adapt off-the-shelf tools to the local context and to build in a preparatory period prior to formally evaluating the intervention. And we purposefully chose a participatory and formative evaluation model to support a feedback cycle as the initiative progressed.

As a project team, we thought that we had designed the original intervention thoughtfully and carefully, but the findings of our evaluation suggested that we could have done a lot better. Reflecting towards the end of the programme, we considered a number of possible explanations: we did not put enough time and effort into designing the intervention; we designed a sound intervention which was not implemented sufficiently well, or which was implemented without an adequate understanding of the context; and our expectations that an intervention at such an early stage of development would have a significant impact were naïve. We then revisited the literature to examine these hypotheses.

What the literature suggests we should have done

There is no shortage of increasingly sophisticated theory, empirical evidence and learned commentary that could have guided our design decisions. Much of the thinking about interventions is relatively new; a state-of-the-art review of improvement published in the Lancet more than 15 years ago made no specific reference to the ways in which interventions morph when applied in practice.6 In contrast, more recent international guidance on designing, conducting and writing up improvement projects highlights the importance of describing how improvement interventions change.7 In brief, a number of themes relating to the design of effective interventions are emerging in the literature.
First, the importance of using theory (‘a chain of reasoning’) to optimise the design and effectiveness of interventions is highlighted.8 A commonsense rather than an overly academic approach to theory is advocated as a way of reducing the risk of ‘magical thinking’, which encourages improvers to use interventions that look superficially attractive but for which the mechanisms of action are unclear.8 9 Alongside the use of theory, there is growing interest in the application of ‘design thinking’ as a strategy for ensuring that the problem has been clearly identified and as a way of addressing complex problems in rapidly changing environments.10

Second, the importance of having an explicit method, such as the Institute for Healthcare Improvement's Model for Improvement using Plan-Do-Study-Act cycles, is described, together with the importance of understanding how to use such methods to their full potential.11

Third, there is a growing emphasis on the extent to which improvement interventions are social as well as technical in nature, and on how their effectiveness is a consequence of a complex interaction between people, organisational structures and processes.12 13

Fourth, the literature describes how what people do (intervention), how they do it (implementation) and the wider environment (context) are interdependent, and some suggest that the traditional differentiation between this classic triad is no longer helpful.14

Fifth, there is a growing consensus that improvement efforts are being evaluated too early in their development and as a consequence are being judged unfairly as ineffective.15 16 Instead, there are calls for interventions to be categorised according to the ‘degree of belief’ that they will work,16 with this belief becoming stronger as a project progresses. Interventions in the early ‘innovation’ phase should be evaluated using different methods from those in the later ‘testing’ or ‘spread’ phases. They may also have a different intent; for example, changes in behaviour may be seen as ‘success’ before measurable changes in outcome are achieved.

Sixth, drawing on the expanding field of knowledge mobilisation,17 18 experts are calling for a more active process of co-design of improvement initiatives involving service users, practitioners and improvers, as well as academics, with all of these stakeholders contributing to participatory models of evaluation.19

What would we do differently?

Having reviewed the literature, we came to the conclusion that each of the post hoc hypotheses was a reasonable explanation for results that, while not uncommon in the field of improvement, were nevertheless disappointing. In future, we will put more effort into designing the intervention from the very start. We will think through the design issues in sufficient detail not only to persuade the funder of the project but also to persuade ourselves that it will work in practice. We will describe a programme theory in greater detail, based on a better understanding of the contextual factors which could affect the feasibility and effectiveness of the initiative, and we will use design thinking to rigorously frame the problem from the start. We will work through, in more detail and more systematically, how current thinking about intervention design applies to our project. We will build in a similar or even longer preparatory period and will use that period to test and refine the intervention.
We will not rely on a single senior care home manager to provide a practitioner view for the original proposal; instead, we will seek a wide range of views from frontline staff and from care home residents in an inclusive and iterative way. We will not assume that the intervention can be implemented as described in the proposal, and we will be more sensitive to the resource constraints under which the improvement team and the care homes are operating. If we do all of this, the outcome will almost certainly be better.

Final reflections

Improvement initiatives are sometimes planned on the hard high ground, but they are put into effect in the swampy lowlands.20 As we are more than aware, frontline practice is messy. And as we have described in this paper, it is never possible to do things perfectly, and good improvers are always learning. But as the improvement movement matures, we are getting to the stage where we could and should be doing better. Improvement needs to be seen as a professional rather than an amateur sport. The importance of understanding that improvement interventions are not like drugs or medical devices, and that flexibility needs to be built into their design and delivery, is incontestable. But it is no longer acceptable to use the need for flexibility as an excuse for a lack of thought and planning. As improvement becomes more rigorous, perhaps improvement practitioners will be able to plan their work with a higher degree of certainty, funders will be more confident that they know what they are buying, and evaluators will be able to focus on whether and how ‘it’ works.

          Related collections

Most cited references


          Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide.

          Without a complete published description of interventions, clinicians and patients cannot reliably implement interventions that are shown to be useful, and other researchers cannot replicate or build on research findings. The quality of description of interventions in publications, however, is remarkably poor. To improve the completeness of reporting, and ultimately the replicability, of interventions, an international group of experts and stakeholders developed the Template for Intervention Description and Replication (TIDieR) checklist and guide. The process involved a literature review for relevant checklists and research, a Delphi survey of an international panel of experts to guide item selection, and a face to face panel meeting. The resultant 12 item TIDieR checklist (brief name, why, what (materials), what (procedure), who provided, how, where, when and how much, tailoring, modifications, how well (planned), how well (actual)) is an extension of the CONSORT 2010 statement (item 5) and the SPIRIT 2013 statement (item 11). While the emphasis of the checklist is on trials, the guidance is intended to apply across all evaluative study designs. This paper presents the TIDieR checklist and guide, with an explanation and elaboration for each item, and examples of good reporting. The TIDieR checklist and guide should improve the reporting of interventions and make it easier for authors to structure accounts of their interventions, reviewers and editors to assess the descriptions, and readers to use the information.

            The core of ‘design thinking’ and its application

            Kees Dorst (2011)

              The science of improvement.

              D Berwick (2008)

                Author and article information

Journal
BMJ Quality & Safety (BMJ Qual Saf)
BMJ Publishing Group (BMA House, Tavistock Square, London, WC1H 9JR)
ISSN: 2044-5415, 2044-5423
Published online first: 16 December 2016; issue: July 2017
Volume 26, Issue 7, pages 578–582
                Affiliations
                [1 ]Department of Primary Care and Population Health, University College London , London, UK
                [2 ]Evidence Centre , London, UK
                [3 ]Essex County Council , Chelmsford, UK
                [4 ]Research Department of Practice and Policy, UCL School of Pharmacy , London, UK
                Author notes
[Correspondence to] Professor Martin Marshall, Department of Primary Care and Population Health, University College London, London E20 1AS, UK; martin.marshall@ucl.ac.uk
Article
Article ID: bmjqs-2016-006143
DOI: 10.1136/bmjqs-2016-006143
PMCID: 5502247
PMID: 27986901
                Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

                This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/

History
26 September 2016; 23 October 2016; 28 October 2016
                Funding
Funded by: Health Foundation (http://dx.doi.org/10.13039/501100000724)
                Categories
                Viewpoint

                Public health
healthcare quality improvement, implementation science, nursing homes, quality improvement methodologies
