There is increasing interest in bringing researchers, service providers and policymakers together in partnerships that seek to improve patient outcomes through the conduct and application of applied health research.[1] In England, this has been promoted by the national funder for health and care research in the form of collaborations that seek to facilitate the use of research knowledge by health organizations and their participation in its production.[2]
But understanding how this can be done, and learning from the experience of those who have tried to make an impact in this area, is often given little attention: ‘… evidence is lacking … on the impact … particularly in relation to the knowledge mobilisation processes and practices adopted’.[3]
Why is this so? ‘Research’ is often described as ‘evidence’, but ‘evidence’ is itself a contested term. Within health care, effectiveness research (does X work better than Y?) and its associated evidence hierarchy continue to dominate but are also increasingly challenged. When the ‘intervention’ is complex and interacts with the context in which it is intended to operate, ‘evidence-based medicine’ may be less applicable, although this also depends on the paradigm of those who are considering evidence. The influence of professional training, especially for clinicians, can lead to challenges in accepting alternative views of evidence. Viewing context as ‘a process rather than a place’[4] is a new concept for those whose experience of research has been in controlling out context in order to test effectiveness.
Framing research evidence as being about both what you do (X or Y?) and how you do it is helpful in considering what is meant by evidence, along with the increasing emphasis on process evaluations alongside intervention studies to help understand the role of context and other variables.[5] Yet evidence about ‘how’, typically drawing on qualitative research, appears to remain less visible, viewed by researchers as an add-on or perceived to lack the same opportunities for peer-reviewed publication available to effectiveness research.[6]
It is also questionable whether research about ‘how’ actually gets used in practice or whether it is instead generating academic research that is itself difficult to apply. Academics are increasingly attempting to ‘push’ research results into practice through the development of (supposedly) innovative dissemination methods such as toolkits, videos, etc.[7] Focus on research impact places increasing emphasis on this aspect of research, although this may be contributing to research waste.[8] Viewing non-academics as ‘evidence users’ appears common but may not be helpful, as it reinforces the ‘knowing/doing’ gap.
Implementation research is subject to similar ‘push’ approaches, based on the assumption that it will ‘increase the rate at which research findings are implemented into practice’.[2] Much implementation research is descriptive, however, with models criticized as ‘rudimentary and implicit forms of theory, often reducing complex relationships to prescriptive checklists or stages’.[9] Increased emphasis on the use of theory in implementation science may well increase its rigour, but may not make it more applicable in practice. Research funding and academic infrastructure are not supportive of the long-term development of such research, leading to calls for the ‘research enterprise’ in implementation science to be ‘redesigned’.[10] Despite ‘push’ being the predominant approach among the research community, it is not leading to ‘evidence’ being used in practice.
Few practitioners or organizations successfully ‘pull’ evidence from those who develop it (academics): ‘most health and care organisations aim to base decisions on the best available evidence, but accessing and interpreting the right evidence at the right time is hard’.[11] Even if researchers were to make the evidence available in a timely manner, and in an appropriate format, formal research evidence is only one type of information used in decision-making. Managers also ‘value examples and experience of others, as well as local information and intelligence’.[11] Despite attempts by research funders to be more responsive to health care and service priorities, the timescale to get research funded and then carried out frustrates this aim: ‘having good enough evidence at the right time trumps perfect research which arrives too late for decision makers to use’.[11]
Those funding research may need to encourage interim findings which are still robust before study end, although this will challenge existing methods and approaches and involve working in the research ‘middle ground’.[12] Another perspective on ‘pull’ is provided by the developing literature on the absorptive capacity of organizations, which calls for improved ‘coordination capacity’ if evidence from research is to be used in practice,[13] although this remains largely an academic approach rather than something that can be enacted in practice.
It is argued that co-producing research would be helpful in producing timely evidence. Co-production with decision-makers is more likely to inform subsequent decisions. There is also a human rights rationale for co-production with the public and service users,[14] but there remain structural challenges in implementing this and, importantly, ‘… the experiential knowledge of service users is rarely afforded equal value to that of scientific/expert knowledge’.[15]
So what can be done despite the structural and funding challenges? I propose some practical steps that can be taken, recognizing, however, that messy reality[15] means these cannot be ‘solutions’:
Have more conversations and interactions with a range of stakeholders outside academia.[15] Academics need to ‘get out more’, and there is great value in shadowing, informal (i.e. non-research) observation and building links. A better understanding of, for example, where and how the data researchers are analysing are generated can be transformative, as researchers can then see first hand the priorities of those generating them.
Have more conversations with other academic disciplines and get out of ‘disciplinary silos’. Funders and researchers rarely draw on learning from different fields, nor is learning shared between disciplines and professions. Reviews of knowledge mobilization approaches in health care have concluded that there is much to learn from other disciplines, specifically management and organization studies.[13] There is a need for support for early career researchers ‘through diverse, cross-disciplinary career pathways’,[12] currently lacking at the institutional level.
Do something together. The most effective collaboration comes when people from different backgrounds work together on something with a shared objective, although this will inevitably involve some compromise on both sides.
Make the most of the research funding that we do have. We have a moral obligation to ensure that research funds invested are not wasted, even if we believe that the current system requires reform. We should do more to ensure that funded research meets practice priorities and challenges, considers implementability from the start and is, as far as possible, co-produced with those who will use it. Through peer review and membership of funding bodies, even individuals can make a difference here.
Stop wasting resources on ever more sophisticated ways to ‘push’ research findings into practice. Basic good practice is often omitted: asking those who might use evidence how they access information is a simple (and usually ignored) approach, as is using existing professional networks. We have much to learn from marketing and communications approaches, yet can be slow to recognize the value of working with communications professionals. Tailored approaches are more likely to be effective: ‘… researchers need to go to where their audience is, using many platforms’.[11]
We should be cautious about recommending more research on whether such actions make any difference. We need more understanding of what has worked, more learning from others and a more critical approach to the way we generate, select, apply and communicate evidence. We need to get what we already know into practice.