Editorial - (2008) Volume 16, Issue 6
Foundation Professor of Primary Care, School of Health and Social Care, University of Lincoln, UK
Although the actions of a particular clinician with an individual patient in the consultation are important for a successful outcome for that service user, quality improvement is rarely due to individual effort alone. Instead, it is usually the sum of the actions of individuals linked through their professional roles, the teams and organisations in which they work, and the wider context of health care. Together these constitute the system of care; the focus therefore has to be on developing the system so that it supports individual clinical decisions at the front line of health care.[1] The series of editorial and discussion papers in this themed issue on ‘Organisational and Educational Interventions for Quality Improvement’ tackles how this might best be achieved. They address which organisational interventions improve care, why education underpins their effectiveness, how the organisational context provides the glue that links them, and how the wider policy context fuels change.
Wilson defines organisational interventions as ‘an attempt to improve the quality or cost effectiveness of care by changing who delivers care, how care is organised, or where care is provided’. In his editorial he identifies which components of these interventions work: in chronic disease management, role redesign, structured care, and computerised decision support and prompting systems are key to producing an effect, whereas simply reallocating tasks is ineffective; in non-scheduled primary care, role redesign as part of the development of new services is no worse than traditional general practice care.[2]
Learning is invariably a prerequisite for organisational change because, for change to occur, healthcare staff, patients and carers need to learn to do things differently. Educational interventions and organisational change therefore often coexist as components of a complex intervention. Wensing argues that combinations of organisational change and educational interventions need to be examined together.[3]
The internal organisational context provides the basis for organisational learning and capacity for change. This is the focus of Baeza and colleagues, who draw on case studies of five primary care trusts. They argue that primary care organisations need three key features to implement service improvements successfully: change leaders distributed throughout the organisation, a coherent change strategy, and good working relationships between managers and clinical professional groups.[4]
Middleton observes how healthcare policy is a key external driver of change. He argues that many innovations in health care are not based on scientific evidence; instead, they are introduced from a theoretical or policy perspective which, as he observes, is ‘driven by public expectation, government need and changing clinical perspectives’. He takes the ‘Improving Access to Psychological Therapies (IAPT)’ programme in the United Kingdom as an example.[5] For those involved in such initiatives, they are seen as a way of bringing about radical system change. However, the belief that they have succeeded is often an act of faith; where evaluation is considered at all, it is an afterthought, or it is conducted so poorly that it does not tell us whether the change has led to more effective, efficient or safer care.
Blake provides another example, which looks at those working in health services.[6] She distils her experience of organisational change to consider the health of health workers through ‘wellness initiatives’. She concludes that effective implementation requires a change in organisational culture, achieved through a combination of education, behaviour change interventions, needs-based facilities and services, and strategies for developing supportive and health-promoting work environments.

Disease management (so-called ‘vertical’) programmes focusing on single diseases, such as heart disease, diabetes or chronic obstructive pulmonary disease in developed countries or AIDS in developing countries, aim to improve systems of care for individual diseases; but in doing so they can divert resources from enhancing primary care (‘horizontal’ systems) for the wider population.[7] Thomas and colleagues discuss how horizontal and vertical integration might be achieved through the mechanism of practice-based commissioning.[8] Another example of organisational change which aims to integrate vertical and horizontal systems has been the policy to introduce community matrons. Although an early evaluation did not show reductions in hospital admissions,[9] qualitative evidence presented by Brown and colleagues suggests that there may be other benefits that are important for patients and carers.[10]
Most organisational interventions are complex, involving two or more components which act independently or interdependently. Complex or multifaceted interventions are more likely to show a positive effect, partly because they are more likely to overcome barriers to change.[11] Educational interventions directed at changing knowledge, beliefs or behaviours in clinicians, patients or both are examples of complex organisational interventions. Complex interventions need to be designed to succeed as well as evaluated to see whether, and to what extent, they work. The design needs to include a mixture of theory, modelling, evaluation and implementation. Although other frameworks exist,[12] the Medical Research Council framework for the design and evaluation of complex interventions is arguably the best known and most widely used.[13]
This framework has been used to develop many studies of complex interventions. Experience of using and applying it over almost two decades has meant that it has evolved from a linear to a more iterative process.[14] For an innovation in healthcare delivery to become securely established, it has to become ‘normalised’ into everyday routine practice.[15] This involves a number of processes, described by Carl May as the Normalization Process Model. These include understanding the change in practice in terms of the interaction between people and practice (‘interactional workability’), the relationship to existing knowledge and relationships (‘relational integration’), the new working patterns needed (‘skillset workability’) and the effect on the organisation (‘contextual integration’).[16]
May’s model elegantly provides insights into how the ‘black box’ of the intervention can be better understood. However, this is often done retrospectively, using mixed methods such as questionnaires, individual and group interviews or ethnographic methods to look post hoc at how a complex intervention worked within a trial.[17,18] Less well understood or researched are effective methods for modelling an intervention. A lack of effect in trials of complex interventions is more likely when insufficient attention is paid to the modelling phase.[19] Modelling can sometimes lead to complex interventions being abandoned.[20] The ‘normalisation’ model, although very useful for thinking about which aspects of the intervention should be considered to improve the chances of success, does not tell you how these can be modelled.
Some studies have employed traditional methods such as surveys, qualitative research and pilot studies for modelling, but such methods are often static;[21,22] although they can provide important information about how to improve the effectiveness of a complex intervention, they do not directly improve care in themselves. Other study types, such as action research, can produce dynamic improvements in the processes or systems of care.[23] A handful of complex intervention modelling studies refer indirectly to the use of quality improvement methods, such as reflection[24] or process mapping,[19] but surprisingly few explicitly use quality improvement methods such as improvement teams or multi-organisation collaboratives, tools for describing processes (flowcharts, process maps, cause-effect diagrams), tools for collecting and analysing data (statistical process control charts), tools for redesign (critical care pathways) or rapid-cycle experimentation and change (plan-do-study-act) techniques.[25] In general practice, where the aim of a complex intervention study is to diffuse an innovation into a number of small practice units with different organisational, cultural and contextual characteristics, the use of quality improvement collaboratives to model interventions in a number of settings appears particularly apposite. These methods seem well placed to show how normalisation might be achieved and to derive an intervention capable of achieving it.
There are a number of possible reasons why quality improvement methods have been overlooked: they are often not themselves well evaluated;[26] they have only recently entered the research literature, and so are perhaps less familiar to traditional researchers; and systematic reviews of controlled trials of quality improvement collaboratives, although generally positive, report only moderate effect sizes overall,[27] which depend critically on factors such as team organisation and on internal and external support for change.[28] Despite these potential barriers, quality improvement methods may provide a valuable resource for researchers contemplating the design and evaluation of complex organisational interventions. They offer an exceptional means of generating novel designs[29] as well as of refining and improving them to maximise the likelihood of effective implementation within a trial, and of translation and spread beyond it.