Communication
Complex programme evaluation of a ‘new care model’ vanguard: a shared commitment to quality improvement in an integrated health and care context
  Sally Fowler Davis1,2, Sebastian Hinde3, Steven Ariss4
  1College of Health, Wellbeing and Life Sciences, Sheffield Hallam University, Sheffield, UK
  2CCA Care Group, Sheffield Teaching Hospitals NHS Foundation Trust, Sheffield, UK
  3Centre for Health Economics, University of York, Heslington, UK
  4ScHARR, University of Sheffield, Sheffield, UK
  Correspondence to Dr Sally Fowler Davis; s.fowler-davis{at}shu.ac.uk

Abstract

NHS vanguards, under pressure to perform, required better contracting and data management arrangements with evaluation teams to ensure that integrated service outcomes could be reported effectively. This communication reflects on the experience of evaluating an NHS vanguard and suggests how academic teams can improve capacity for complex programme evaluation of rapid improvements in integrated services. This should be based on a shared commitment to data collection and management, together with robust knowledge exchange processes that enable systems change and sustainability. The identifying features of the particular site have been withheld.

  • evaluation
  • NHS vanguard
  • healthcare
  • integrated care

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


Strengths and limitations of this study

  • The identification of methodological challenges in complex programme evaluation.

  • Learning presented from a single vanguard site.

  • Recommendations for improved outcomes and capacity for system-level evaluation.

Background

The closer integration of healthcare and social care has been a policy goal of successive UK governments for over 40 years, governments which, in common with most advanced Western countries, face the challenge of an ageing population with a range of health and care needs. In 2014/15 NHS England created funding for 50 new care model ‘vanguard’ sites intended as visions of best practice in the NHS.1 The sites were selected on the premise that a number of areas in England would spearhead the NHS Five Year Forward View2 and build local quality improvement leadership capacity. While the funding for the 3-year projects came from central government, the planning and implementation of each project was inspired by local priorities. Stakeholders from the health economy contributed to the plans, with local decision makers and practitioners aiming to work together to achieve system-level improvements.3 An important objective of the programme was to design new care models that could be replicated quickly across the NHS. Local implementation4 was based on the idea that health communities would know and understand the opportunities for health improvement and prevention5 and make a radical step change in systems re-design.6 In most cases, the funding and accountability arrangements and separate regulatory regimes focused on the performance of individual organisations rather than the system as a whole.7

The vanguard planned three separate service initiatives as ‘rapid improvement cycles’ over 18 months (the evaluation team was not involved in the first stage). The complexity of the change was compounded by organisations collaborating without the benefit of shared governance arrangements. There was a lack of clarity and accountability in decision-making processes, with different tiers of management and some detachment at board level.8 Organisational gatekeeping of service-delivery teams and short-term contracts for practitioners made planning for operational delivery very problematic, and local clinical leadership was weakened by significantly reduced and delayed funding from NHS England. These factors contributed to slower than planned progress towards integrated services operating in primary care.

The aim of the evaluation was to generate an iterative programme theory to explain the vanguard improvement activity across a health and care system, and to report systematically on system change methods and cost savings. In most cases (nationally), the evaluation methods included complex, theory-led process evaluations9–11 built around the commissioning processes and the multidisciplinary teams. Methods also included some health economic evaluation to retrospectively assess the cost of delivery and the value of the service, measured against previous service provision. This paper discusses the retrospective learning from the evaluation process at one NHS vanguard site.

Vanguard evaluation

Evaluation processes were variously negotiated by each vanguard site in line with NHS England guidance on evaluation design, with the ultimate goal of comparing results and findings across all sites.12 Programme teams contracted with evaluation teams to enable the generation of quasi-scientific correlations and the testing of generative causal assumptions, to establish the effectiveness of interventions in a ‘real-world’ context. Some £60 million was allocated to the evaluations of the 50 vanguards, reporting in March 2018, by which time NHS England expected individual vanguards to be sustainable without further national funding for transformation.13 Vanguards were asked to resist the pressure to provide positive signs of impact at the expense of learning,14 but the urgency of the demand for results grew as the programme progressed.

Complex programme evaluation included economic evaluation but also sought to identify a range of active ingredients and disruptive ‘innovations’. Local imperatives were identified across the health and care economy, for example, re-designing community services (nursing and allied health professionals) to work closely with general practice and achieve better patient outcomes. Vanguards came under pressure to report measurable improvements by generating organisational case studies of the changes in practice.15 The co-design of the evaluation was therefore an important element of the vanguard, enabling access to systematically collected and collated data. This involved describing and explaining the complex processes and the effects of changes within a primary care system.

Evaluation processes

The evaluation was registered as a service evaluation with the research office of the participating Healthcare Trust.

Utilisation-focused methods16 aimed to meet the demand for ‘social-constructivist’ approaches and reflected the needs of implementation processes in healthcare. To evaluate effectively, there is a need for a full understanding of evaluation’s nature, purposes and concepts,17 and for a working relationship with, and understanding of, the priorities and needs for data and knowledge within the healthcare provider group. To this end, several qualitative data sets were collected and collated between October 2016 and November 2017, capturing the views and values of those involved in planning and delivering the integrated community service model. The aim was to develop, test and refine a programme theory that supported implementation,18 allowing managers to identify who the primary end users of the evaluation findings might be, what evidence they required and how this could be formed into a sustained value proposition across the system.

The requirement to implement a local evaluation was a condition of funding the vanguard. The evaluation team were brought together because of their experience and willingness to work with a health system and to collaborate in developing evaluation objectives for the vanguard. Key principles of the evaluation16 19 included a commitment to the usefulness of evaluation evidence16 and the development of effective, trusting relationships with key stakeholders.19 It can take time for service providers to shift towards collaborative working and to find equilibrium on the trust/control nexus (at individual and organisational levels20 21). Similar findings have emerged from other NHS England national innovation programmes, demonstrating the time required to develop effective working relationships in complex evaluation situations (see the NHS England healthcare technologies test bed programme).22

In this case, the relationship between the vanguard and the evaluation team did not develop as hoped. While every attempt was made to access staff, patients and all available data, as is usually the case with implementation evaluation, there was limited capacity to use the findings in planning vanguard activity. Rapid improvement cycles were planned without using the interim evaluation report data and without sufficient notice to coordinate evaluation findings with decision-making requirements. The evaluation team employed an ‘embedded’ evaluator to access and present data, but the approach achieved limited success, partly due to a lack of organisational capacity to generate patient outcome data and organise new working processes. There were limited mechanisms for the evaluation team and the vanguard team to meet in order to change contractual arrangements, to prioritise data collection or to continuously share local knowledge that could feed into the analysis and reporting of service outcomes.

Difficulties with data

Evaluation involves a commitment, shared between partners, to the normalisation of data collection, visualisation and analysis. The vanguard, working closely to the specification of NHS England, sought to meet the data requirements of the national programme, demonstrating rapid, large-scale changes in process-performance indicators at a system level, for example, a reduction in attendances at emergency departments. This came at the expense of data collection and analysis that could be used for operational planning. Key stakeholders were unable to agree on a set of outcome metrics that best reflected population health and fitted the programme theories of change. A failure to discriminate between the different levels of data needed for a service transformation can result in a constant state of ‘flux’ and extreme change; in this case, it made implementation too difficult and contributed to the premature closure of interventions.

Qualitative data provided an important early insight into the adoption of new processes and systems at service level; for example, one member of the ‘Vanguard Delivery Board’ contributed through in-depth interviews:

‘We’ve got a cross-organisational group that comes together. So we do share all of those metrics. But a lot of them they’re quite complicated. …So one of the ones that were shared with us was around… the percentage of beds where people are in them who could potentially be somewhere else. But the definition that’s used wouldn’t be the definition, for me getting that headline data is great, but once you start querying that data you realise that your understanding of what that means isn’t actually what’s being collected. … So it’s that understanding of each other’s organisation, and what those metrics mean, not actually making assumptions based on what you think things mean’. (Vanguard Stakeholder).

Data collection and management needed to be strategically focused on consistent interventions that could be evaluated effectively. In this particular case, the lack of capacity in frontline teams to identify and consistently collect population outcomes data meant that vanguard metrics did not reflect the additional benefit of the integrated service to patients. In many cases the metrics that were being collected were not fit for purpose and failed to show how health outcomes were being achieved.

The critical challenge of building capacity to collect and collate data, and to use it analytically to inform commissioning decisions, was central to delivering future services. Data needed to be accessible to stakeholders, paper-free at the point of care and connected to other services and systems.23 However, many vanguards have been slow to collect and collate anything other than service-level data. Population health outcomes remain elusive, in spite of the original commitment to showing the impact that changes would have on patients, staff and the wider population.

Representing the return on investment

A mixed-methods economic evaluation aligned to the national requirements started with the broad remit of exploring the costs and health-related impact of the new model of care, compared with current practice. However, in common with a number of other vanguards,24 challenges in accessing meaningful patient outcome data within a time-limited period, together with repeated rapid re-designs, significantly hampered the analysis.

The economic evaluation considered the cost of the vanguard programme alongside a time series analysis of secondary care (hospital) activity (eg, emergency bed days, length of stay and admissions). It demonstrated that there was no observed impact, negative or positive, that could be directly associated with the service re-design over the time period analysed. This insufficient evidence of a return on investment was the only element of the evaluation that the programme team used to inform ongoing commissioning decisions, unfortunately leading to the discontinuation of the integrated team. The inability to offset the cost of service provision, and the absence of evidence of improvement in patient health, meant that the vanguard was unable to continue beyond the programme period.
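The communication does not report the model specification behind this time series analysis. As a purely illustrative sketch, a segmented (interrupted time series) regression of the kind often applied to monthly hospital activity might look like the following; the data, launch date and variable names are all hypothetical assumptions, not the vanguard's actual analysis:

```python
# Illustrative sketch only: a segmented (interrupted time series) regression
# of the kind commonly used to test whether a service re-design shifted
# hospital activity. All data and variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_months = 36          # 18 months before and 18 after the assumed launch
launch = 18            # index of the first post-intervention month

df = pd.DataFrame({
    "time": np.arange(n_months),                           # months since series start
    "post": (np.arange(n_months) >= launch).astype(int),   # 1 from launch onwards
})
df["time_since"] = np.maximum(0, df["time"] - launch)       # months since launch
# Simulated monthly emergency admissions: mild secular trend plus noise,
# with no intervention effect built in
df["admissions"] = 500 + 1.5 * df["time"] + rng.normal(0, 20, n_months)

# 'post' estimates the level change at launch; 'time_since' the slope change
model = smf.ols("admissions ~ time + post + time_since", data=df).fit()
print(model.summary().tables[1])
# With no simulated effect, both intervention terms should be near zero and
# non-significant, mirroring the 'no observed impact' finding reported above.
```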

The economic evaluation sought a comparator using a synthetic comparator25 (a synthetically weighted sample area), assessing secondary care costs in an attempt to show short-term benefit. The intervention was defined as integrated community services, and the effect of the integrated team needed to be disentangled from other common causes of variation, such as winter influenza or workforce changes. A good understanding of the system’s influences is critical to programme evaluations,26 but this requires considerable embedded knowledge and understanding to be shared across the programme and evaluation teams.
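The synthetic comparator method itself is described in reference 25; the sketch below only illustrates the underlying idea, choosing weights for donor areas so that their combined pre-intervention activity tracks the vanguard area, with every figure invented for the example rather than drawn from the actual analysis:

```python
# Illustrative sketch of constructing a synthetic comparator: weight donor
# areas so their combined pre-intervention activity tracks the vanguard
# area, then compare post-intervention trajectories. All data are invented.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
pre, post = 18, 18
donors = rng.normal(100, 10, size=(5, pre + post)).cumsum(axis=1)  # 5 donor areas
treated = donors[:3].mean(axis=0) + rng.normal(0, 2, pre + post)   # 'vanguard' area

def pre_period_gap(w):
    """Squared error between treated and weighted donors, pre-launch only."""
    return np.sum((treated[:pre] - w @ donors[:, :pre]) ** 2)

res = minimize(
    pre_period_gap,
    x0=np.full(5, 0.2),
    bounds=[(0, 1)] * 5,                                        # non-negative weights
    constraints={"type": "eq", "fun": lambda w: w.sum() - 1},   # weights sum to 1
)
synthetic = res.x @ donors
# The post-launch gap between treated and synthetic series is the estimated
# effect of the re-design; near zero here, as no effect was simulated
print("mean post-launch gap:", (treated[pre:] - synthetic[pre:]).mean())
```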

The true cost of the intervention is an important element of the economic analysis.26 In many of the vanguard programmes, the funding of the service was made up of a combination of central funding and locally provided in-kind provision, and the implications of this are important for the evaluation. For example, the redeployment of staff to the new programme is typically very challenging to quantify, even in the short term. Other factors included the incremental cost of the new service, the additional national funding and, critically, the ability of the programme management team to understand and confirm the costs. Funding the programme in the long term, with many of the in-kind services provided alongside existing activities, may not be sustainable on a permanent basis.
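A small, hypothetical worked example may clarify why in-kind provision matters when confirming costs. None of the figures below come from the vanguard; they simply show how a return-on-investment calculation based only on the visible programme budget understates the true incremental cost:

```python
# Hypothetical worked example: the figures are invented, but illustrate the
# gap between the cost visible in a programme budget and the true
# incremental cost once in-kind provision is counted.
central_funding = 1_200_000      # national transformation funding (GBP/year)
seconded_staff = 450_000         # salaries still paid by partner organisations
shared_estates = 80_000          # office space provided in kind
displaced_activity = 150_000     # existing services backfilled elsewhere

budgeted_cost = central_funding
true_incremental_cost = (central_funding + seconded_staff
                         + shared_estates + displaced_activity)

print(f"cost visible in programme budget: £{budgeted_cost:,}")
print(f"true incremental cost:            £{true_incremental_cost:,}")
# A return-on-investment claim based only on the budgeted cost would
# overstate value for money by the in-kind share (~36% of total cost here).
```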

Workforce challenges

Short-term workforce changes, made through secondments and fixed-term appointments, go to the heart of whether what the programme achieves can be sustained. The National Audit Office has recently recognised that there were ‘missed opportunities’13 for the required depth and scale of transformation across the system, particularly in relation to delivery that achieved economic sustainability and full value for money. Service outcomes related to existing staff in short-term posts, and variation in hospital activity, were unlikely to be good indicators of the benefits achieved through integrated team practice or of long-term patient health and well-being.

While ideally any evaluation would incorporate a lifetime consideration of the health of the patient and other relevant social outcomes,27 such time-intensive research was clearly not possible in the vanguard. One insight was fed back reflecting the way that individual practitioners approached team practices:

"… I think it’s quite difficult for individual organisations to let go of control. So whilst I think at the moment we’ve got people working in an integrated office, so out of one office, I wouldn’t yet say we’ve managed to get an integrated team…we’re on that journey, and we are working towards becoming one team. But culturally and how everybody works, and how all the different organisations work, and what that looks like is quite difficult. (Vanguard Stakeholder)

Improving capacity for complex programme evaluation

Taking this knowledge and experience into account, this reflection identifies four initial areas for improvement in the planning of academic evaluation, with the purpose of improving reporting on policy-driven transformation programmes.

Increase access to integrated services

Evaluation teams require specific access to managers and the interdisciplinary workforce.28 Consensus on general practitioners’ views and perceptions of the systems change is required to identify the variation in choices and priorities for integrated working.20 Interdisciplinary working remains under-developed in primary care, and evaluation could highlight good practice, for example, by enabling the most effective local improvements, based on those designed to provide rapid access.29 Clinical leadership is often under considerable operational pressure to demonstrate the success of integrating professional practices.30 A commitment to allowing patient-facing teams to share experience and express priorities for integration is therefore a core evaluation requirement. The use of qualitative data to present ‘telling cases’ is critical to show how systems leadership has led to greater integration.

Develop contractual arrangements

Evaluations designed to inform innovations in service delivery need pre-established stages and clear reporting requirements. While evaluations can be rapid-cycle and feedback can be informal in nature, there is a requirement to maintain a timetable of activities within a relatively stable service delivery model, to allow for setting up data collection processes and for analysing and interpreting these data. The evaluation team is often able to become an additional resource through the sharing of research evidence and comparative experience from other health contexts. This model of evaluation practice needs to be introduced and contracted carefully, in such a way as to make clear the purpose and value of the partnership, of site visits and of observations.31 Evaluation planning should include opportunities for organisational development through engaging community and professional stakeholders and through formative and summative evaluation.23

Economic evaluation

Evaluation teams require programme leaders to co-design the model of health economics, recognising not only the return on investment but also the value of the learning and leadership within the system. The increasing value placed on social justice in economic terms is a significant test of the local commitment to the cost of, and return on, sustained organisational learning.32 Shared understandings of the metrics by which population health improvements are assessed are now critical; they serve to challenge the assumption that secondary care metrics, such as emergency admission data, are satisfactory. The design of the economic evaluation needs to reflect the original values associated with the shared quality improvement goal, which in this case had three facets: improving care, managing demand and reducing hospital admissions. The attempt to identify value and attribute costs at systems level is required before integrated care services can be sustainably commissioned.29

Building capacity for evidence-based change

Complex evaluation seeks to support decision-making for services in ‘practice-based’ commissioning,33 and NHS England supported vanguards in investigating their concerns about the level of unplanned admissions. A range of interventions could be effective in reducing these,34 with a view to re-designing care and promoting health improvement. Routinely collected metrics may be used to assess the quality and effectiveness of care provision, and the choice of metric needs careful consideration in relation to quality and cost impact. Vanguard evaluation enables an evidence-based approach to improvement, but just as health professionals need a full understanding of the conditions they treat, academics undertaking evaluation need as full an understanding of the process as possible.17 Engagement with the particular health system, and a commitment to share findings with stakeholders, requires time and capacity to achieve the best outcomes for selected patient populations.35

Conclusion

Complex programme evaluation was a requirement of each NHS vanguard site, designed relative to the local improvements that were planned within services and across health and care systems. An academic team was recruited to increase capacity and insight and to report findings of a local systems transformation. Improved evaluation processes may be needed to showcase the value of the investment in ‘new ways of working’ and to sustain system outputs. Better evaluation outcomes would be achieved with (a) increased access to frontline services and the process of integration, (b) contractual processes that enable evaluation teams to share interim findings and engage with complex dilemmas across the system, (c) clarification of a range of quality outcome metrics to inform an economic evaluation, thus helping commissioning to resist the considerable pressure to focus on short-term cost savings and (d) capacity building associated with the relevant research evidence to support local planning. National evaluation is currently being undertaken to identify the sustained changes that have taken place.

Acknowledgments

NIHR CLAHRC Yorkshire and Humber (www.clahrc-yh.nihr.ac.uk) funded the open access publication.

References

Footnotes

  • Twitter @SallyFD11, @ariss_s

  • Contributors SFD drafted the communication, with contributions from SH and SA on the content, which was based on vanguard evaluation experience. All authors contributed to the final version and approved the submission.

  • Funding The evaluation research was funded by NHS England via the NHS vanguard site (details of site withheld) and the evaluation report was published separately in conjunction with the NHS vanguard programme. This communication reflects on some of the findings from this work.

  • Disclaimer The views expressed are those of the author(s), and not necessarily those of the NIHR or the Department of Health and Social Care.

  • Competing interests None declared.

  • Patient consent for publication Not required.

  • Ethics approval The evaluation was approved by the university ethics committee.

  • Provenance and peer review Not commissioned; externally peer reviewed.