

Change wisely – why we must get evaluation right to make best use of resources


Never has it been more important for us to use NHS resources wisely. The COVID-19 pandemic continues to exert pressure throughout the health and care system, on top of an already overstretched service. Tackling these demands is a balancing act – knowing what to change, what should stay the same, what new services are needed, and what could and should be stopped to deliver a more efficient, patient-centred service.

Innovation, by its nature, carries some risk – we can’t be sure in advance that something will work as we intended until we put it into practice. But we can’t let that prevent us from developing new ways of working that will help us meet growing demand.

That’s where effective evaluation comes in. Monitoring and measuring the impact of what we do allows us to make informed decisions when delivering change, but how we do it matters, particularly when it comes to complex transformation.

Meaningful measurement

The core purpose of an evaluation is to understand the value or worth of the intervention you are making – essentially, to determine whether it works. Practically speaking, the focus is on understanding how well the activities have been implemented (what ‘process’ has been followed) and whether they have made a difference (what has been the outcome or impact).

The starting point is understanding what it is you want to achieve and setting up metrics that allow you to measure whether the change you are making is having the desired impact. This should tell you whether, and to what degree, the change is working and the contributing factors that are having the most impact – good or bad – on the outcomes. Evaluation investment should be commensurate with the scale of the project and should only measure meaningful data attributable to the interventions.

In developing your approach, ‘how’ you will evaluate is as important as ‘what’ and ‘why’. At Arden & GEM, we are increasingly advocating formative evaluation of programmes, which is a much more agile approach to assessing, learning from and refining projects than traditional techniques. As part of the independent evaluation of the Global Digital Exemplar (GDE) programme, led by the University of Edinburgh, we were tasked with capturing data and emerging insights, which were fed back into the programme as it progressed. This gave NHS England and NHS Improvement the opportunity to learn, tweak and optimise how the GDE programme was delivered to benefit GDE sites, ‘fast followers’ and, latterly, ‘digital aspirants’. Adopting this approach allows organisations to be much more responsive to what the data is showing, rather than waiting until the end of a pilot or programme to assess its impact.

Continuous learning

Formative evaluation builds learning into your programme, encouraging regular reflection and adjustment. This means progressing with what works, understanding why the elements that aren’t working are falling short, and identifying what needs to change. This may simply require a tweak here and there, or a targeted change in a key area, rather than going back to the drawing board or living with sub-optimal outcomes.

For example, if early signs show your pilot is working well in only four out of six areas, analysing the data will help pinpoint the characteristics that make the difference. Sometimes this will be about softer elements such as the individuals involved and the mix of skills they bring to the programme rather than processes and systems – and some elements are easier to replicate than others. But knowledge is essential in determining whether you can make the necessary changes to achieve the outcomes you seek.

Taking action

If we want to make best use of resources, we need to be prepared to fail fast – and to pivot where we can, to achieve success. Instead of waiting until the end of a pilot to assess its impact, the formative approach gives us the flexibility to change tack where needed and put an early stop to aspects that aren’t delivering the outcomes expected. It also enables you to start to catalogue the key ingredients of success to inform replication and scale.

But this approach is more of a culture than a discipline. For us to truly benefit from this agile, responsive approach to evaluation, staff need to be allowed to make mistakes, have the confidence to make changes and share their learning. That way we can all work together to meet the growing demands in health and social care.

This blog was originally written for NHS Confederation and can be read here.


Author: Sally Eason

Sally is a transformation partner working with providers and commissioners across the NHS and local authorities to deliver programmes of change. Sally has 25 years’ experience of using different methodologies to address challenges across local health and social care economies. Her recent projects include working with providers and commissioners to design and support a programme enabling mental health patients to return to their local area, with recurrent annual savings of over £5 million.