
How local authorities approach evaluation: Lessons from Camden and Wandsworth

Published: 4 Sep 2018

Tom McBride introduces two new case studies, highlighting how local authorities approach evaluation, and how they can start to tackle the challenges that arise.

Our mission is to ensure effective early intervention is available to all children and young people at risk of poor outcomes, and we see generating and using good quality evidence of the effectiveness of services as a crucial part of this. We therefore want to support a step-change in the quantity and quality of evaluation conducted in the UK.

Part of the solution must be building capacity for evaluation locally. It is important that those involved in overseeing or delivering early intervention services locally have the confidence and tools they need to generate their own evidence or commission evaluation of the services they are delivering. This can be difficult to find time for, up against the pressures of service delivery, but those working locally have a crucial role to play as active participants in building the evidence for early intervention in the UK as a whole.

At EIF we are working to support local capacity to generate useful evidence, and will be continuing to offer advice and opportunities to partner with us on this crucial area in the months ahead. In 2017, we launched ‘Early intervention into action’, a set of seven case studies of local authorities at different points in the evaluation cycle, as well as our five ‘rules of thumb’ for planning good quality evaluations. And earlier this year we published our ‘Six common pitfalls’ guidance for evaluators, which highlights the quality issues we see most often in the evaluation of programmes and interventions.

To add to this, we are keen to understand how local authorities approach the task of evaluating their early intervention and related services, and the barriers and challenges they run into along the way. We recently spent time with two local authorities who are grappling with the question of how to understand the effectiveness of their services.

The resulting case studies provide a useful insight into the issues local authorities face in evaluating early intervention services, and specific ideas on how to address some of the challenges which arise. Despite the many differences between the two cases, what unites them is a commitment to using evidence and evaluation to improve services for children and families.

Camden invited us to look at their approach to evaluating the effectiveness of their Family Group Conferencing (FGC) offer. Designed to support families to take the lead in decision-making about the support they need, in many ways FGC is a ‘classic’ discrete intervention, which in theory means it would be amenable to standard evaluation techniques. Much of the focus in Camden is on understanding how to define a set of outcome measures for FGC, given the diversity of needs that families have at the point they enter the programme; how to measure and attribute change, when there are a relatively small number of self-selecting families involved; and what is realistic to expect from a reasonably low-intensity intervention. We had the privilege of meeting families who had participated in FGC, and gained valuable insight into what the process had meant to them.

The Wandsworth case is completely different. We were invited to look at THRIVE Wandsworth, their new early help strategy designed to provide a universal front door and to build strength and resilience through a whole-family approach, ultimately reducing pressure and spend on children’s social care. Given the range of services covered and the ambitions around culture change incorporated in the strategy, evaluating impact is complex. During our visit, we discussed a range of issues including how to define success beyond reducing spend on children’s social care, the timeframe over which it would be reasonable to measure change, and how to measure the impact of the strategy on ‘softer’ outcomes, such as closer working across services and a more shared ethos.

We are extremely grateful to Camden and Wandsworth for participating in this process, and for allowing us to share the details of our feedback to them publicly. We are interested in producing further case studies of local authorities who are trying to measure impact; if you are interested in working with us, please do get in touch.

About the author

Tom McBride

Tom is director of evidence at EIF.