Zooming in, zooming out: what the Early Years Transformation Academy tells us about system change and using evidence
EIF's assistant director of policy & practice Ben Lewing reflects on the conclusions of the independent evaluation of the Early Years Transformation Academy and invites different stakeholders involved in the design, delivery and evaluation of the Academy to share their insights.
Last month we published the independent evaluation of the Early Years Transformation Academy, marking an important step in our journey to learn how the use of evidence can support local improvements. The Academy, with its cohort of five teams of local leaders, is one of our most ambitious and intensive approaches to engaging with local areas on evidence use and system planning – so far at least.
You can read the full evaluation, along with an EIF foreword that explains how the design of the Academy was built around some key principles for supporting the use of evidence, such as understanding context, building the evidence ecosystem, and testing and learning.
The evaluation offers learning on the practical delivery of the Academy, particularly on the need for sufficient time, engagement and careful sequencing. It probes the fit between the pace of the programme and the needs of teams and individuals. It explores the strengths and areas for development in the programme content, including whether the balance was right between practical tools and challenging mindsets; whether the contextual analysis should take greater account of community factors, such as housing, transport or employment; and the need for a greater focus on community involvement in the programme design.
Crucially, on evidence-use, the evaluation reinforces the importance of a broad understanding of evidence that reaches beyond the reassuring ‘proof’ of high-quality intervention impact evaluation into the world of evidence about community needs, family experience, workforce skills, intervention implementation quality, and practitioner knowledge. Although these forms of evidence do not provide causal evidence of impact, they are critical to improving interventions and to decisions on how resources are deployed. They are also particularly important ways of understanding effectiveness in real-world, complex systems: the conditions in which impact studies are most challenging to use.
We have asked some of the key stakeholders involved in the Academy to share their personal reflections about the impact and learning from the Academy approach to ‘mobilising knowledge’ to better integrate services and improve support for families:
- Jane Lewis, who led the independent evaluation, and EIF’s Max Stanford, who commissioned it, both reflect on making sense of complexity.
- Scott Jones from Dudley’s EYTA Team reflects on the importance of context and relationships, a theme also picked up by Jo Davidson from the Staff College, who was part of the team who delivered the Academy.
- And finally, Steph Waddell, my fellow assistant director of policy & practice at EIF, considers the importance of multiple forms of evidence within systems.
This evaluation puts EIF in a much stronger position to move forward on supporting evidence-use as part of local work on the system challenges facing public services. We are already using the learning as part of our work programme, including refining our advice and support, developing clear knowledge mobilisation plans, and making evaluation a routine part of our projects. We look forward to going further still.
Ben Lewing, assistant director, policy & practice at EIF
Jane Lewis from the Centre for Evidence and Implementation led the evaluation of the Academy. She reflects on how no single perspective is sufficient to understand a system and how multiple perspectives can help in understanding how systems may perpetuate social problems and inequities.
Our evaluation highlighted that the EYTA was a rich learning opportunity with many positive impacts for the participating teams. It also reinforced that changing systems is long-term work. You need to look both widely and deeply at local systems from multiple perspectives to really understand how they are experienced by local families, and – just as importantly – how they can ‘hold in place’ social problems and inequity.
I was struck by how often local service managers, however skilled and experienced, know only parts of their local system – so that the service mapping work done through the Academy illuminated unfamiliar parts of local systems and helped people to see how complex it is for local families to navigate.
It was also striking that the systems perspective gave people new insight into interdependencies and common interests across agencies and services. The example that rings in my mind is the connection between healthy pregnancies and additional educational support needs at age 5, which is well established in evidence, but not part of the day-to-day reality for people working in either maternal health or special educational needs services.
I came away with a renewed appreciation of how complex local systems are for families and practitioners, and an even stronger motivation to understand how we can take a systems-led approach to designing local services, centred on local families and their aspirations from the start, to tackle the multiple ways in which systems hold inequity in place.
Max Stanford, EIF’s head of early education and care, was responsible for overseeing the independent evaluation of EYTA. He reflects on the challenges of building a local ‘evidence ecosystem’.
I was impressed by how well the local EYTA teams had developed shared goals and strong partnerships, and how much this helped them when Covid hit, in terms of coordination, data-sharing, adaptation of service offers and joint communication with families.
But a major insight for me was the difficulty in building a local ‘evidence ecosystem’. Areas appeared to struggle to bring together different types of evidence (from population indicators to service performance and family feedback) and understand how they could be used in combination to develop, deliver and evaluate local change. Although some of this seemed to be about gaps in analytical expertise, more could be attributed to capacity issues. For EIF, I think this speaks to the need to tailor how we support local areas on a timeline that best suits their needs.
Scott Jones, head of family solutions for Dudley council and a member of Dudley’s EYTA team, reflects on the importance of context and relationships.
As an experienced head of service leading early help, parenting and edge-of-care services, partnership working has been central to my practice for nearly 30 years. However, it took our work together in Dudley on the EYTA programme for me to truly understand the essence of systems leadership. A critical element of our progress was having a more holistic understanding of current services and outcomes, including important areas such as infant mortality, birth weight, excess weight and of course school readiness. Our video work with families who had recent experience of maternity and early years services in Dudley gave us new insights.
Although work over the past year shifted to the Covid-19 response, there remains a real connection between the leaders delivering the EYTA work in Dudley. We understand the wider needs of the population in a much more sophisticated way. We are clear about who is most likely to need help to achieve good outcomes. We know what interventions we are going to implement and what our goals are. Having the time and space to take part in the Academy has been a unique opportunity, and I know our work together will have a long-lasting legacy.
Jo Davidson, principal of the Staff College, was a member of the EYTA Design Team. She reflects on how relationships are key in public services.
Now that we have the independent evaluation, it’s been fascinating to see not only what the Academy helped people do, but also how the Academy helped people feel. There’s been a sense of energy combined with learning and reflection since the inception of the Academy, and seeing that play out in terms of huge activity, combined with the space to think and try things out, is inspiring. This happened right the way up to and through Covid, and is something that busy organisations sometimes miss, but which leaders need to create the space for, where they can.
It’s also great to see the sense of personal development and the positive relationships forged between people. In public services, relationships are all. So often we leap into superficial service change instead of really engaging with families and communities as equals about what will really work; or we fail to investigate or confront the fundamental societal issues which are creating barriers and poor outcomes; or we ignore evidence because we think we know what will work. Once we are working as part of a set of trusted relationships, all those things become much more expected, and easier to do. The combined thinking and experience of the Academy has illuminated so much and has highlighted a different model for supported change. It was a privilege to be part of it.
Steph Waddell, strategic lead for EIF’s work on knowledge mobilisation, reflects on different forms of evidence.
I’m interested in the questions the evaluation surfaces for us about how we think about evidence-use in complex systems, and the ways in which we seek to persuade our end-user audiences of the value of evidence to them. The EYTA was about system improvement, but it was also about the use of multiple forms of evidence within systems.
Moving forward, we need to be even clearer about the ways in which these things are meshed. We need to start by understanding what it is that our local audiences want to do or improve, and then support them to make use of the many sources of evidence and knowledge that can help them do this.
Yes, programme evaluation evidence is valuable, particularly if we can distil some common features of evidence-based programmes. But so is local needs data, the perspectives of local children and families, insight from local early-stage evaluation, and evidence about the implementation of new approaches or systems change, depending on the questions people want to be able to answer. It’s great that individual programme participants said they had a better understanding of the importance of some forms of evidence, but we want to move towards a place where evidence in its broadest sense is recognised as intrinsically valuable to work to improve the way systems are geared to support children and families.