Manchester MST was implemented in 2014 in response to growing concern from Manchester council about the number of children going into care. The number was growing – and the direction of travel suggested this would continue. The council was also concerned about the poor outcomes for children who were in the care system. This was further reinforced by a particularly critical Ofsted report. The council considered a number of evidence-based interventions and found that MST fitted their needs most closely. While selecting a programme, the council also wanted to ensure that it was having the anticipated and desired impact, and so commissioned the internal Public Intelligence team to conduct an evaluation. While the provider (Action for Children) and the national MST programme both collected information, it was felt that this was too qualitative and could easily be biased, given the incentives of both parties.
Part of our Early intervention into action series of case studies on innovation and evaluation
Manchester had too many children in care and the direction of travel was going the wrong way. The councillors and senior council leaders started to focus on this around 2010/11. Their concerns were reinforced by an Ofsted inspection in 2014 which found that too many children were going into care, and that children in care were having particularly poor outcomes, even when compared to other children in care rather than only to children not in care. This became a strategic priority, both because of the poor outcomes and because of the increasing amount of funding going into supporting these families and young people.
The council considered investing in additional resources in the existing system (eg more social workers) but felt that there was a need for something different, with a different methodology and skill-set in the staff and a different approach to working with the young people. Officers looked at a number of interventions including MST, and felt that MST was the best fit for the area of concern that they were trying to address.
MST is an intensive family and community-based intervention for children and young people aged 11-17 who are at risk of out-of-home placement, either in care or custody, due to their offending or severe behaviour problems. The service works primarily with children who are living in a home environment with a primary caregiver. While children in a long-term out-of-home placement are not eligible for the programme, the service does work with children in care if an imminent return home is planned. In these cases, the role of MST is to support a successful return.
MST is a manualised programme, with a duration of each intervention of usually around four to five months. It is run across Manchester, and children receive the programme once an ‘at risk of care’ panel decides that this is the most appropriate intervention from a number of interventions available locally. It is commissioned and run by the local authority. The contract ran initially for two years, from April 2014 to March 2016, and has been extended.
What worked well?
MST has established itself as a core part of the resources that can be used to help children and young people avoid care. It occupies a specific and necessary space in the wider array of services that can be brought to bear when a child is at the edge of care or has just gone into care. Having it as one of the options has also brought clarity to the types of challenges that children, young people and families face, as having multiple options helps to guide social workers and others to be more specific about what type of interventions they are looking for in each individual case.
What is hard or challenging?
Attendance at the steering group is starting to thin, as partners think the programme is going well and the evaluation is positive, and therefore that they don’t need to come along to talk about it. So the programme team is thinking about how to revive the group and make it more engaging.
Because MST is manualised, practitioners can only deliver what the programme specifies, rather than being part of a wider design process, although they do want to interact more with the steering group. MST has been established in the UK for some time, so it is not something new, and it has been possible to recruit good-quality staff who have stayed with the programme. Ups and downs are caused by the usual staffing issues: sickness, turnover, and so on.
A final challenge arises when scrutiny committees question the programme’s costs.
How are these challenges being overcome or addressed?
The programme team are looking at how to bring strategic questions to the table at the steering group to give it more purpose and focus. The staffing issue is something that sits beyond their reach and remit.
What are the key lessons?
When implementing something new it is powerful to locate it in the wider system. This does sometimes lead to the need for new infrastructure, such as the ‘at risk of care’ panel that now helps decide which is the best intervention for a family.
About the evaluation
A ‘before and after’ method is used to compare clients’ outcomes during a period immediately before their MST intervention with an equivalent post-intervention period of the same length. This method has a number of limitations. First, ethical and logistical issues prevent accurate identification of a control group, which means it is not possible to say with any degree of certainty what clients’ outcomes would have been had they not participated in the intervention. This analysis, therefore, cannot address the issue of causality – ie whether observed changes in clients’ behaviour occur as a result of the MST intervention or whether they are due to other factors. Without a control group, the share of the improvement in outcomes that can be directly attributed to MST is difficult to quantify accurately. However, MST providers require that no other intervention services work with the children at the same time, which supports the case for attributing much of the observed improvement to MST. The assumption used in the previous evaluation – that MST contributes 50% of the improvement in outcomes – therefore seems a reasonable one to continue to use. The aim of this analysis, then, is to evaluate the extent to which the aims of the MST programme are being met, without attempting to attribute causality. The interim evaluation is designed to provide some initial insights into how clients’ outcomes have changed over the two-year period: for some children this will cover (approximately) a 16-month window (six months pre-intervention, four months of intervention and six months post-intervention), while for others it will be up to 28 months. A cost-benefit analysis is also provided in the interim evaluation.
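As an illustration, the before-and-after comparison with the 50% attribution assumption described above can be sketched as follows. The function name and all figures here are hypothetical, not taken from the evaluation; the real analysis worked from matched administrative records.

```python
# Illustrative sketch of the 'before and after' method with a fixed
# attribution share. All names and numbers here are hypothetical.

def attributable_change(pre_count, post_count, attribution=0.5):
    """Change in an outcome measure between equivalent pre- and
    post-intervention windows, with a fixed share attributed to MST
    (50% by the evaluation's stated assumption)."""
    raw_change = pre_count - post_count  # positive = improvement
    return raw_change * attribution

# Hypothetical example: police call-outs to the home in the six months
# before and the six months after a four-month MST intervention.
pre_callouts = 12
post_callouts = 4
print(attributable_change(pre_callouts, post_callouts))  # 4.0
```

The 50% figure is not estimated from the data; it is carried over from the previous evaluation as a stated assumption, which is why it appears as a plain default parameter rather than a fitted value.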
What were the conditions of the evaluation?
About 18 months after MST was implemented, the Public Intelligence team was commissioned by the council to conduct an evaluation against three key areas: children going into care, educational outcomes (including attendance) and interaction with the police, including call-outs to homes. The evaluation made use of all 86 referrals made to MST by then. The 42 young people who had come onto the programme were the sample and the 44 remaining children were the comparison group, although it should be noted that those not accepted onto the programme are likely to be systematically different from those who were accepted, limiting the conclusions that can be drawn when comparing outcomes for the two groups.
No qualitative data was used in the evaluation as this is being collected in two ways already: the provider, Action for Children, is collecting qualitative feedback from families and young people and the national MST service is collecting data from therapists.
What resources did the evaluation require?
During the first evaluation, it took a senior team member two to three weeks to do all the data-matching and analysis, as the process was about matching data from school records, social care data and police data. As this is done using date of birth and name, and these are often wrong or missing, it required almost line-by-line matching. However, now that the original 86 children have been matched, the work is much easier, as it is only about matching new children to their data in other systems.
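The name-and-date-of-birth matching described above can be sketched roughly as below. The field names, records and helper functions are hypothetical; the real work joined school, social care and police systems and needed near line-by-line checking precisely because these fields were often wrong or missing.

```python
# A minimal sketch, under assumed field names, of matching records
# across systems on name + date of birth. Records here are invented.

def normalise(name):
    """Lower-case a name and collapse whitespace so trivial formatting
    differences (case, stray spaces) don't block a match."""
    return " ".join(name.lower().split()) if name else ""

def match(record_a, record_b):
    """Treat two records as the same child only when the normalised
    names agree and a date of birth is present and identical in both."""
    return (
        normalise(record_a.get("name")) == normalise(record_b.get("name"))
        and record_a.get("dob") is not None
        and record_a.get("dob") == record_b.get("dob")
    )

school_record = {"name": "Jane  Smith", "dob": "2002-03-14"}
police_record = {"name": "jane smith", "dob": "2002-03-14"}
print(match(school_record, police_record))  # True
```

A rule this strict is why wrong or missing fields forced manual, line-by-line review in the first round: any discrepancy drops the automatic match, and a human has to decide whether the records really refer to the same child.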
There is some appetite to expand the measures that are being looked at to include things like being not in employment, education or training (NEET), particularly given the age range of the young people involved and their general reluctance to go back into mainstream schooling.
What changes or outcomes were observed?
- Reduction in children going into care
- Reduction in police interaction
What is hard or challenging about conducting an evaluation?
- It was time-consuming to do the data matching between systems.
- It is also hard to find education outcomes for children post-16, as there are multiple routes they can take and there is no one central data system that can be accessed to match the cohort of children in MST with their educational status, participation and outcomes.
- It is hard to prove a counterfactual – ie whether these children would have had the same outcomes if they had not received MST but some other less-resource-intensive, less-expensive programme.
How are these challenges overcome or addressed?
- The Public Intelligence team dedicated resource to the work and were able to do the matching.
- The team have not yet determined how best to respond to the post-16 education data challenge.
- The evaluation explicitly used as a comparator group children who were referred to MST but not taken on. This allows for some level of understanding of what happens if a child or family has a different intervention and/or no intervention (though, as noted above, these young people are likely to be systematically different to those who were accepted on the programme).
What are the key lessons about conducting evaluation?
Nothing specific: the evaluation was relatively straightforward in design – it just took time to execute, given the data-matching requirements. If the sample size had been significantly larger, this could have been a very big barrier, but as the sample size was relatively small and the number of data sources limited, it was manageable within internal resources.
What effect did the evaluation or its results have?
The evaluation has proved useful in maintaining political leaders’ commitment to the investment and reassuring service leads that MST is value for money, even if it feels expensive at a per-child level.