Walsall: Planning and conducting an evaluation of a local reducing parental conflict intervention
This case example is part of EIF’s ongoing work to showcase how local areas are introducing change, adapting their strategies and changing the way they work to reduce parental conflict and improve outcomes for children.
This is Walsall’s story about planning and conducting an evaluation of a local reducing parental conflict intervention, as the last phase of a four-phase evaluation project. It is told by Georgina Atkins, Walsall’s parenting lead for early help, and Helen Burridge, research officer at the Early Intervention Foundation.
The information in this case study could be used to develop and conduct an evaluation of a reducing parental conflict intervention where there is opportunity to conduct pre/post/follow-up surveys with participants.
Our starting point
Walsall is a metropolitan borough located in the West Midlands. In Walsall, one in three children aged 16 or under comes from a low-income family, compared with a national average of one in five. The high and increasing level of child poverty places additional demands on our services in Walsall, including parental relationship support services.
To support families with parental conflict, we have a parenting course that was originally delivered in person but moved online due to the pandemic. As the delivery method had changed, we wanted to explore whether the course was still improving relationship and wellbeing outcomes for families – key outcomes we had identified in our theory of change and logic model.
We also wanted to gather further details on which families were accessing the course, how well the course was being implemented and what parents think are the benefits of taking part, testing assumptions that we had set out in our theory of change and logic model.
Therefore, as part of our reducing parental conflict evaluation support project with EIF, we set out to evaluate our online parenting course.
The action we took
To explore whether the course was improving relationship and wellbeing outcomes for parents, we decided to use a pre/post/follow-up study design to track changes over time. Since this was the first time we were evaluating the course, we conducted a pilot study, which is a smaller study usually conducted before a larger-scale study.
We used validated measurement tools to explore change, as these have been carefully tested by researchers to make sure they produce reliable and accurate results. We reviewed a range of validated measures using EIF’s measurement review and a search of the literature, considering their relevance, validity, reliability and practicality. From previous experience in Walsall, we have found that parents can struggle to complete longer measures, so we decided to use the Relationship Quality Index (RQI), which has six questions, and the shortened version of the Warwick-Edinburgh Mental Wellbeing Scale (SWEMWBS), which has seven questions.
In the second strand of the research, we wanted to look at how the course was being implemented through a process evaluation, which is primarily used to understand how an intervention is working and why. Specifically, we wanted to explore the characteristics of the families taking part, how well the course was being implemented, and what parents thought the benefits of taking part were.
To explore these domains, we wrote 12 survey questions. We drew on EIF’s survey template as well as questionnaires used in previous early help evaluations, and the local diversity and monitoring form. We then shared this with colleagues, including parenting practitioners, for feedback on the wording of the questions and answer options.
To collect data, we combined the RQI and SWEMWBS measurement tools with the 12 additional survey questions into a single form. With help from our administration team, the form was set up online in Microsoft Forms, which was really easy to use. We felt this would be the most efficient way to collect data, particularly as the intervention was being delivered virtually.
The survey was administered at three time-points: during the first workshop, at the end of the last workshop, and again three to six months after completion of the course.
In total, 86 parents completed the pre-course questionnaire, and 42 parents completed the post-course questionnaire. We downloaded data collected from the online form into an Excel spreadsheet.
To explore whether the course had improved relationship and wellbeing outcomes for parents, we analysed the data collected from the RQI and SWEMWBS using a statistical test. This test, called a paired-sample t-test, assesses whether the change between pre and post scores is statistically significant, meaning it is unlikely to have occurred by chance and is instead likely to be due to the intervention.
To complete the t-test, we used an online t-test calculator, entering the pre and post scores. This was easy to use because it automatically calculated the statistics and provided a summary of whether the result was significant. We used appendix D of EIF’s 10 steps to evaluation success to interpret the results.
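For readers without access to an online calculator, the same paired-sample t statistic can be computed directly. The sketch below uses only Python’s standard library; the scores are illustrative examples, not Walsall’s actual data.

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-sample t statistic for pre/post scores from the same parents."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1  # t statistic and degrees of freedom

# Illustrative RQI-style totals for 10 parents (hypothetical data)
pre  = [18, 22, 20, 15, 25, 19, 21, 17, 23, 16]
post = [24, 25, 22, 20, 28, 23, 24, 21, 27, 19]

t, df = paired_t(pre, post)
print(f"t = {t:.2f}, df = {df}")
# |t| is then compared against the critical value from a t-table:
# for df = 9 this is roughly 2.26 at the p < .05 level (two-tailed)
```

An online calculator or a statistics package performs exactly this calculation before converting the t statistic into a p-value.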
Results revealed that there were statistically significant improvements in parents’ perceptions of their relationship across all domains of relationship satisfaction. Results also revealed statistically significant improvements in some, but not all, domains of wellbeing. For the wellbeing questions where results were not statistically significant, we have hypothesised that this might be linked to the pandemic, as the questions relate to feeling positive about the future and close to others – both of which were difficult for anyone to answer in the context of global pandemic restrictions and lockdowns.
We then analysed the data collected from the 12 survey questions. We completed quantitative analysis on the closed response data and qualitative data analysis on the open text responses.
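As an illustration of the quantitative step, closed responses downloaded from the online form (for example as a CSV export) can be tallied with a short script. The column name and response options below are hypothetical, standing in for the actual survey questions.

```python
import csv
import io
from collections import Counter

# Hypothetical extract of post-course survey data; in practice the CSV
# exported from Microsoft Forms / Excel would be opened with open().
data = """parent_id,benefit_reported
1,Improved communication
2,Better understanding of conflict
3,Improved communication
4,Improved communication
"""

with io.StringIO(data) as f:
    rows = csv.DictReader(f)
    counts = Counter(row["benefit_reported"] for row in rows)

# Frequency table of closed responses, most common first
for benefit, n in counts.most_common():
    print(f"{benefit}: {n}")
```

The same tallying can of course be done with pivot tables in Excel; the point is simply that closed-response analysis reduces to counting response frequencies.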
The findings showed that the right parents were being reached by the support, as most of those attending had a specific need that they wanted to address. Parents told us that they found attending the parenting course to be a positive experience. The data also shows an overlap between what parents wanted to get out of the intervention in the pre-course survey and what they actually got out of it in the post-course survey, suggesting the course is meeting parents’ needs. A majority of parents reported that their behaviour, their knowledge of parental conflict and their relationship with their child had improved following the intervention.
We wrote the results from the analysis into an evaluation report, which was presented to members of our parental conflict steering group. The report has since been sent to our senior management and divisional management team, and we are planning to publish it online, to increase the reach of our findings to a wider group of stakeholders.
What we achieved
We successfully completed a local evaluation of our online parenting course. We were able to demonstrate using robust methods that the intervention was making a positive difference to parents and children – the primary aim of our parenting team. The good sample size meant we had enough data to start making claims about the initial effectiveness of the intervention. This has provided support for the continued roll-out of the intervention and the use of online delivery for other types of parenting interventions.
Survey responses on how well the course was being implemented were monitored frequently, and it was clear that separated parents were initially feeding back that the intervention was not fully meeting their needs. These findings were used to shape and adapt the intervention, and for practitioners to reflect on delivery, to ensure it was covering issues relevant to this group of parents. The changes we made resulted in more positive feedback from parents.
While we recognise that adapting the intervention during delivery compromised the rigour of the evaluation, we felt that as it was a new intervention, it was important to tweak the approach to ensure it was meeting the needs of all parents.
Based on the ongoing feedback from separated parents, and the fact that a majority of parents accessing the course were separated, we have since gone on to develop a parenting intervention specifically designed for separated parents. It will be interesting to compare responses to the new separated parenting intervention with the responses from this evaluation, to see whether the new approach better meets the needs of separated parents.
The data collected on the demographics of families attending the course has deepened our understanding of risk factors for parental conflict in Walsall. We were surprised to learn that three-quarters of parents attending the course had experienced at least one significant life event in the previous 12 months, such as job loss or having a new baby, which is data we had not previously collected. This has led us to think about how we could catch parents earlier in their change journeys, and we have since broadened early help assessments to include questions on significant life changes.
The evaluation project has increased awareness of the reducing parental conflict agenda. For instance, when domestic abuse services were recommissioned, there were conversations about how parental conflict fits in with the support offer. We hope that this relationship support will continue to get noticed across wider services, and that over time this will lead to more services in the areas of relationships, divorce and separation.
Lastly, the evaluation project has improved our confidence in conducting local evaluation. The project has helped to communicate a convincing purpose for evaluation, and demonstrate why evaluation in early help is so important. This has contributed to senior leaders agreeing to fund more evaluations of other early help interventions, such as play therapy for children who have witnessed domestic abuse and a conflict resolution intervention for young people. We hope this project will contribute to embedding the use of evidence into our parenting service as well as across early help more widely.
What worked well and what we would recommend
From the outset of the evaluation project, we had buy-in from senior strategic stakeholders, which helped to progress the evaluation.
During the initial planning stages it was helpful to develop an evaluation plan setting out a timeline of key evaluation tasks. This was signed off by senior stakeholders to ensure that we were given the capacity to deliver the evaluation, and we referred to the plan throughout the project to ensure we were on track.
In addition to the measures and survey questions we used, we would have liked to include qualitative methods, such as interviews with parents who attended the course. This would have provided a more in-depth understanding of their experience, but limited resources, particularly in the context of the pandemic, meant this was not possible.
The fact that outcomes were clearly specified in the parenting course theory of change and logic model made it easier to identify which outcomes to measure in the evaluation. From previous experience, we have found that some measures use language that feels unfamiliar to parents, or are too long or complex to be suitable for parents with limited literacy skills. It was therefore important to select outcome measures that were appropriate for parents attending the course.
Microsoft Forms was really easy to use and collecting data online meant that we could download our data straight into Excel for analysis. Running the survey during the intervention worked well because it provided an opportunity for practitioners to encourage parents to complete the survey and provide support if needed.
Because practitioners had been involved during the development of the theory of change and logic model, they understood how the evaluation data they were collecting fitted into the wider evaluation of the intervention, and were therefore motivated to help collect good-quality evaluation data. However, some parents were reluctant to answer questions about their relationship, which we felt was related to the fact parents had not been asked these sorts of questions before. Practitioners noted that some parents who were not comfortable answering these questions in the first session were more open to answering the questions at the end of the intervention once they had attended sessions and built trusting relationships with practitioners.
This was the fourth and final phase of our reducing parental conflict evaluation project, which has been covered in this series of case studies:
- Map the local workforce
- Write local system-level reducing parental conflict theory of change, setting out key outcomes
- a) Decide which interventions to evaluate; b) Write intervention theory of change and logic model
- a) Write evaluation plan specifying research questions and methods; b) Plan, collect, analyse and interpret evaluation data; c) Use of evaluation data.
We plan to continue to deliver and evaluate the online parenting course using the same evaluation methodology, and will use the dataset to provide a point of comparison as we go forwards with our developing parenting and relationship offer. By doing so, we are aiming to support more families, further refine the intervention, and build the evidence base on effective interventions to improve parental relationships.
For more information on how to conduct a pilot and process evaluation, see module three of EIF’s reducing parental conflict evaluation guide for local areas.