Dorset: Developing and piloting a reducing parental conflict child’s voice measure
This case example is part of EIF’s ongoing work to showcase how local areas are introducing change, adapting their strategies and changing the way they work to reduce parental conflict and improve outcomes for children.
This is Dorset’s story about developing and piloting a child’s voice measure to establish if the use of a practitioner toolkit reduced the impact of parental conflict on children. It is told by Jenny Lyons, referral gateway coordinator for the Dorset contract package area, and Helen Burridge, research officer at EIF.
Our starting point
The Dorset contract package area is a group of local authorities along the south coast of England that are working together, as part of the Reducing Parental Conflict Programme, to evaluate specialist interventions for parents who are in conflict. The Dorset contract package area brought together seven local authorities with common features, including rurality, social isolation and a seasonal economy reliant on tourism. The impact of isolation and financial insecurity can put substantial strain on interparental relationships.
The local partners had developed a practitioner toolkit to help practitioners confidently recognise the impact of parental conflict and support parents experiencing conflict in their relationship. The toolkit includes resources to support effective practice, and tools to help parents experiencing parental conflict. We wanted to find out whether the support and resources provided to parents were improving outcomes for children and, in doing so, to give children the opportunity to voice their own views about how they perceive parental conflict and the impact it has on them. We set out to develop, implement and pilot a measure completed with children, to capture the voice of the child and their lived experiences before and after their parents receive support.
The action we took
We decided to use a pre/post study design to track changes over time and better understand whether the use of a practitioner toolkit reduced the impact of parental conflict on children. Since this was the first time we were developing a child voice measure, we conducted a pilot study: a smaller-scale study usually carried out before a larger one.
We wanted to use valid and reliable measurement tools, as this means they have been carefully tested by researchers to make sure they produce accurate results. Supported by EIF and our DWP regional integration lead, we identified possible measures to include using EIF's measurement review and a brief search of the literature online. As we wanted to develop a tool that was suitable for practice, it was important to select a measure that was relatively short and suitable for use with most children and young people with support from a practitioner. We decided to use a shortened version of the Children's Perception of Interparental Conflict (CPIC) scale, as we thought the full-length CPIC measure would be too burdensome. We felt that the CPIC alone did not comprehensively capture how children might be feeling in response to parental conflict, so we decided to also use a shortened version of the Security in the Interparental Subsystem (SIS) scale, which explores the child's emotions in response to parental conflict.
After we had selected our measurement tools, we conducted an online focus group with five practitioners to gather their views on the child’s voice measure. We discussed the clarity of questions, logic and flow, acceptability of topics and length. We also discussed the information that parents may require to understand and recognise the purpose of the child voice measure and the wider toolkit for their family. Practitioners were asked to consider how the measure fitted within their current practice, what they anticipated the challenges would be to embedding use into practice, and how these could be overcome. This helped to inform the introductory workshop and accompanying step-by-step guide to ensure the measure was pitched at the right level and addressed anticipated challenges.
Once we had finalised our measurement tool, we considered data protection arrangements in collaboration with our data sharing team. With help from our administration team, the survey was set up as an online form in Microsoft Forms which was really easy to use.
I enlisted early help team managers to recruit family support workers to the pilot. They attended an online introductory workshop which introduced the child voice measure, explained how to use it, and outlined their responsibilities for data collection during the pilot. The workshop also provided an opportunity to outline the different sources of support and key messages within the toolkit, for example a focus on how parents communicate and constructive communication styles.
After the workshop, the practitioners identified families who could benefit from the child voice measure and toolkit. They introduced each family to the evaluation during a regular meeting, held face-to-face or virtually as the pilot took place during varying stages of Covid-19 lockdown restrictions. Once a family indicated they wanted to take part, the family support worker introduced the child's voice pre-service measure to all children aged between nine and 17 and supported them to complete it. The follow-up child voice measure was then completed with the children after their parents accessed signposted support or received direct support from the family support worker. In total, the pre-support child's voice measure was used with seven families, and three of those also completed the post-support measure. The tool was used by additional families, but they did not provide consent for their data to be used in the research.
The family support workers attended reflective sessions every six to eight weeks during the four-month pilot, which were used to share good practice and discuss barriers. In some cases, the complex issues the families were experiencing made it difficult for the workers to use the full toolkit. The practitioners also reported high levels of domestic abuse, including coercive control, within their caseloads and this meant the toolkit was not relevant. The reflective sessions helped to keep the focus on the pilot, to gather learning throughout the four-month period and to help family support workers identify how to use the toolkit and when the toolkit could be suitable for a family.
We developed a pre/post staff survey to collect feedback from family support practitioners, including questions on the reasons for using the child’s voice measure and views on the usability of the toolkit. As with the child’s voice tool, we administered the staff survey online on Microsoft Forms. We collected data from practitioners both before and after families had received support.
The data from the pre/post-support child voice measure, staff feedback survey and a post-support parent survey were gathered within an Excel spreadsheet for analysis. It was difficult to draw conclusions due to the limited data, but some interesting findings have started to emerge. Most children found the child voice measure easy, somewhat easy, or neither easy nor hard to complete. The data showed some reduction in the impact of parental conflict in most situations. In one case, the scores increased, which may reflect a greater awareness of the conflict or a greater ease in responding to the measure.
What we achieved
We developed a practical tool for practitioners to use to gather children’s perceptions of parental conflict. So far, family support workers have provided positive feedback on the usability and usefulness of the child’s voice measure. Involving practitioners in the process of developing the child’s voice measure and gathering their feedback during the focus group and reflective practice sessions helped secure their investment in the pilot.
We have given children the opportunity to describe their views and feelings about parental conflict, which helps ensure children feel listened to. The data has helped to expand our understanding of the impact of parental conflict on children and the environment in which they live, and has highlighted the impact of the parental relationship even where one parent is no longer present and therefore not participating in support. Practitioners have used results from the child voice measure to provide more informed support. Practitioners also continue to report that the tool has really helped them communicate the impact of parental conflict on children to parents, and that this has led to a motivation for change.
We felt that we needed 'buy-in' to promote widescale use of the toolkit, and that to achieve this it needed to become part of our practice with most families, to increase familiarity and use. As a result we revised the toolkit, turning it into a family communication toolkit and child voice tool, which we introduced to all family intervention workers in Devon, one of the local areas in the contract package area, in November 2021.
What worked well and what we would recommend
With EIF’s support we were able to develop our idea from an initial proposal to a pilot. Developing an evaluation plan from the outset was useful as it helped clarify the methods that were going to be used in the evaluation, and we updated the plan throughout the evaluation. The plan helped to keep up momentum, which was particularly important considering the tight timeline and the impact Covid-19 had on our working practices and demands on our services.
It was useful to spend time during the introductory workshop to explain the aims of the evaluation to practitioners and why we needed their buy-in and commitment.
Completing a data protection checklist in collaboration with our data sharing team allowed us to minimise the amount of personal data we were collecting. We identified that rather than collecting identifiable information such as names of children to link pre/post responses, we could link responses by a unique ID number instead. We also minimised the amount of personal data by reducing the precision of some variables, for instance by asking participants for their age group rather than date of birth.
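The ID-linking approach described above can be sketched in a few lines of Python. This is an illustrative example only: the IDs, field names and scores below are invented, not drawn from the Dorset data. It shows how pre- and post-support responses can be paired on a pseudonymous ID, with an age band instead of a date of birth, so no identifiable information needs to be stored.

```python
# Illustrative sketch of linking pre/post responses by unique ID.
# All records and values here are hypothetical examples.

# Each record holds only a pseudonymous ID, an age band and a total score.
pre_responses = [
    {"id": "CV-001", "age_band": "9-12", "score": 18},
    {"id": "CV-002", "age_band": "13-17", "score": 22},
    {"id": "CV-003", "age_band": "9-12", "score": 15},
]
post_responses = [
    {"id": "CV-001", "age_band": "9-12", "score": 12},
    {"id": "CV-003", "age_band": "9-12", "score": 16},
]

def link_pre_post(pre, post):
    """Pair pre/post records on the unique ID and compute change scores."""
    post_by_id = {record["id"]: record for record in post}
    linked = []
    for record in pre:
        follow_up = post_by_id.get(record["id"])
        if follow_up is None:
            continue  # no post-support measure was completed for this child
        linked.append({
            "id": record["id"],
            "age_band": record["age_band"],
            # A negative change indicates a reduced impact of conflict.
            "change": follow_up["score"] - record["score"],
        })
    return linked

linked_scores = link_pre_post(pre_responses, post_responses)
```

Because the ID is the only key, the spreadsheet used for analysis never needs to contain a child's name, and unmatched pre-support records (such as CV-002 above) are simply excluded from the change-score analysis.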
Families who were recruited into the evaluation by a practitioner they already knew and trusted were more likely to agree to take part and felt more confident asking questions. Care was taken to ensure all parents and children were informed about evaluation activities and the use of their personal data by providing them with a verbal description of the research project accompanied by a simple information sheet. This reinforced the message that children and families did not have to participate and could withdraw from the evaluation at any time.
The existing relationship the family support worker had with families meant they could tailor their communication style to meet the needs of individual children. This enabled children to feel more comfortable answering sensitive questions. The family support worker was also in a better position to respond if the data highlighted any safeguarding concerns, or other issues that came up.
Microsoft Forms was really easy to use and collecting data online meant that we were able to download our data straight into Excel for analysis. Administering the measure in person worked well because it provided an opportunity for practitioners to assist children to complete the measure and answer any questions children or parents had about the pilot. We have been upskilled in using Microsoft Forms and have since used the software in other pieces of work to gather feedback and conduct evaluations.
We experienced some difficulties in getting the balance right between evaluation and practice. Because families have multiple and varied needs, meaning their support requirements differ, it was difficult for us to embed a consistent approach to the pre/post measure. We sent prompts to practitioners to collect post-measure data after six weeks, but this meant it wasn't true end-line data because some families continued to receive support. Where only the pre-support measurement tools were used, this was because families had either disengaged or were now receiving social care services.
Working with four local areas meant we were able to pilot the child's voice tool with a diverse group of practitioners and a higher number of families. However, it was difficult to ensure that the work was being completed at the same pace across local areas, partly because we did not have the same level of senior buy-in everywhere, and there were specific local challenges such as service restructures and staff changes. We tried to recruit teams from other areas, but this was unsuccessful. We also struggled to engage enough teams initially, and some teams dropped out of the pilot. This was mainly due to Covid-19 impacting staffing levels, increasing demands on services and shifting priorities. It could have been overcome by increasing the length of the pilot, or by providing an introductory workshop to a higher number of workers with an expectation that some would not engage with the pilot.
We plan to continue to evaluate our practitioner toolkits using the same methodology and to use the learning from the pilot to inform workforce development. This work is helping us support more families, and build the evidence base around children’s experiences of parental conflict and effective interventions to improve parental relationships and child outcomes.
Learning will also be used to develop a child toolkit that can be used by the wider children and families workforce. The child voice measure has been adapted to be a card-based activity or a quiz format to help expand its usability for children at different developmental stages.
For more information on how to conduct a pilot evaluation, see module three of EIF’s reducing parental conflict evaluation guide for local areas.