
Evidence and innovation: How does the hackathon approach find new solutions to social problems?

Published: 19 Jul 2018

Lucy Brims reflects on the hackathon approach to generating new ideas to support children and young people, and how innovation and evidence can work together to ensure new interventions stand the best chance of being effective.

I recently went to my first hackathon, feeling a little nervous about an event that combines the words ‘hack’ and ‘marathon’. For those who have heard of it, the term may conjure up images of tech folk engaged in 24-hour software development sprints, powered by pizza and energy drinks.

Mine was not quite like that. This hackathon, organised by London Ventures, brought together around 60 people who worked across sectors in the children and families sphere. The aim was to develop innovative solutions to key challenges facing children and young people. For a researcher from a steadfastly evidence-led charity like EIF, it was a fascinating opportunity to consider where good ideas for early intervention come from, and the various ways in which new interventions are developed.

Our day started with the organisers presenting an overview of relevant challenges, facts and figures, on topics ranging from gang violence to the high numbers of children entering local authority care. Next, participants were divided into competing groups. With colourful post-it notes and flipcharts aplenty, we worked together to share ideas, knowledge and experience in a feverish attempt to develop an innovative solution to our particular challenge: that we are not making the most of local community resources for children and families, which means local authorities are responding to high numbers of referrals at all threshold levels.

All the groups then presented their best ideas in a three-minute pitch to a panel of judges. The whole thing was over in three and a half hours. Winners were declared on the day – unfortunately, without the $1 million prize that was awarded at one tech hackathon. After the event, all the various solutions were taken away for further refining and development. Some of these will be piloted and ultimately launched.

It was impossible not to contrast this approach with the more evidence-led approach of an organisation like EIF, which focuses on learning from what has and hasn’t worked in the past. Our natural preference would probably be to immerse ourselves in the evidence for several months before any meaningful development occurred. If the aim is to prevent substance misuse in young people, say, we might start by asking what sorts of interventions have succeeded in the past. What age groups have they targeted? What qualifications do the practitioners have? What kinds of knowledge and skills have been cultivated? We start with these kinds of questions because we know that such characteristics matter to an intervention’s effectiveness. Like the hackathon judges, we would also consider other factors, such as how easy a new intervention would be to implement, and its likely cost-effectiveness.

So the hackathon approach raised some interesting questions for me.

What is the relationship between innovation and evidence?

There seems to be a tension between ‘starting with the scientific evidence’ and the approach promoted at the hackathon, based on the principles of ‘thinking outside the box’ and ‘no idea is a stupid idea’. The hackathon provided an opportunity to exchange relevant knowledge and experience with a range of people, spark ideas and discover synergies. Does leading with the evidence undermine this? And what if there is no evidence relating to a new problem or new solution – how would existing evidence inform you then?

My reflection is that both innovation and evidence are valuable, and they are not necessarily opposed. For example, components of existing interventions can act as a platform for innovation to address a new problem or target group. Even if no relevant interventions exist, knowledge of child development, risk factors and protective factors can help to ensure that innovations are underpinned by a clear logic.

When should evidence be injected into the process?

Perhaps an evidence review is not needed as the very first step. Brainstorming and drawing on practice experience help to develop new ideas. Nonetheless, engaging with relevant evidence at some point gives interventions the best chance of success. And when evidence is limited, the evaluation of new interventions or services is especially important. There is an obvious opportunity for this to happen during the development process that follows a hackathon like this, when selected ideas are being piloted and rolled out.

Evaluation also contributes to the evidence base for the future. Some of the most intuitive and seemingly logical ideas for supporting young people can turn out not to work, or even to be harmful. For example, in the Cambridge-Somerville study, young people who received support from a community worker, including help to address family problems and to access professional support, were found to be worse off than the control group as adults, including having worse physical health and more arrests. Formal, rigorous evaluation has a key role in uncovering these kinds of unintuitive or unexpected outcomes.

Who decides what challenges are worth addressing and what an intervention should look like?

It is notable that many evidence-based interventions are developed by university academics. But it is important to include frontline practitioners and managers, as well as service users themselves, in the process of intervention development. The James Lind Alliance, for example, works in the health sphere to bring together patients, carers and clinicians to define research priorities, promoting voices that are less commonly heard. The hackathon was a good example of bringing together people in a range of roles from the public, private and third sectors to focus their diverse perspectives on a shared challenge.

Overall, I enjoyed the experience and heard ideas that I would not have come across through reviewing journal articles. Our discussions were focused on immediate problems faced by young people in London. Clearly it remains crucial that new interventions are evaluated as they are piloted and rolled out: just because an idea is innovative and intuitively appealing doesn’t mean it will be effective when tested in the real world. In the end, evidence and innovation are complementary rather than mutually exclusive, and when new solutions are needed, drawing on both gives them the best chance of succeeding.