Investing to learn: a year in YEF’s evaluation team

The work we do at the Youth Endowment Fund on evaluation and research is central to achieving our mission – to keep children and young people from becoming involved in violence.

It’s true that we’re a funder; since 2019, we’ve committed a huge amount of money – £56 million – to some incredible grantees. The approaches we’ve invested in are truly varied, and include training for youth workers to deliver high-intensity cognitive behavioural therapy to children in London, group work for parents and children affected by domestic abuse, and intensive support for children outside of mainstream education. While this work is undoubtedly important, short-term grants aren’t enough to help us find long-term solutions to a complex problem. To make a sustainable change that lasts far into the future, we need to find out what actually works to keep children and young people safe.

That’s where my team comes in. We make sure that every single one of our funded projects is paired with an independent evaluator, who tests and researches whether the programme, practice or policy makes a difference to the children who take part. As a What Works Centre, it’s YEF’s job to get this information into the hands of people in power – like Police and Crime Commissioners, central government departments, Directors of Children’s Services or school leaders – so that they’re taking decisions that give them the best possible chance of keeping young people safe.

Why do we need to find out what works?

Often, people ask why we’re investing so much in research, when “really we know what works.” The problem is that, at the moment, that just isn’t the case. It’s true that there is some promising early evidence that certain approaches could make a difference – but often that research has been conducted in other places (like the United States) or just isn’t rigorous enough for us to be truly confident that an approach actually prevents violence.

Good ideas just aren’t enough. We know that, when high-quality trials are conducted, up to 80% of programmes are found not to make any lasting difference. Some even cause harm to the children they’re meant to support. We hope that this won’t be the case for the projects we evaluate (because of the amount of research we do before we decide to invest in a particular kind of programme), but it is possible that – despite our best intentions – a programme actually increases the likelihood of a child getting involved in crime and violence. We’ve seen it before with projects like prison awareness schemes or boot camps, which have been shown to be harmful for children and young people. Evaluation is an incredibly important tool in making sure that we’re never unwittingly causing harm to the children most vulnerable to violence.

All of this means that, if we want to end violence by providing children and their families with the best possible support, we need to learn what works, for whom, when and how. By investing in answering these crucial questions, we can compellingly advocate for change. That might mean using evidence to call for long-term, sustainable funding for effective services, for new policies to be implemented, or for practice to change in education, policing, criminal justice or social care.

Generating this kind of research does take time. It’s taken us until this year to publish the results of the first of our evaluations, and the pandemic threw up some considerable challenges for the programmes we funded through our first grant round. But it’s a good start to our learning – and in the coming years, we’ll have the results of complicated research projects that can tell us about the effectiveness of projects that train youth workers to provide mental health support, offer tailored services to children at risk of exclusion or engage young people with other services after they’ve been arrested. Learning from these projects will give us a stronger understanding of what works – and the ability to truly champion those life-changing approaches.

Research is clearly important. But what counts as ‘robust’ or ‘high quality’ evidence?   

At the YEF, we believe that, to give us the most rounded, complete picture of what works to prevent violence, we need to answer a lot of different questions – and that requires using different types of research to get to the answers. To answer questions about the realities of young people’s lives, we’ve invested in peer research (led by and for young people) and commissioned surveys through our Children, violence and vulnerability report. To answer questions about the way policy or practice drives or prevents violence, we’re commissioning secondary data analysis. And to answer questions about what works, we’re investing in tools like randomised controlled trials, because they’re the best way to show if a particular programme makes a difference.

We know that not every project will be ready for a robust evaluation – and that’s OK. We always make sure that the evaluations we fund fit the project, with the aim of working up to a research design that can help us understand the difference it makes for young people. Drawing on the Early Intervention Foundation’s Ten steps to evaluation success, this means that we might fund:

  • Feasibility studies. These are commissioned when an intervention is at a very early stage of development and could benefit from further refinement. Feasibility studies may also be commissioned when a relatively well-specified intervention is being adopted from another context (e.g. abroad), to see if it might work locally.
  • Pilot studies. We use these where an intervention is relatively well-defined, but where it is not clear if it’s possible to run a larger-scale research project. In these cases, it’s helpful to test research instruments (like questionnaires, which are used to gather data) and potential impact evaluation designs and methods (like the process for creating randomised ‘control’ groups that wouldn’t receive the intervention in a later study).
  • Efficacy trials. These are our first type of randomised controlled trial, which we use when all project materials, resources and processes have been fully developed. One group of young people will receive an intervention, and another will not. We will then assess the impact of the intervention on the group who received it (see the sketch after this list). At the efficacy trial stage, the programme is delivered under ideal conditions.
  • Effectiveness trials. These are larger trials that aim to test whether an intervention can work at a larger scale, under ‘real world’ conditions. These evaluations often try to gather more information about who an intervention is effective for (which is sometimes called subgroup analysis) and might look at outcomes in the longer term, long after the project has finished.
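
To make the logic of a randomised trial concrete, here’s a minimal, purely illustrative Python sketch of random assignment and a difference-in-means estimate. The participant IDs, outcome measure and effect size are all made up for illustration – this isn’t YEF’s or any evaluator’s actual analysis code.

```python
import random
import statistics

rng = random.Random(42)  # fixed seed so the sketch is reproducible

# 1. Randomly assign participants to an intervention or a control group.
#    Random assignment is what lets us attribute any systematic difference
#    in outcomes to the programme, rather than to pre-existing differences.
participants = [f"p{i}" for i in range(200)]   # hypothetical participant IDs
rng.shuffle(participants)
intervention = set(participants[:100])  # receive the programme
control = set(participants[100:])       # do not

# 2. Simulate outcome scores measured after delivery (e.g. a validated
#    behaviour questionnaire). Here we pretend the programme adds 3 points.
TRUE_EFFECT = 3.0  # hypothetical effect size, for illustration only
outcomes = {
    p: rng.gauss(50, 10) + (TRUE_EFFECT if p in intervention else 0.0)
    for p in participants
}

# 3. Estimate the impact as the difference between the two group means.
estimate = (statistics.mean(outcomes[p] for p in intervention)
            - statistics.mean(outcomes[p] for p in control))
print(f"Estimated effect: {estimate:.1f} points (true effect: {TRUE_EFFECT})")
```

In a real trial the second step is fieldwork rather than simulation, and the analysis is far more careful (pre-registered, with statistical uncertainty reported), but the core comparison is the one sketched above.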

Irrespective of the type of trial we’re funding, we’ll always stick to a clear set of principles. We’ve put these in place to make sure that what we’re learning is as useful as possible:

  1. We’re as rigorous as we can be while making sure that delivery is high quality. This means we do everything we can to make sure that an approach is delivered consistently across different places, so that we can draw reliable conclusions about the difference it makes to children.
  2. We’re focused on insights relating to children’s involvement in violence. There are lots of different things we could choose to measure when we run an evaluation. For example, does a programme improve a child’s grades at school? Does it help them learn positive behaviours, like sharing? Does it reduce bullying in a school? Does it lead to fewer criminal offences being committed? Through our Outcomes Framework, we’re determined to make sure that everything we’re learning links back to our mission – to keep children safe from violence.
  3. We provide value for the programme. It’s a lot of effort to take part in an evaluation, so we need to make sure that our research is useful to the projects we fund, helping them to adapt their programme or apply for longer-term funds.
  4. We follow up on impact over time. All of our research will be safely stored in our evaluation data archive, so that we can see the difference that a project makes months, years and even decades into the future.

What’s next for YEF evaluation?

We’re now at a stage where there are lots of projects delivering. We’re funding diversion schemes that offer children alternatives to arrest and custody, family support programmes, and projects that look to foster relationships between children and a trusted adult. We’re investigating partnership approaches to policing through focused deterrence, and how we can fund smaller organisations to take part in large-scale research through multi-site trials.

In 2023, we’ll publish more than twenty new evaluation reports, showing you what we’ve found out. Over the lifetime of the Fund, we’ll have built a bank of hundreds of evaluations, giving us a strong understanding of how we can reduce violence and keep people safe. But if we want to reduce violence, we need to make sure that our findings are actually being used – by people who set policy, decide on funding, commission services and work with children on the frontline. It’s only by working with our partners – in children’s services, policing, schools, youth work and government – that we can put what we learn into practice. Together, we’ll make change for children and young people.
