The U-turn on randomised control trials is bad news for FE

21 Jun 2021, 6:00

Randomised control trials are the answer to FE’s biggest questions, writes Ben Gadsby

I have a confession: My name is Ben and I am a fan of randomised control trials (RCTs).

Take two similar groups of people, give one a skills bootcamp, one not-a-skills-bootcamp, and then compare the outcomes. It’s hardly life or death – unlike if you replace “skills bootcamp” with “vaccine” in that sentence…

Last week’s paper seemed to disagree somewhat, and now the government has U-turned. But this is not good news: FE Week readers should be fans of RCTs too.

Do skills bootcamps work? This is a simple but vital question. No one knows. Skills bootcamps might be the sector’s finest invention, or a bigger waste of time than trainspotting. An RCT is the most effective way to answer the question.

If you’re curious about why an RCT is the best way to answer “does it work” questions, I recommend the blogs of the late American psychologist Robert Slavin. Suffice to say, if you really want to know whether something works then randomness is a necessary feature, not a bug, as is making sure your comparison group doesn’t end up getting the benefits (or otherwise) of what you’re testing.
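
For readers who want to see the mechanics laid bare, here is a minimal sketch in Python of what an RCT comparison boils down to: randomly split a pool of applicants, keep the comparison group off the programme, then compare average outcomes. Every number and name in it is made up for illustration; nothing here comes from a real bootcamp trial.

```python
import random
import statistics

random.seed(42)  # reproducible illustration

# Hypothetical pool of 1,000 eligible applicants; in a real trial these
# would be real people, and the outcome would be measured later
# (earnings, employment, qualifications achieved, etc.).
applicants = list(range(1000))

# Random assignment is the crucial step: on average it makes the two
# groups alike in everything, measured or not.
random.shuffle(applicants)
bootcamp_group = applicants[:500]   # offered the skills bootcamp
control_group = applicants[500:]    # not offered it, and kept off it

def simulated_outcome(got_bootcamp: bool) -> float:
    """Stand-in outcome score, with entirely made-up numbers."""
    baseline = random.gauss(50, 10)        # what we'd see anyway
    effect = 5.0 if got_bootcamp else 0.0  # the effect we hope to detect
    return baseline + effect

bootcamp_outcomes = [simulated_outcome(True) for _ in bootcamp_group]
control_outcomes = [simulated_outcome(False) for _ in control_group]

# The headline estimate: the difference in average outcomes between groups.
estimated_effect = (statistics.mean(bootcamp_outcomes)
                    - statistics.mean(control_outcomes))
print(f"Estimated bootcamp effect: {estimated_effect:.2f}")
```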

But much more important than research methodology is this: if we want the FE sector to be the best that it can be, we need more research, not less. It’s good for the sector and it’s good for the people it serves.  

It’s good for the sector because it’s how we address the longstanding underfunding in comparison to schools. We need to convince the government in the next spending review to invest more money in our colleges.

But the only way to do that is with evidence that the money will lead to better outcomes – better skills, better pay, less worklessness, etc. The Treasury will invest in things that change lives, not in things that don’t.

And these outcomes are the outcomes we all want for students too. How much money is currently wasted on projects and schemes that have zero impact on anything that matters? Probably a billion.

Not only do we not know whether it’s having an impact (because no one is investigating), but even if we did, we wouldn’t know what to do instead (because no one is investigating).

A comparison with schools is illuminating. Our sister charity, the Education Endowment Foundation, has conducted numerous RCTs and found that most of the initiatives they trial don’t have an impact on attainment. These are things schools should probably think more carefully about spending money on.

But the initiatives that do have an impact on attainment tend to attract money – big money. The government is currently retendering for a school breakfast programme because Magic Breakfast proved in an RCT that it works.


The government is throwing money at tutoring for millions of pupils – because the Tutor Trust proved in an RCT that it works. Impetus has funded and supported both charities in recent years.

Ironically, that RCT for the Tutor Trust is actually the reason colleges have access to the 16-to-19 tuition fund at all. But not because an FE tutoring model got an RCT. The sector is fortunate to have providers building on the schools evidence base, such as Get Further, which offers an impressive tutoring programme built on similar principles to the Tutor Trust model but tailored to the needs of FE learners.

No, what is concerning is that colleges got that chunk of money not because they made an evidence-based case to fund provision, but only because the sector would otherwise have complained. That shouldn’t be the basis on which FE gets money. It’s a patronising pat on the head: run along and stop moaning.

Incidentally, that’s the same instinct that has probably led to the U-turn. I wish the government were now making the case for RCTs in FE, but instead you can almost hear the sigh from the DfE: fine, we won’t do it. Now run along and stop moaning.

FE is failing to get the investment it needs because we don’t have proven programmes to make the case.

As a sector, we should aspire to prove our worth, and demand the investment that is merited.

The key word in that sentence is prove. We need evidence.

Your thoughts


One comment

  1. I have not been involved, but there are two issues to the FE sector’s complaints about RCTs, one practical and one moral. The practical, and perhaps most important, one is that funding in FE, in comparison with HE and schools, is mean. RCTs are fine so long as providers are fairly compensated for the effort put into recruitment (and it is a lot of effort), and so long as the overall numbers on programme are not lower than the contracted numbers, i.e. the RCT doesn’t reduce the numbers for which FE is being paid full contractual value.
    The moral question is simply that if the programme does no harm and may do some good, then depriving learners of the opportunity does disadvantage them, i.e. it is different from a pharma trial, which may negatively impact the health of the patient.
    On the other hand, you are right: if it can be proven that it works, then it justifies more money. The answer is therefore to put more thought into it, fairly compensate the contracted providers so they are not disadvantaged, and Bob’s your uncle. It just needs a little thought and planning.