Why what doesn’t work in the classroom is as important as what does

FE continues to have trouble with working out what works – but evidence of absence is not the same as absence of evidence

30 Jun 2024, 5:00

“Just give it a try,” was the advice I received when I started teaching. I wanted to innovate, as though I were the first teacher on earth, and those words gave me licence to do my own thing. On reflection now, I see they put ego before students.

One of the most positive things to happen in teaching in the decades since is the hugely increased reverence for research and the desire to be informed by ‘what works’. In further education, you can scarcely find a teacher who hasn’t been involved in some kind of practitioner research.

Yet we still seem to be in the dark when it comes to what has been proven to work when replicated and scaled up.

A good example of this is 16-19 English and maths. They’re FE’s two biggest qualifications by far. They are vital for progression in education or employment. And they have been a focus of colleges’ Herculean efforts for almost a decade now. Yet where things are piloted or trialled, we don’t seem to have much luck.

Using text messages to nudge learners? No impact.

Contextualisation? Apparently difficult to assess “whether the intervention had an impact”.

The evaluation of ‘Assess for Success’ sounded more like my school report card: “some evidence of promise”. However, there are no plans to trial it further.

And a year on, we’re still waiting to hear about the impact of ‘5Rs’.

Part of the problem is applying school methodologies to colleges that can have over ten thousand students and turnovers in nine figures. The above studies hint at that, with comments such as needing “more clarity about the expectations of participating staff” or “less than half of teachers attended both of the first two training sessions”.

College English and maths teams are commonly bigger than the staffs of entire schools, but they receive fewer contracted hours. And when things get busy, dropping a burden that comes without sufficient incentive is the obvious choice.

We must make informed choices to ‘adopt/iterate/avoid’

I know from managing the DfE’s Centres for Excellence in Maths (CfEM) project how tricky it is to retain participants. For instance, it was a sad irony to see a large ‘Engagement & Resilience’ project scuppered by a 79% student drop-out rate.

It’s also daft to try anything other than recruiting whole teams within colleges. Lone teachers, or even pairs, can rarely swim against the tide for the duration of a trial. Besides, recruiting individuals means tracking each student’s outcomes rather than provider-level data, which is far more onerous.

There are some green shoots though. In addition to the groundbreaking CfEM mastery trial that led to improved GCSE maths scores, there were enormously encouraging results from college-led projects on bar modelling and ratio tables.

On the oft-neglected English side of things, my own creative-writing resits intervention, funded by SHINE, retained almost 500 student participants across three colleges. It not only demonstrated improved confidence in the high-tariff GCSE writing tasks, but also clear mental health benefits.

It’s also important to acknowledge that finding out something doesn’t work is valuable. I’ve seen the zero-impact ‘Lesson Study’ approach rearing its head again recently, alongside long-debunked ‘discovery learning’.

It’s our duty as professionals to learn from what’s already been tested, so that we can make an informed choice to ‘adopt/iterate/avoid’.

The best challenge I ever heard on CfEM when I inherited that project was: “How do we know it wouldn’t have had more impact to just give the students the £250?” (based on the £30 million cost divided by the c.120,000 student reach).

My answer (which I thought of too late) is that the £30 million wasn’t just for those students participating between 2018 and 2023. It means we know – forever – that mastery teaching will improve maths exam scores.

Every teacher who takes that into the classroom, year on year, brings down that per-student cost while improving outcomes for learners.

So when an enthusiastic new teacher asks for your advice on trying something out, don’t send them out to ‘mess around and find out’. Instead, ask them: “What does the evidence say?”
