The chief inspector is right: success rates are not the best way to measure FE. So why doesn’t he come up with an alternative instead of beating the sector with the same old stick, says Jayne Stigger

Inspections under the new Common Inspection Framework (CIF), although rightly focused on teaching and learning, still quote success as the main judgment. Reading through the Ofsted reports released this month, I found that across 14 pages they mention "success rates" 18 times on average, "retention" and "outcomes" four times each, and "achievement" six times.

If it is, as chief inspector Sir Michael Wilshaw said last week, "palpable nonsense" to measure FE by success rates (and I don't disagree), why does Ofsted continue to use them as its first judgment? Measures of education need to reflect more accurately the comprehensive mission of colleges and the diverse student population they serve.

So, what about retention? No, just keeping a learner isn’t a good enough measure of what we do. Instead of twiddling with data types, let’s make the system work.

The problem is with the system of measurement. We acknowledge that one size of education doesn’t suit all learners, so why should we expect one size of judgment to suit all colleges?

There are 219 general FE, 94 sixth-form, 15 land-based, three art and design and 10 specialist colleges in England, all with different learners, different objectives and different outcomes. Measure us differently. If we continue to judge a fish by its ability to climb a tree, we are condemning excellent teaching and learning to years of failure.

FE colleges are run as businesses, so why not measure them in a way that is relevant to their aims: customer satisfaction, a growing customer base, stakeholder satisfaction, employee satisfaction and finances.


Put this into the annual self-assessment report to Ofsted, coupled with external quality reviews of teaching and learning by peer colleges. The college grade would encompass both reports and would be timely, relevant and more reflective of the true state of FE. Any reports ringing alarm bells could warrant a visit by the new FE commissioner and his team.

You cannot measure academic, vocational, enterprise, entrepreneurship, apprenticeship, training, adult and foundation learning with the same stick, yet this is something the current CIF tries to do.

Why not a financial incentive for every positive destination? As Doug Richard reports: “This can be most elegantly ensured by making sure that the funding of the system focuses everyone in the correct direction. In that spirit, I also recommend a redirection of funding.”

Success rates, driven by funding incentives, have played a large part in the growth in the number of qualifications and in rising course success rates, but FE now works in a complex financial landscape, forced to make choices that may adversely affect students. How does this serve poor, disadvantaged learners? Good education deserves good funding.

Sir Michael talks of dismantling "too large" colleges; are they, rather than their success rates, the target? A number of large colleges have been downgraded, yet the latest advert for an Ofsted inspector says: "You might be the vice-principal or member of the senior leadership team of a large college." A large college? So, you can't run a college but you can inspect one?

Size isn't the issue; it is management and governance. Some principals have hung on to their roles for years while failing to improve. Did Ofsted recommend that they be removed? No, it left them in post. If 67 per cent of colleges are good or outstanding and 4 per cent are inadequate, then the commissioner would have to deal with only two or three a year. Could Ofsted, with learners and stakeholders, make recommendations to be reviewed by an independent commissioner instead?

Large or small, we are focused on our learners, no matter how that is measured.

We’ve played by the rules of the organisation that judges us. But before you berate FE colleges further, Sir Michael, raise your own game by looking at more varied and reliable evidence.

Jayne Stigger, excellence and innovations manager, Basingstoke College of Technology