Graham Taylor explains why he was less than impressed with the recent qualification and achievement report.

Let’s move on from my last article on apprenticeships. The consensus feedback was to ‘kick the reforms into the long grass’, as we believe we can meet Dave’s target without the unnecessary, complicated and costly wiring of the proposed changes.

I would like to focus now on the fiasco that is the qualification and achievement report (QAR) received after an interminable delay on April 5, and well documented in FE Week.

It’s full of unnecessary terminology changes.

Success rates (SR) no longer exist. They are now achievement rates (AR) and the old achievement rates are now called pass rates (PR).

Search me why they needed to change the system and terminology — can someone explain please?

Another key difference is that we no longer have an overall college AR, only 16-18 and 19+ breakdowns.

While we are able to calculate our overall figure, the QAR doesn’t contain the data that would allow us to work out the national overall AR.

For some reason, they’ve chosen to omit the national cohort figures from the data.

As a key quality measure, we need to assess success rates (old terminology) at course level and build up through department, sector subject area (SSA) and college level, comparing with national averages for what we do. By the way, using weighted averages at SSA level is a concept that some Ofsted inspectors we know and love struggle with.
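For readers who haven’t met it, the weighted average simply weights each course’s rate by its cohort size rather than averaging the rates directly, which is also why omitting cohort figures makes an overall national rate impossible to reconstruct. A minimal sketch, with invented figures purely to illustrate the arithmetic:

```python
# Hypothetical course-level data for one SSA: (leavers, achievers) per course.
courses = [(120, 96), (15, 6), (40, 34)]

# Misleading: a simple mean of course rates lets a tiny course
# drag the figure down as much as a large one.
simple_mean = sum(a / n for n, a in courses) / len(courses)

# Correct: weight by cohort size, i.e. total achievers over total leavers.
weighted = sum(a for _, a in courses) / sum(n for n, _ in courses)

print(round(simple_mean * 100, 1))  # 68.3
print(round(weighted * 100, 1))     # 77.7
```

The same logic explains the overall-rate complaint: you can combine your own 16-18 and 19+ rates because you know your own cohort sizes, but without the national cohort figures you cannot combine the national breakdowns.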

They are the best objective measures of quality available. We await the national averages file so that it can be imported.

The report as a whole was littered with mistakes and didn’t show you what the old one did — which also had better terminology.

You need national rates at course level upwards to make meaningful quality judgments. We await them with bated breath.

And even after the long delay in publishing the data, the dashboard is slow, unreliable, lacks key information and is set up in a way that will cause further delays in producing information that was previously readily available.

For example, the in-built function to export and produce hard copy is time-consuming and produces poorly formatted, often unusable PDFs.

We’ll have to resort to screen prints for this.

I encourage all MIS managers to feed back to the powers that be.

We use ProAchieve (other systems are available) and our view of the latest ProAchieve update is that it will become the ‘go to’ source for data.

The interface is much improved on the QAR and will be easily accessible by all staff.

How can informed decisions on quality be made, both internally and by Ofsted, when the national averages are almost two years out of date?

How could any college in this year’s Ofsted round (61 and counting) be reasonably assessed without 2014/15 benchmarks?

No wonder reports are bland. Here’s one comment: “This college’s performance is in line with the rates for colleges nationally.” Presumably that refers back to 2013/14?

Reports used to be informative and give ideas on how to improve. Not now.

But we have the headline success rates for apprenticeships, at 71.7 per cent overall; 79.8 per cent for 16 to 18-year-olds; and 87 per cent for 19+ (adult qualifications too easy, Mr Wilshaw?).

In absolute terms, apprenticeship outcomes look low, not helped by stretched course lengths going back to (former Skills Minister) John Hayes’ 12- and 18-month rule, and the concomitant increase in drop-out rates and higher labour turnover in a dynamic jobs market.

Overall GCSE English A* to C success rates for 16-18-year-olds and 19+ learners were 31.1 per cent and 50.2 per cent.

The figures stood at 27.8 per cent and 52.3 per cent respectively for maths.

It is arguably a minor miracle that about 30 per cent of youngsters get through this in one year, after years of struggling at school.

Well done everyone. Keep fighting the good fight. We’re not finished yet.