Colleges need to insist they are given the right data to make meaningful quality judgements about their own performance, says Graham Taylor.

Last April I wrote about the bother over the production of the 2014/15 Qualification & Achievement Report (QAR), received after an interminable delay on April 5 and well documented in FE Week.

An unnecessary change, it was littered with mistakes and lacked essential elements of the old QAR system – which also had better terminology. Information system managers up and down the country expressed their dissatisfaction publicly.

It was expected that the 2017 update would address these concerns, yet the new QAR, released over the Christmas break, is once again highly problematic.

Last year’s concerns were mostly limited to the unhelpful – almost unusable – format of the dashboard. On this point, sector-wide negative feedback has been ignored and the format remains largely unchanged. The dashboard is slow and unreliable, lacks key information and is set up in a way that will cause further delays in producing information that was previously readily available.

But this year has also brought a host of new problems we are struggling to resolve due to the lack of accessible, accurate data.

At this point it may be helpful to insert a quick reminder of the (baffling) terminology changes from last year:

1) ‘Success rates’ (SR) no longer exist. These are now ‘achievement rates’.

2) The old ‘achievement rates’ are now called ‘pass rates’.

3) ‘Learner starts’ are now known as ‘leavers’ (don’t ask).

The latest QAR contains the ‘confirmed’ achievement rates, but these seem to be wrong. Some colleges are reporting achievement rates in the QAR’s first release up to five percentage points lower than they had predicted – at least a grade’s difference in Ofsted’s eyes. The reason appears to be that the software hasn’t applied the ‘90-day rule’, under which achievers still count if they pass within three months of the end date.
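The rule itself is a simple date check. The sketch below is an illustration only – the function name and fields are mine, not the actual QAR calculation:

```python
from datetime import date, timedelta

# A minimal sketch of the '90-day rule' as described above: an achiever
# still counts if they pass within 90 days of the learning aim's end date.
# Illustration only -- not the actual QAR software logic.

def counts_as_achiever(end_date: date, pass_date: date) -> bool:
    return pass_date <= end_date + timedelta(days=90)

# A learner finishing on 30 June 2016 who passes on 20 September 2016
# falls inside the 90-day window.
print(counts_as_achiever(date(2016, 6, 30), date(2016, 9, 20)))  # True
```

If the software ignores this window, every late-but-valid pass is counted as a fail, which would explain rates coming out several points low.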

Another key difference is that we are unable to see either national averages (they have been promised as part of the National Achievement Rate Tables next month), or the overall AR for our college. And while we are able to calculate our overall figure from the 16-18 and 19+ breakdowns, the QAR doesn’t contain the data that would allow us to work out the national overall AR.

Using weighted averages by sector subject area is a concept that even some Ofsted inspectors we know and love struggle with, yet they are the best objective measures of quality available.
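The arithmetic is less forbidding with numbers in front of it. A quick sketch with made-up figures (not real QAR data): a college’s national comparator is each sector subject area’s national rate, weighted by that college’s own mix of leavers.

```python
# Hypothetical figures for illustration only -- not real QAR data.
college_leavers = {"Engineering": 300, "Health": 500, "Arts": 200}
national_rates = {"Engineering": 82.0, "Health": 86.0, "Arts": 90.0}  # %

total_leavers = sum(college_leavers.values())

# Each national rate is weighted by the college's share of leavers in
# that sector subject area, giving a comparator tailored to its mix.
weighted_national_rate = sum(
    national_rates[ssa] * n / total_leavers
    for ssa, n in college_leavers.items()
)

print(round(weighted_national_rate, 1))  # 85.6 for these figures
```

A college heavy in lower-achieving subject areas is thus compared against a lower national figure than one with an easier mix – which is exactly why it is a fairer measure than a single flat national average.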

We should also be able to see achievement rates for UTCs and school sixth forms – we should all be working to the same rulebook.

Other problems include a time-consuming export function, which produces poorly formatted, often unusable PDFs (we’ll have to resort to screen printing) and a lack of clarity about raw data downloads.

While colleges have been supplied with a csv download of the data used in the QAR dashboards for validation purposes, there have been a number of issues with the files, including no date of birth field populated to allow analysis by age band, and no clear guidance on the filters that need to be applied. As a result, matching the QAR data takes a significant amount of time, and colleges are reporting numerous discrepancies in leaver numbers.

The QAR dashboard also states that due to “changes in business rules” it is “not directly comparable with last year’s dashboard”, which is singularly helpful for measuring quality over time.

I encourage all information system managers to feed back to the powers that be.

Colleges need national rates at course level upwards to make meaningful quality judgements against their own performance. Without validated data and the ability to compare correct achievement rates with national averages for the same year, how can informed decisions on quality be made both internally and by Ofsted?

How could any college in this year’s Ofsted round be reasonably assessed without accurate 2015/16 benchmarks?

No wonder reports are bland. Consider appraisals like “this college’s performance is in line with the rates for colleges nationally”, which presumably refers to those from 2014/15. Reports used to be informative and give ideas on how to improve. Not now.

My advice? Fight the good fight. We’re not finished yet.

Graham Taylor is principal and chief executive at New College Swindon