Phil Hatton rebuts claims by Ofsted chief inspector Sir Michael Wilshaw that the FE sector was “inadequate at best” and questions the consistency of the education watchdog.

The chief inspector does not understand first-hand what it is like to work in a sector that is often a ‘second or third chance’ for those who have not achieved sufficiently.

From what was reported, these are personal opinions that appear not to be based on facts derived from the primary source that matters: inspection evidence.

Those who conscientiously work for Ofsted must be embarrassed by these latest pearls of wisdom cast down to the sector by Sir Michael.

Of course, part of this is down to him not understanding that a real inspector should only make a public judgement if it is based on fact and as such can be proven by evidence.

So Ofsted, please show us where the data on inspections supports the headline statement in the article that the FE sector is “inadequate at best”.

Strangely, this completely contradicts the statistics contained in the chief inspector’s report, published only a few months ago.

At present, I would seriously question whether Ofsted is being led and ‘managed’ consistently enough to act as a champion of quality.

I can think of a number of examples of variable performance by Ofsted.

Firstly, publishing reports on time. A college report was published in January, more than seven weeks after the inspection was completed, missing the five-week target (and there are many more examples where Ofsted made no contact to acknowledge that agreed dates for accuracy checking had not been met).

Secondly, wasting taxpayers’ money and resources. For a report on a small private provider published in December, five inspectors took three days to inspect 22 apprentices. Is inspection resourcing really being well managed?

Thirdly, equity of resource allocation. A college with 1,038 apprentices was inspected under the new CIF by one inspector over four days. How does this square with the case above?

Fourthly, website information being unavailable: the link to the data dashboard questions for learning and skills governors does not work.

Fifthly, consistency and checking of judgements in the much simpler report structure. A training provider report that judged safeguarding effective also stated that ‘staff are not trained in the Prevent strategy’.

Sixthly, carrying out inspections on time. A sixth form college that required improvement, with a latest inspection date in the 18-month window running up to early May, was actually inspected in the first week of December, with no apology.

Seventhly, judgements about apprenticeships nationally in a survey. Many ‘apprenticeship training agencies’ have never been inspected or judged on their ‘fitness for purpose’, despite being around since 2009 and involving thousands of apprentices and millions of pounds in funding.

Although Sir Michael did well in the past as executive principal at Mossbourne Community Academy in Hackney (though the best college principals I meet give the credit to their staff, students and the ethos created), what about the pupils it failed, who did not stay on in its sixth form?

Yes, they probably went on to FE to pick up the pieces, but with far less funding.

I thought the concept of a ‘level playing field’ was finally acknowledged in the recent chief inspector’s report. Or does making a controversial soundbite statement reflect a new, un-evidenced approach to Ofsted judgements?

Is there real equity in the implementation of the CIF?

Where are similar statements about schools? Looking at national first-time pass rates for English and maths GCSEs in schools, too many young people do not achieve A-C grades, and the gaps in success between males and females, and between regions, are too wide.

Locally to me there are many outstanding schools. Are the same criteria equally applied as in FE?