Quality checking of Ofsted reports is not robust – they are far too variable in both form and content, says former Ofsted HMI Phil Hatton.

Last week’s FE Week story about Ofsted backtracking on its report into Yeovil College exemplifies the way it does not pay enough attention, post-inspection, to ensuring that reports read well and accurately explain what lies behind their judgements.

Reports go through moderation, sometimes by part-time inspectors, for judgements, grades and clarity. However, Ofsted does not always ensure reports are ready for publication before the publication button is pressed.

The now defunct Adult Learning Inspectorate, which merged with Ofsted in 2007, had professional editors in place who, as part of their professional development, went out on inspections to familiarise themselves with inspection procedure and the terminology of further education. They challenged inspectors if anything written was unclear or likely to be viewed as contentious.

The clarity of the resultant reports helped providers understand what they needed to address post-inspection in order to improve. Other interested readers could also quickly grasp what constituted a weakness and eliminate it from their own provision.

Inspection should be done with providers, not to them

Inspectors were also taught to identify but not attribute blame for financial or quality problems. Anything wrong in an organisation is ultimately, fairly obviously, down to the leadership and management.

The quality checking of Ofsted reports is currently not robust enough and while the shorter report format is easier for inspectors to write, less time is now allowed for lead inspectors to get their judgements right.

Front-page summary overviews for ‘good’ providers are variable – one published last week listed nothing but strengths, while another had four weaknesses after the strengths. As many readers only look at front-page findings, these two ‘good’ providers looked fairly different to casual observers.

The ends of short reports sometimes reflect the number of inspectors in the team, but at other times name only the lead inspector. One report published in the last few weeks had a clear warning in the header that the report was a draft and not for sharing before publication. Others have had the same warning in a watermark across the report.

But of greater concern is the inconsistency in report judgements. In the section ‘what does the provider need to do to improve further?’, just stating ‘improve success rates’ is not sufficient by itself; better inspectors break down the actions required to achieve this.

How can apprenticeships be ‘good’ if a weakness such as ‘the proportion of apprentices who achieve within the planned timescale remains low’ is present? Or how can safeguarding be judged effective when a report also records that ‘no staff Prevent training has occurred’?

Inspection nominees need to be aware of how to effectively fight their corner by keeping thorough notes of feedback during inspection.

The sector should have reports that adhere to minimum quality standards. These reports are how the public sees providers until their next inspection. At the moment, providers are given two days, regardless of their size or the result of the inspection, to read the report and comment on the factual accuracy.

Some providers have had extra findings added to their reports that were never fed back during inspection, presumably to back up a grade or moderation decision, denying them the chance to challenge that they would have had if the issue were raised during the inspection. Any provider comments are then shared with the lead inspector (or should be) to see if they agree to any suggested changes.

I suspect that at this stage, Ofsted takes the attitude that providers are trying to change the report findings, rather than bring them into line with what they were told on inspection.

Perversely, the next time the provider is inspected, the Ofsted team will base much of their planning and improvement judgements on a possibly flawed previous report. Inspection should be done with providers, not to them, right up to report publication.

Hopefully, with a new chief inspector, the impact of reports on helping drive improvement will be re-evaluated. I have yet to meet the leader of any college or independent provider I work with who does not think that Ofsted reports require improvement, especially when used for the purpose of identifying sector good practice.


Phil Hatton is lead consultant at the Learning Improvement Service