As the time for the next Chief Inspector’s Annual Report approaches, three members of the Policy Consortium — which provides professional policy analysis for learning and skills — look for messages that may emerge about the past year, and their possible implications. Colin Forrest, Carolyn Medlin and Mike Cooper lay down challenges for the education watchdog as it becomes more involved in the improvement agenda.

Crunching the inspection numbers

Highly contested claims by Ofsted earlier this year that no colleges could be judged outstanding for teaching and learning still rankle throughout the FE sector, as people ask how and why the Chief Inspector and others involved got it so apparently wrong.

The assertion was based on an observation in the Chief Inspector’s 2012 Annual Report, published last autumn (and of course covering the final year of inspections under the 2009 Framework): “For the second year running, no colleges were judged outstanding for teaching and learning.”

This raised the question of how a finding from a sample of 40 or so general FE colleges came to be extrapolated to the college sector as a whole.

In 2012/13, the sample of colleges inspected has doubled. In the inspection reports published since September 2012, two general FE colleges and two sixth form colleges have been awarded a full suite of outstanding grades for their cross-college aspects — including the key matter of the quality of teaching, learning and assessment.

Will this most recent pattern reassure the inspectorate that making these elements central in the revised common inspection framework (CIF) has had the desired effect?

Some recent comments from Skills Minister Matthew Hancock have been rather more conciliatory — a change in rhetoric that may signal some changes in perception and approach.

It’s hard to tell. But it would seem unlikely.

Reading individual inspection reports suggests what an overview of teaching, learning and assessment might look like, but there is little other help.

The inspectorate’s Data View website was launched at the same time as the publication of the Annual Report.

Alas, it doesn’t help much. At the time of writing, the data set relates to March 2013.

Although the data can be presented by provider type, number of learners, organisations and local authority, it covers only the overall effectiveness and leadership and management grades.

Moreover, apart from schools, it doesn’t tell us anything about the quality of teaching, learning and assessment.

The graphs in Data View show the proportion of providers in each of the four grading categories as of August 2010, 2011 and 2012, and as of March 2013.

Strangely, the plotting points are joined up, suggesting a smooth trajectory between those dates — a somewhat unlikely pattern for any change, and perhaps still more so in such a complex and shifting landscape as FE and skills inspection.

For overall effectiveness, the proportion of colleges with good or outstanding grades increased from 63 per cent in August 2011, to 64 per cent in August 2012, and then to 70 per cent by March this year.

For leadership and management, the proportion judged good or outstanding has increased from 66 per cent in August 2012 to 73 per cent as of March 2013.

Data View does report the proportion of schools at each grade for the quality of teaching, alongside the other grades; the proportion with good or outstanding teaching was 73 per cent as of March this year. However, it doesn’t do so for FE and skills providers — making what would be a useful comparison very difficult to see.

Sharper than Data View

Contradictory messages and conflicting priorities set by Ofsted are in danger of leaving colleges at best confused and at worst without the essential support they need.

Although its year-old Data View system lays claim to simplicity and transparency, it is too blunt a scalpel either for forensic analysis or for the surgery that leads to healing.

Analysis of the reports published since September last year suggests the proportion of colleges inspected in 2012/13 that are good or outstanding for teaching, learning and assessment is around 56 per cent.

Broadening the analysis, the proportion for adult and community learning providers is around 61 per cent. For work-based learning providers it is 51 per cent.

Similar figures in 2011/12 attracted HMI comments like “teaching and learning are not good enough”.

Ofsted reports that teaching and learning need to be stimulating and demanding, involving real-life scenarios to enhance employability.

Quite how this is to be achieved across the full range of courses and subjects has not been made at all clear.

The inspectorate further highlights that a vocational context often needs to be emphasised more in links with employers, and there are frequent references in reports to the need to meet the best industry standards.

But again, this raises questions about how effectively and convincingly it could be applied universally.

Subject expertise is seen as essential and the links between the curriculum and the workplace are crucial.

Using a range of alternative technologies to make learning “exciting and fun” is sometimes a priority, as is the need to develop personal and social skills as well as employability skills.

All good points, but how to prioritise these among all Ofsted’s other concerns?

As if this were not enough, the inspectorate also highlights that more needs to be done to enhance ownership of learning by learners: in evaluating their own progress, in target-setting, and in developing personal, vocational and functional skills targets.

Employers, providers, and teachers need to prioritise working together with learners to ensure there is shared ownership of targets.

Altogether, it’s quite an agenda. Little of it is arguable per se.

The question is, what to do about it for the best, and at a time when energies, ideas, resources and time especially are all at a premium?

All of this cries out for closer, more compelling links between inspection and improvement; the inspection regime often claims, somewhat loftily, to be concerned solely with making judgements and not with improvement.

However, other activities undertaken by Ofsted contradict this stance.

It may be that linking providers that require improvement with HMIs for a limited degree of support will change the situation — and the outcomes.

It may be that more of Ofsted’s good-practice reports will help. But there is a way still to go, clearly — and the route looks increasingly awkward and perilous.

Not least because that ‘support and challenge’ initiative is restricted in its application to providers graded three — and was not designed to fill the gap left by the absence of an improvement body (a role the new Education and Training Foundation (ETF) insists it does not play).

Moreover, for the provider graded as ‘requires improvement’, what is the best stance to take when Ofsted returns before the next inspection to ‘provide support’?

Should the provider ‘fess up’, warts and all, to help it move forward positively to the next inspection?

Or, should they put their best foot forward?

There are certainly implications to the former.

That is, while the report from the support and challenge visit is not published, a copy of that report is passed to whoever leads the next full inspection.

Does this create a conflict between Ofsted making judgements and Ofsted supporting improvement?

Ofsted often argues that self-assessment ends up being unduly complex and ill-focused. If so, could it be the inspectorate’s own approaches and methods are partly to blame?

At the organisational level, the inspectorate argues that the process of self-assessment needs to be systematic in coming to judgements on teaching, learning and assessment, and to incorporate a broad base of evidence including views from those organisations to which learners progress.

How feasible this is as anything beyond a counsel of perfection for most or nearly all providers is a fair question.

Similarly, Ofsted argues that self-assessment can be overly complex and insufficiently focused on the quality of teaching, learning and assessment.

This complexity is not altogether surprising, given Ofsted’s previous approaches and methods, and the size and complexity of the CIF itself.

That links, too, with the inspectorate’s reluctance to commit itself wholeheartedly to the processes of improvement in recent years.

This has begun to change to a degree, but more may well be required to convert judgement to real and positive change.

Ofsted further highlights a need to reduce the variability of the quality within providers, through the sharing of good practice internally.

This is a fair cop — learners, parents and others are entitled to be frustrated or even angered where they perceive such gaps, and explaining them away is difficult.

Nevertheless, there are questions to be asked about how to do this effectively, on so many fronts, in a period of shrinking resource and still greater demands.

So once again, for providers of good faith who dearly wish to address such issues, and even for very good providers, this can seem an almost unattainable goal.

Here’s a related observation. Is it appropriate to hope that Ofsted itself might transfer some good practice, and more consistently model its own ambitions for teaching and training? In particular, since providers come to learn, could there be more of a genuine learner-centredness in how such Ofsted sessions are handled by its own staff?

Practising a little more of what is so powerfully preached might not go amiss.

And then of course there are mixed messages around self-assessment and the report that captures it. Since April 2012, providers are no longer required to upload a self-assessment report onto the Provider Gateway.

But if they don’t, and their data shows drops in performance, this could trigger an inspection.

So, in reality, if providers choose not to upload a self-assessment report, they may be more likely to be inspected.

The litany of complaint continues. For instance, weaker lesson observation processes are criticised for focussing on teaching rather than learning. There is potential, too, for learners to be involved more in organisational self-assessment and the impact of this involvement to be recognised and captured.

Both of these statements are certainly true in themselves. But to adapt a very useful response Ofsted recommends for the mere assertions presented to it in self-assessment: ‘So what?’

It is a challenge to move beyond merely reporting a fact to grasping its significance — how best to make the connections that might turn it into an improvement reality.

Increasingly, then, these are significant and pressing issues. Not merely for the new ETF, the wider sector, or for individual providers, but for Ofsted itself with its new improvement remit.
