The government needs to establish a set of FE-centric measures that capture what it actually wants from the sector. Ali Hadawi has a few suggestions

Two new Ofsted reports out this week have raised the proportion of colleges now rated ‘good’ or ‘outstanding’ to almost three quarters. There is no denying that this is great news for the colleges themselves, and it may even say something about the teaching and learning happening at them – but how much does it signal success for the FE sector as a whole? Is Ofsted really measuring what we think it’s measuring?

If we’re honest, the government currently has no viable means of measuring what it actually wants the FE sector to do.

There are, however, a number of proxy measures. These centre on the quality of teaching and learning, outcomes for learners, success on courses, leadership and management, financial health and so on. But they do not offer a robust measure of the real impact of FE.

These proxies are used for two reasons: the absence of a truly clear mission for the sector, which would normally drive the assessment of impact, and the perceived difficulty of assessing impact directly.

That these proxies lack relevance is evident from the discourse around the success, or otherwise, of the sector.

For example, ministers never challenge us on delivering a set number of courses or qualifications, on our inspection outcomes, or on how much of our grant funding we spend. Instead, they invariably quote CBI statistics on “skills shortages” and criticise FE for not improving, despite the fact that colleges have never actually been asked to close the skills gap or beat shortages.

While FE inspectors are certainly passionate about quality and improvement, this all raises the question: could Ofsted be measuring the wrong thing?

None of the regulatory agencies’ current frameworks requires them to assess the local or regional skills gap or shortage before inspecting a college. Yet it is not inconceivable that Ofsted could use the skills shortage in a given region as part of its evidence base when judging the relevance of a specific college; after all, the data is publicly available and regularly updated.

The government needs a dependable metric to enable it to decide on the role of FE and levels of funding, and on how to use the sector to influence the economy, social cohesion, productivity and other areas.

Developing an alternative metric to quantify the effectiveness of the FE sector is a complex endeavour. It would require significant research into impact measures: how relevant, transferable across sectors, repeatable, consistent and meaningful they are, both to those who work within FE and to its stakeholders.

One possibility is to explore a non-financial, intangible value metric in which social value is aligned to the sector’s mission. Such a measure might challenge the need for regulatory bodies such as Ofsted to operate as they do now.

The metric would need to offer a dependable measure of how FE affects individuals, communities, businesses, the economy, crime levels, reoffending levels, mental health issues, citizenship, progression into employment, progression into HE, social mobility, meeting government agendas on the economy and employment, skills shortages and gaps in the economy, economic activity, economic competitiveness and productivity, to name a few.

With a robust measure of social value and impact, the government would not need to issue a white paper every time a response to a localised issue is required.

If, for example, a certain area needed to address skills shortages or to focus on community cohesion, all that would be required is a change in the weighting of the various components of such a measure.
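
By way of illustration only (the component names, scores and weights below are entirely hypothetical, not drawn from any existing framework), re-weighting a composite measure of this kind might look something like the following sketch:

```python
# Illustrative sketch only: a composite social-value score built from
# hypothetical impact components, re-weighted to match local priorities.

# Hypothetical component scores for a college, each normalised to 0-1.
impact_scores = {
    "skills_shortage_reduction": 0.62,
    "progression_to_employment": 0.71,
    "community_cohesion": 0.55,
    "social_mobility": 0.48,
}

# A national default weighting versus a locality that chooses to
# prioritise skills shortages and community cohesion.
national_weights = {component: 0.25 for component in impact_scores}
local_weights = {
    "skills_shortage_reduction": 0.40,
    "progression_to_employment": 0.20,
    "community_cohesion": 0.30,
    "social_mobility": 0.10,
}

def composite_score(scores, weights):
    """Weighted sum of impact components (weights sum to 1)."""
    return sum(scores[k] * weights[k] for k in scores)

print(f"National weighting: {composite_score(impact_scores, national_weights):.2f}")
print(f"Local weighting:    {composite_score(impact_scores, local_weights):.2f}")
```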

This would allow the development of an FE mission that could be applied nationally, regionally and locally.

This is not about rushing in yet more change for the sector – it’s about starting a genuine discussion about our purpose and values, and identifying the right metrics and outcomes to meaningfully reflect these.

Ali Hadawi is Principal and Chief Executive of Central Bedfordshire College

3 Comments

  1. Ali Hadawi, Principal and CEO of Central Bedfordshire College, said “it is not inconceivable that Ofsted could use the skills shortage in a certain region as part of its evidence base when judging the relevance of a specific college”.

    In fact, we already do this. When inspectors visit colleges and skills providers, they assess the extent to which leaders, managers and governors work with employers and other partners to ensure that the range and content of the provision is aligned to local and regional priorities, including where there are skills shortages.

    Paul Joyce, HM Inspector, Deputy Director for Further Education & Skills

    • Ali Hadawi

      That is correct: Ofsted inspectors do ask the question, and indeed make judgements on how well colleges work with employers to assess and meet needs. However, the question this article asks is ‘Are the metrics used to measure the effectiveness of FE fit for purpose?’ For example, are qualification success rates and outcomes a good proxy for skills supply into the economy? The Ofsted point was just one example. So, in the Ofsted context: if we compare two areas, one with a college judged ‘outstanding’ and another where the college is judged ‘inadequate’, does it necessarily follow that the skills gap differs between their respective local economies?

      The argument is about what the right metric for measuring impact would be. I think the reliability of skills gaps, as they are currently reported, is questionable, but the example still stands. If we ask why skills gaps in the economy are not changing, is it because something is wrong? Is it indicative of a failure in one or more parts of the system, be it policy, sector delivery, a regulating body or the funding methodology? Or is it rather that reported skills gaps are simply not reliable and should not be used when debating skills? Just random thoughts!

  2. In relation to Ofsted, I think they do a good job, but I feel they sometimes frame their findings as ‘this was wrong’ rather than offering guidance. Often we are given requirements that directly contradict one another. On start and end dates, for example, the ESFA says that the new Standards are not 366 days but 376 days, while Ofsted says that because we appear to set our end dates later, our learners are not on a full year. But why is this? We were told to do it. Then there are things like LEP priorities, most of which are set by very large companies rather than by what local employers want. Yes, some things have to be that way, but others could be dealt with more helpfully.