The long-awaited “skills index” is too high-level to contribute to any targeted policy decisions, says Tom Richmond

“The Department for Education (DfE) has not defined what success will look like for the programme, in terms of intended impact on skills levels within the economy, nor what indicators they will use to measure success.” The National Audit Office (NAO) was, as ever, unfailingly polite in its report about the state of the government’s apprenticeship reforms in 2016.

Even so, its message was hardly subtle. Four years after the infamous Richard review kicked off the sweeping changes to how apprenticeships are designed, delivered and funded, the government still didn’t know what it was trying to achieve or how it would know if it had achieved it.

Fast forward to 2019 and the NAO reported that some improvements had been made, as the DfE had started collecting data on earnings and how many apprentices stayed with their employer over time. The NAO also stumbled across something called the “skills index” that was viewed as a proxy measure for the reforms’ impact, but even then the DfE had not set out how its calculations fed into its index or what kind of increase in the index would constitute “success”.

We finally have an answer to at least some of these questions following Monday’s publication of the “skills index” for the first time. The index, which aims to monitor the aggregate value of the skills generated by apprenticeships and classroom-based learning over time, uses four sources of data: the number of funded learners who achieved qualifications in that academic year; the proportion of learners who were employed after achieving their qualification; the percentage earnings returns to having achieved a qualification; and the average real earnings for employed achievers.

The overall trend in the index is unsurprising, yet disheartening. Since the index was benchmarked at a score of 100 for 2012-13, it has dropped to a score of just 73 in 2017-18. Although the score for apprenticeships has actually risen from 100 to 118 over this period, the collapse of classroom-based provision from 100 to 48 has dragged the overall index down with it. Significant funding cuts to FE, combined with the disappointing uptake of FE loans, have inevitably taken their toll on learner volumes.
In general, apprenticeships present a healthier picture within the skills index relative to classroom provision. The move towards higher-level apprenticeships and older learners seems to have contributed to a slight increase in the average “value-added” attributable to each apprentice (probably through higher earnings), although this shift in emphasis within the apprenticeship system remains controversial.

So what have we learned from this new index? Not much, in all honesty. It is potentially a useful tool in the sense that it captures how many learners start and finish their training, in the classroom and workplace. Incorporating earnings and employment data into the evaluation of vocational education is a sensible step too, as we cannot afford to use precious funds on sub-standard courses that do not benefit learners.

That said, the index is so high-level – one set of figures for all classroom-based learning and another set for all apprenticeships – that it is virtually impossible to use it to make any targeted policy decisions, either now or in future.

If the number of apprentices went up, but their earnings went down, should the DfE remain calm or start to panic? How important are employment rates compared to the number of starts? Without the granular data provided by level, sector and age breakdowns, it is hard to see how anyone outside the DfE will be able to utilise this index in a meaningful way.

Perhaps the most telling aspect is that the index has only emerged now, almost seven years after the government’s major skills reforms began. This uncomfortable truth arguably says more about the rigour and substance of the reforms than any index ever will.