Is linking funding to outcomes more trouble than it’s worth?

Using learner destination data to determine funding for FE is “fraught with difficulty,” according to Geoffrey Stanton.

The recent Skills Funding Statement indicates that the Department for Business, Innovation and Skills (BIS) is considering “how funding can be more strongly linked to outcomes in future”.

In post-19 FE and training, funding is, of course, already strongly linked to outcomes — outcomes in the form of qualifications.

For 16 to 18-year-olds, on the other hand, this approach has now been abandoned following the Wolf Review.

It is to its credit that BIS now recognises there are problems in using the achievement of qualifications alone as a trigger for funding in the adult FE sector, but using learners' destinations for the same purpose is even more fraught with difficulty, and it looks as if this is what BIS has in mind.

There has been ongoing confusion in policy circles for decades about the use of outcomes in education and training. Are they usable as quality measures, or just quality indicators? Could they even be used as triggers for funding?

At one level the greater focus on outcomes has been hugely beneficial — not least as a counterbalance to an emphasis on where, how and for how long someone has to learn before their achievements can be recognised.

It was a good idea for NVQs to link qualifications to occupational standards: what a job required someone to do and understand.

This benefitted many adults already in work. But it all went wrong when NVQs were used as a trigger for funding.

All too often efficiency came to mean avoiding the teaching of anything that would not be tested, and avoiding the recruitment of learners who might take longer than average to achieve.

There was also a disastrous confusion between designing accurate standards for the occupation and designing effective learning programmes for individuals.

The former was funded and prioritised, the latter was not. It has taken until last year’s Commission on Adult Vocational Teaching and Learning to start to redress the balance.

Of course destinations matter, not least to the learners. I have no problem with encouraging providers to make strenuous efforts to track destinations, and to publish the results to applicants and to Ofsted. But the practical problems of a link to funding are obvious.

How long a period should be allowed to elapse after qualifying before the destination affects funding?

Promotion, a successful job application or a career change may not come immediately. And how enduring should any destination have to be? If someone gets a new job in a prioritised occupational area, but loses it within a month or two, should there be clawback?

There is also the problem of keeping track of individuals post-course. Email addresses may prove to be more stable than postal ones, but recipients cannot be compelled to respond.

A 2012 report by the Social Market Foundation (SMF) suggested that individuals could be tracked through the tax system. This would not identify the occupational areas concerned, but in any case the SMF was understandably sceptical about any government's ability to forecast what should be priority areas.

The tax system would of course identify wage rises, and the SMF argued that these would necessarily indicate overall productivity improvements.

However, this required the making of heroic assumptions about the rationality of the labour market and the geographical mobility of workers, among other things.

What proportion of funding should be linked to destinations?

If the incentive was small, it would be an irritating piece of noise in the finances of institutions.

If it was large, it could destabilise some good institutions, which would need to fund provision up front, perhaps years before they could be sure of the level of income from government.

Finally, there is always the “sauce for the goose and gander” test.

Why not road test the approach by applying it to universities or schools first?

I look forward to the day when a minister writes to a university vice-chancellor with the news that the funding for an expensive engineering degree has been reduced because its graduates ended up in financial services rather than manufacturing.

Geoffrey Stanton is an independent consultant and visiting fellow at the Institute of Education
