If you saw FE Week’s front-page headline “Private providers ahead of FE colleges in government learner satisfaction survey” and also visited the FE Choices website ( http://fechoices.skillsfundingagency.bis.gov.uk/ ) you might be puzzled.
Yes: you can bring the data together as FE Week does; but FE Choices doesn’t compare provider types like that and isn’t meant to. Instead, it lets you look at individual providers, and a number of scores for each – learner satisfaction is just one – allowing comparisons against the worst, median and best providers of the same type. The website is designed to be useful to employers and prospective students comparing individual institutions, and may therefore send a wake-up call to under-performing providers.
The data is limited. It sorts learner satisfaction only by age and level of qualification, not by programme area, and doesn't differentiate employer satisfaction at all. Providers with very small numbers of respondents are excluded from the calculations.
This applies to the majority of the private providers serving my own area, including one major national player. Even where the indicators are calculated, they appear to be commonly based on responses from under one-third of learners.
Providers who have the Training Quality Standard are exempt from the employer satisfaction measure, which reduces its value as a benchmark.
The data actually tell a rather more positive tale about colleges than the FE Week commentary suggests. Their lowest median learner satisfaction score is over 7 out of 10 – I'd normally be delighted to recommend a service rated at that level.
Median-level colleges typically had only 7 per cent of users or fewer recording satisfaction ratings of 3 out of 10 or below. More importantly, one can't make any serious comparisons between FE colleges and private providers without like-with-like comparisons – learners from similar backgrounds studying the same types of qualification.
Comparing the employment rates of learners at the different provider types is seriously misleading. In colleges higher proportions of learners want to continue education on completing their current course, including entering university.
This is confirmed by the learning rate data, where the comparison with private providers strongly favours FE colleges.
The combined learner destinations rate is a far better measure of comparative performance and shows an equally high median score (82 per cent) for both colleges and private providers.
To put the learner satisfaction data in context, it helps to see them in relation to those for past years, and for other sectors.
The FE Choices data indicate a marginal deterioration in the median learner satisfaction rating for FE colleges (and for employer satisfaction across all types of provider). It is too early to say whether this constitutes a trend.
The last National Learner Satisfaction Survey, commissioned by the LSC in 2007, revealed that 90 per cent of FE learners and 91 per cent in WBL were fairly satisfied or better, with 27 per cent and 26 per cent respectively extremely satisfied. The equivalent numbers expressing any dissatisfaction were only 7 per cent and 6 per cent.
FE compares well with equivalent data for other sectors. The 2011 National Student Survey for higher education indicated that 83 per cent were satisfied overall, and 8 per cent dissatisfied.
Many of the highest ratings, incidentally, applied to HE in FE colleges. The most recent customer satisfaction data for retail banking and for energy supply are much less positive.
The real story ought therefore to be that learner and employer satisfaction levels across all types of FE provider are reassuringly high.
Setting one type against another risks confusing this message, for the media and the general public alike.
FE Choices may help inform user decisions and assist providers to improve quality; and publishing this information encourages providers to help improve the data itself, as quickly as possible.
Peter Davies, researcher and
consultant, Policy Consortium