The FE Choices survey into student satisfaction isn’t worth the bother, according to one of the country’s largest colleges – which has refused to take part.

The Skills Funding Agency published the results of its annual learner satisfaction poll in June, showing that private providers were more popular amongst learners than colleges, scoring 89.6 per cent against 80.4 per cent for colleges, a gap of 9.2 percentage points.

The percentages reflected the median scores for the 189 colleges and 276 private training providers whose learner feedback was counted in the data.

From this year, the survey results will also be used for the first time in a new online tool to help employers choose apprenticeship providers.

But Karen Dobson, principal of the 9,000-student Newcastle-under-Lyme College, claimed her organisation had stopped participating because the survey had a “limited purpose”.

She told FE Week: “We no longer participate in the FE Choices survey as we found the task was becoming increasingly labour-intensive and there is limited awareness of its presence and purpose amongst our core target audience.

“We have a number of larger, more efficient college-led surveys in place which allow us to benchmark performance and student satisfaction.”

The SFA’s survey is designed to “capture learners’ experiences” of their college or training provider through questions ranging from “how satisfied or dissatisfied are you with the teaching on your course?” to “how likely is it that you would recommend the college to friends or family?”

Another large college which did not complete the survey in 2015/16 was City of Bristol College, which caters for around 25,000 learners.

A spokesperson told FE Week that the timing of the SFA’s survey “coincided with a period of significant change at the college, including staff changes at all levels”, which meant the college missed the completion deadline.

Meanwhile neither Lambeth College, which educates more than 11,000 learners, nor the Manchester College, which has more than 25,000 students, managed to “provide enough data” for a score to be awarded in the survey, despite their large size.

Neither college was able to comment by the time of publication.

[Table: the five lowest scoring colleges in the survey]

Of the colleges that did take part in the survey, Stanmore College scored the lowest, at 51.6 per cent.

A spokesperson said that the college’s “disappointing” response rate (14 per cent) does not “fully reflect the view of students”.

Barnfield College in Luton scored the second lowest, at 57.1 per cent.

A spokesperson said it “was in the midst of difficult and challenging times” at the time of the survey, but recent internal surveys revealed a “positive shift in our learners’ experience”.

North Warwickshire and Hinckley College scored the third lowest, at 61 per cent.

A spokesperson said it was a “disappointing result” and put it down to working as part of a federation with South Leicestershire College during 2015/16.

Tameside College in Manchester scored the fourth lowest at 64.2 per cent, but said its low response rate – 19 per cent – “doesn’t not reflect the positive feedback we receive from the significant majority” of students.

South Essex College was the fifth lowest with a 66 per cent score, but pointed to a response rate of 18 per cent.

A spokesperson told FE Week the SFA’s survey is “very limited and does not accurately portray student satisfaction with most colleges having a low response rate”.

Your thoughts



  1. The reason that colleges or ILPs do their own surveys is to get their own quantitative data of what their learners think of the service they are receiving so they can measure and improve it (teaching, learning, assessment, support, enjoyment, recommendation to others, how safe they feel, knowledge of different aspects, etc.). It is also important to find out if there is anything that learners, because of their experience, feel needs to be improved, by having an open question where they can say exactly what it is they feel. Anyone who has a decent quality improvement system usually conducts a survey after induction, at a mid-point in the training programme and towards the end. This gives them specific data that supplements observations and learning walks, telling them if things are getting better or worse or being maintained at a high level. It is best to feed back to learners what you have done about any negatives so they can see the point of completing surveys, otherwise ‘why bother?’.

    No matter how well surveys are conducted, learners will get fed up of giving their views more than three times a year. So carrying out the SFA survey will impact negatively on your own surveys. On top of this Ofsted have been sending out their own survey that clashes with the timing of most providers’ induction surveys (it seems to be done in the year you will be inspected!). So yes, it is totally understandable why a good college or ILP would want to stick to their own survey model and not have the effectiveness diluted by the needs of outside bodies. It is very difficult to get really good engagement from employers in surveys, so the same applies to any external meddling when you want to get your own feedback to meet the needs of your employers.

    I once tried several years back to explain all of this to someone designing a previous funding body survey, but all they cared about was ticking their own boxes, regardless. I wonder how much the whole SFA survey machine costs and what impact it has on the quality of what is being funded?

    If I was advising the SFA about the quality and ease of understanding of their questions I would not give them very flattering feedback. For example, for a score out of 10 (great for a statistician, less so for a respondent), Q6 – ‘How satisfied or dissatisfied are you that your college/ learning provider/ the company responds to the views of learners/ apprentices and employees in training’. And what is the difference, if you were a learner, between Q5 – ‘How satisfied or dissatisfied are you that the course/ training programme is meeting your expectations?’ and Q7 – ‘Overall, how satisfied or dissatisfied are you with your college/ learning provider/ the company’s training programme?’ Such friendly language and distinctly different types of information that are being sought? Honestly, if you were a learner, how would you approach the SFA survey and what is its validity?

  2. Hot topic

    The FE Choices survey isn’t highly regarded amongst the people in my sphere, partly because it’s tainted as being a hangover from the Framework for Excellence.
    It could be better administered and delivered more effectively, but I would be concerned over the sector being held to ransom if it moved from public to private hands – you only have to look at course listings to imagine how much money could be sucked out of the sector.

  3. I agree with the above comments. Perhaps the SFA and Ofsted should get together (gasp) and ask employers and learners what the survey questions should be and then come up with 1 survey. We get asked about Learner Voice and how we respond to their views. Shouldn’t they?

  4. Dave Spart

    Your caption to the table states it shows “The five lowest scoring *college’s* in the…survey” and you quote Tameside College as saying the result “doesn’t not reflect” their learners’ real views.

    It’s not only the colleges that need to try harder!

  5. I used to work for one of the worst performing colleges in this table. The result comes as no surprise to me and the response from the college in the article just exemplifies the view and understanding of the management and disappointingly the teaching teams.