Mick Fletcher, visiting research fellow at the Institute of Education and member of the Policy Consortium, casts a critical eye over Ofsted’s new Learner View website.

Ofsted has officially launched its Learner View website, an online mechanism for collecting feedback from students about the institutions they attend.

The Daily Telegraph was predictably delighted. Under the strapline “teenagers will be able to shop sub-standard colleges,” it explained how the TripAdvisor-style service could be used by the education watchdog in making judgments about an institution.

No doubt the move will be welcomed with varying degrees of caution in the sector. Listening to the learner voice is in keeping with the zeitgeist, and no one wants to oppose greater transparency.

More importantly, for an institutional leader, contradicting Ofsted in public is a little like challenging Robespierre in his prime. It is all the more important, therefore, for those not on the payroll or under threat of no-notice inspection to point out the serious flaws.


Firstly, it is not clear how many completed questionnaires will be needed before Ofsted publishes responses for an institution; the website does not say what the minimum is. Ofsted may be waiting to see how many people use the service. A higher threshold would clearly give the results more credibility, but equally it would look bad if few institutions reached it.

It is not clear whether the threshold is linked in any way to the size of the student population. Ten returns from a provider with 40 apprentices would be an impressive 25 per cent response rate; twice as many from a college with 20,000 enrolments would represent just 0.1 per cent of learners and be meaningless. So will Ofsted report the size of the student cohort alongside the responses to give context to the comments?

Crucially, there seems to be no way of knowing how representative a sample the respondents are.

According to the Learner View FAQs, respondents must give an email address and a password, and they may report on up to three providers. They do not appear to give details of the course they attend or their personal characteristics, or even whether they have finished the course, are halfway through it or have only just begun. Without such data, it is difficult to begin to interpret the significance of the results.

One thing we can be clear about, however, is that those filling in the form will not be a random cross-section of the student population. Those with a real or imagined grievance will be over-represented, as will, in some cases, those encouraged to give favourable responses. At worst it will become another dodgy statistic.

The lack of course-level information and analysis is one of the major flaws in the scheme. For most students, particularly part-time ones, it is the course rather than the institution that matters, and most of the questions they will be asked relate to the course. For a prospective student, it’s a bit like a general TripAdvisor rating for, say, restaurants in Madrid or accommodation in Bangkok, rather than a review of the specific place you might actually visit.

Why, then, with all these problems, might Ofsted be doing this, particularly when the well-established Learner Satisfaction Survey does a similar job far more professionally?

The suspicion hanging over Ofsted has to be that it is simply attempting to curry favour with its political masters. Their view that the market is best is so deeply ingrained that the public sector is regularly visited with caricatures of market mechanisms. Think, for example, of the endless stream of accounts and vouchers purporting to empower the user, or information on choices that no one asked for and few use.

The truth, which even Michael Gove concedes in relation to exam boards, is that the market doesn’t always work in education. Education is not a commodity that is bought, but a process in which you engage.

Learning requires effort and commitment from learners as well as teachers. To judge it as you would a weekend in Benidorm is deeply demeaning.

Your thoughts


One comment

  1. Mick – I share your cautions, though I wouldn’t want to dismiss survey-based feedback out of hand. When I was actually in a college, we used such surveys a lot and got quite balanced responses (with the odd loony outlier, of course). We continue to do student surveys (for research purposes) and these certainly look like a normal distribution of opinion.

    As you suggest, the important question is the weight attached to this evidence in comparison with everything else. I think I’d be urging colleges to survey their own students extensively to provide a more relevant and statistically valid counterweight to the potentially nut-job-dominated contributions on a public website.