Ofsted director of FE and skills Matthew Coffey spoke with FE Week deputy editor Chris Henwood at the launch of the education watchdog’s 2012/13 annual report.

Chris Henwood: The report mentions an initial evaluation of 16 to 19 programmes. Will that come as part of inspections, or will you be doing something separate?

Matthew Coffey: I am always looking for good value for public money in terms of my inspections, so my inspectors will be clear, when they carry out inspections, that we are also carrying out a survey to evaluate the study programme early on. Evidence that’s relevant will come into the central team, which will pull it together and do what we call ‘retrieval’: looking back at what inspection reports are saying. But equally, we will be carrying out a number of visits to providers that we perhaps aren’t going to see within an inspection cycle, and that’s generally the outstanding and good providers. I think it’s really important that we don’t base all of our evidence of how programmes are being introduced on those that just require improvement.

CH: And when can we expect the results of that evaluation to be out?

MC: Well, it’s going to be beyond the summer, because of course these programmes are very new; we are very concerned that the lessons from previous programmes are learned.

So it’s going to be after the summer when we have got sufficient evidence to be able to pull it all together.

CH: The report says there are grounds for optimism. What exactly is it then, in practical terms, that the sector has been doing right?

MC: We are seeing an awful lot more use of innovative technology. We are seeing iPads absolutely everywhere, and tremendous use of virtual learning environments, which is great for learners, particularly those with employers in isolated locations; being able to interact in that way is great.

I think there’s something else in there, though. When we changed the inspection framework in September last year, we reduced the notice period from three weeks to two days.

Teachers are telling me that has relieved them of a tremendous amount of stress, and I like to think teachers are getting the message that what we want to see is what they do day in, day out. We don’t have a preferred method of teaching, or a construct of a lesson; what we want to see is how information is imparted in a way that learners really get it — and that’s what we’re seeing more of.

CH: Ofsted chief inspector Sir Michael Wilshaw has been critical of success rates in the past, to the extent of calling them “palpable nonsense”. Has anything happened with those?

MC: Ofsted uses data that is publicly available. Success rates are a measure that is out there, well established, and people have been using them for a good number of years. So, as part of our framework, but only as part of it, we make reference to success rates. What we are really interested in is the progress that learners are making, and more and more in that “So what?” question. Where are learners going? What’s their destination? What are their chances of employment or sustained employment, or of moving into an apprenticeship? We have been critical of success rates because I think there has been an over-reliance on them, and I put Ofsted fairly and squarely in there: look at previous inspection frameworks and there has been an over-reliance. We have really called for destinations to come to the fore. We’ve got to continue to work with the government and its agencies to make sure these become a deeply embedded, accurate measure of the impact of the sector, rather than saying, “They’re not good enough,” leaving them and reverting to type, which is success rates.

CH: And how do you see destination data – assuming it can be robust and verifiable – taking a role in Ofsted inspections? Does it have its own section in reports, for example?

MC: Well, in outcomes for learners anyway we make reference to the use of destination data, and I guess we future-proofed the framework because we have been calling for destination data for a long time. And we know that some of the best providers out there — the good and outstanding providers — have their own destination measure collection techniques anyway.

And I know that the leaver codes on individual learner records have never been a reliable source of information. I think they get filled in about 20 per cent of the time, and they only capture a ‘where are you going to go tomorrow’ rather than a robust ‘where have you been’. The good and outstanding providers do follow up with individual learners, and it’s an arduous task. I think destination data is going to be a game-changer for schools, and a game-changer for colleges and independent learning providers.

CH: The report mentions Local Enterprise Partnership (Lep) involvement. Do you have any recommendations to make that happen?

MC: I think there is a challenge for government in the annual report: there is now a need for clarity about the role and purpose of the Leps. Leps have had a huge amount of expectation loaded on them in the last year, and it’s now important that we start to have some clarity about what their role is going to be and how they are going to relate to FE and skills providers. We laid out a challenge in our accountability report that colleges needed to get on the boards of Leps, and I’m getting some reports that this is beginning to happen. In the Ofsted regional structure, every time I meet with local authorities I talk about their role with the Lep and how they can support and facilitate it, and, speaking for the South East region, I know that my senior team has been out and met every Lep that’s out there. So we’ve got a role to play in bringing the two sides together.

I think the final thing to say about how we might play a role to facilitate this even further is that in April I’m going to launch the Governors’ Dashboard [Data Dashboard] for further education and skills, and in that, there is a section that lays out the curriculum areas that learners are sat in, in that particular institution, against the local Lep strategic priority areas. And that will enable a conversation to happen between governors, or trustees, or those in charge of teaching and learning to say, “Why does our curriculum not match?” or, “It does”.

CH: The report mentioned employer providers last year and was quite positive about them, but their recent inspection results haven’t been very good. Bearing this in mind, do you have any concerns about a move towards greater employer involvement in the FE and skills agenda?

MC: I don’t see it as threatening that some employers may well be getting themselves together, taking public money, and engaging with providers and experts in their field in the delivery of training — it’s been happening for a very, very long time.

Not everybody is performing at the highest level at this moment in time, but there is something quite unique about the independent learning provider sector: action generally gets taken very, very quickly when there is inadequacy. So there are fewer inadequate providers, because they lose their contract. They tend to get one shot at this, and if we come along and say it’s not good enough, they don’t get another shot. I’m very cautious about subcontracting. I think the minimum contracting priorities of old led to some difficult decisions: you had some very small, employer-based providers delivering very high-quality apprenticeships, but because they didn’t reach a threshold they were, in many cases, sucked into a large vacuum of nothingness, and I think we should avoid that at all costs.

CH: Some might say employer providers don’t necessarily improve quickly. Look at G4S: it got an inadequate grade, having got the same grade at the previous inspection six years before.

MC: I don’t want to get into individual providers, but it’s probably worth clarifying what I said in relation to independent learning providers: when we come and make our judgements, generally the funding bodies don’t hang around and wait for these providers to get better; they take a decision.

Now, that kind of decision-making process is entirely up to the Skills Funding Agency, but in the main, what we have seen historically is that they tend to lose their contracts. I’m sure the agency continues to work with individual providers where it feels there’s a niche and the capacity to make that improvement. To support that even further, from September of this year we have introduced a system of more frequent monitoring of inadequate providers, whether that’s a college, an employer or an independent learning provider. So when we judge a provider to be inadequate, we’ll come back very, very quickly, and everyone will be able to see publicly whether it is making the progress it needs to make in order to get out of that category very quickly.

CH: The report mentions big colleges falling into the inadequate category (Liverpool, Bristol and Coventry) and the theme seems to be that they’re big city colleges. Previously, Ofsted said there was an issue in London, which FE Week covered. So is there a problem with big city colleges?

MC: In response to last year’s annual report, Ofsted and the Association of Colleges worked together to pull together a network of urban colleges (it wasn’t just London, but the majority clearly were London colleges) to look at common issues, such as attendance, and to share good practice. The group was led by us to start off with but is now very self-sufficient.

So we did an action learning study, which we published via the AoC, and what we have started to generate is a real culture where the providers will get together and say, “This is a particular problem,” and somebody will say, “It is, or it was for us, and this is how we solved it.” People are going and visiting each other’s colleges. Where the challenges are similar for everybody, the ability to get together and see how a solution might evolve is really helpful. This was a very good example of where sharing good practice via Ofsted, I think, created a real change in culture and attitudes.

CH: So is there an issue for Ofsted with big city colleges?

MC: There is a concern, always, about individual and large institutions that fail.

CH: The report says most apprenticeships are going to people older than 25, and for 16 to 18-year-olds it’s just not really taking off. Does Ofsted have any recommendations to deal with that issue?

MC: Yes, I think there are a number of things in that. First of all, we’re reporting the facts as they are. Secondly, I’m disappointed in one respect and encouraged in another: 63 per cent of all applications for apprenticeships are coming from young people. That tells me the message is getting out there. But we shouldn’t kid ourselves; there are many barriers to this. It’s not about the quality of education, it’s often about the quality of careers advice. If you look at our careers report, you will see that 70 per cent of parents of secondary school children wanted them to go down the traditional route of A-levels and university, and that’s about knowledge and understanding. So the fact that 63 per cent of applications come from young people tells me we’re getting over some of that, and that’s really, really important. The challenge back to employers is: get yourselves on the governing boards of schools and colleges so that you can influence at that level. So there are a whole number of things. It’s very early days for the traineeship, but I think it is a very welcome innovation that can help to bridge that gap and give more young people the confidence and preparation to take on an apprenticeship. Hopefully over time we are going to see that change, which is why we are going to focus on it next year.

CH: Should all 16 to 18 provision be directly aligned with local skills needs, or is there space for delivering what that age group says it wants?

MC: I think we have got a very learner-led curriculum at the moment, and while freedom of choice is absolutely right, people need clarity about what their career options are and the likelihood of getting a sustained job.

How many of us wanted to be a pilot of 707s when we were young? But you just need to understand… “That’s great, but we’re not going to stick you on a course to be a pilot because there aren’t enough jobs, and you ain’t ever going to make it, Matthew.”

So we need that level of discourse, and the reality.

CH: How are independent learning providers going to develop key measures of impact, and how can you make judgements on provision without that?

MC: I see the same destination measure as being equitable for schools, colleges and independent learning providers; it’s as relevant for independent learning providers as it is for anybody else. You might argue they’ve got an advantage, in that they are generally working more directly with employers and they’ve got apprentices who have got a job. And again, last year, in our report on the quality of apprenticeships, there was a challenge to all providers. I’m not singling independent learning providers out; I am talking about apprenticeships. This year’s annual report talks about 9 per cent inadequacy in apprenticeships, and this is about not filtering out the employers that really are looking for perhaps inexpensive labour.