17 April 2026

Latest news from FE Week

Colleges face funding squeeze as DfE rations student growth cash again

Colleges and sixth forms in England will once again be forced to absorb the cost of rising student numbers after ministers confirmed they will not fully fund this year’s in-year growth.

The Department for Education said providers taking on additional 16 to 19 learners in 2025-26 will receive only around three-quarters of the funding expected.

This is the same approach the government took last year due to an “unprecedented” number of extra students, with officials citing pressure on budgets as demand continues to grow.

It comes a month after the DfE announced a below-inflation per-student rate rise for the next academic year, with ministers accused of breaking a promise, made in last year’s white paper, of a real-terms funding increase for 16 to 19-year-olds to ease demographic pressures.

The DfE said today: “There has been another large increase in 16 to 19 funded students this year. This growth is positive for the many young people who have been able to take up opportunities for 16 to 19 education and represents a strong response by the sector.

“However, because of the size and distribution of this growth in student numbers, it does create another year of very high in-year growth. We will fund all students through the lagged student number methodology in future allocations as normal. However, the current growth is significantly above the budget available for in-year payments, and so we cannot fully fund this growth.

“We will provide approximately three-quarters of the funding expected based on arrangements published in August 2025.”

In-year growth provides extra funding to colleges that recruit significantly more students than originally allocated, acting as an exception to lagged funding by offering a partial top-up for additional in-year costs.

The Association of Colleges estimated that colleges are currently teaching around 32,000 unfunded 16 to 19-year-olds due to the demographic bulge.

David Hughes, chief executive of the AoC, said colleges and their students are being “let down once again in today’s announcement by a dysfunctional funding system and a lack of respect which harks back to the dark days of austerity they suffered in the 2010s”.

He told FE Week: “This academic year, colleges recruited 32,000 more 16 to 19-year-old students than they were funded to and did so because they believe in the power of learning to support people in life and in work.

“Today, we learned that the government cannot even find the funding to pay around 50 per cent of the full cost for their courses. Instead, they only have sufficient funding to pay three quarters of their formula, meaning colleges will end up being part-funded at a little over a third of the full cost.”

Hughes added: “The cost of fully funding those 32,000 students would be around £220 million and the DfE formula would probably result in colleges getting half of that, around £110 million.

“But they will now get three quarters of that – around £80 million – meaning they have failed to find £30 million to fully fund their own formula. That suggests these learners and colleges are simply not viewed as high priorities, because no other part of the education system is expected to operate like this.

“At a time when the government is rightly aiming to reduce the numbers of young people not in education, training or employment (NEET), it also makes no sense. College leaders feel that their good will and strong inclusion values have been abused and I worry about what that might mean in future decisions they take when faced with unfunded students.”

Officials acknowledged that the in-year growth decision will be “disappointing” and encouraged colleges that have concerns about the impact of this change to contact their regional officials or the DfE’s customer help centre.

Providers will start to receive growth payments from July.

Speaking and listening exams are failing Gen Z’s anxious learners, it’s time for a rethink

Speaking and listening exams are rarely popular with learners. Currently, English functional skills exams are split into three parts: reading, writing, and speaking and listening. The first two are fine; the last part needs a rethink.

We are in an anxiety epidemic. Yet we ask learners to chat with a group of other students they have often never met. We ask them to take part in a group conversation lasting up to ten minutes, followed by an individual talk.

The vast majority of learners are fine with this format. But having witnessed speaking and listening exams go wrong, both online and in person, I feel that changes are needed to the current format.

Being filmed in a formal setting is not an everyday occurrence. No wonder it unnerves some learners. Offering reassurance helps; however, this cajoling has not stopped learners from ducking the exam.

I’ve seen learners not showing up on the exam day (even with lots of practice beforehand), walking out as they don’t like being filmed, feeling uncomfortable with other learners who make up the group conversation, not liking being asked to take off a baseball cap and being unable to speak in slang.

There are many negatives.  Where are the solutions?

Here are a few to help with exam preparation:

  • Give learners bullet points as prompts
  • Run practice sessions to boost confidence
  • Give learners ‘mock’ tests
  • Let them pick a subject they want to talk about in the presentation

This undoubtedly helps; however, regardless of skill level, on the exam day they are on their own.  Nerves and fears can take over.  There’s no shame in this.

Some learners enjoy creating TikTok videos with friends and family. In the comfort of environments they know well, they’re more relaxed. They’re happy to give up social time to do this. It’s a bit of fun.

Filming as an essential exam requirement is another matter. Most learners I’ve met are not enamoured with the process. It’s the albatross of the English exam, with lots of time eaten up by admin and pastoral tasks. That’s even before we get on to the filming part.

The camera can be a trigger.  What alternatives are offered?

Assuming, as a modest estimate, that 10 per cent of learners have this problem, can they all be eligible for exemptions?

Perhaps learners’ anxiety would subside if they were observed instead by an internal quality assurer (IQA) from a local organisation? This would remove them from the glare of the camera and calm fraught situations. Staff bias would be unlikely, as the visiting observer would not know the learners involved.

This would strengthen bonds between organisations, and would only be required for a minority of learners.

There are tens of thousands of learners currently not in mainstream education.  They do not sit in large classrooms with large groups of learners.  They’re likely to be taught in small groups or individually.  I fear it is this demographic who are most at risk of falling short in their speaking and listening exam.

We make adjustments for various reasons when it comes to reading and writing exams with the intention of maximising learner success.  This is right.  Why not also support those who find their speaking and listening exam daunting? Whilst we live in a digital world, we cannot expect all learners to have the same attitudes to video-recording.

We have to make the exam more inclusive for all our learners, and make sure that their individual needs are met.  With a few alterations this exam can be made even better, giving all learners the best chance of passing with reduced friction.


Providers aren’t failing by accident, they’re designed to fail

The sector has watched provider after provider collapse under a crushing ‘inadequate’ Ofsted grade and DfE contract termination. Each time, we blame poor leadership or funding cuts. As a former internal quality assurer (IQA) at a recently liquidated provider, I suspect the root cause lies deeper: a flawed organisational design.

Research shows that when structure does not match strategy, it can lead to fragmented execution. Essentially if a provider’s structure is not aligned with its strategy, it is doomed to fail from the start.

The typical response to standardising quality is to centralise control structurally. Quite often, this takes the form of a rigid functional hierarchy where quality and delivery form isolated pillars. In theory, this creates a clear chain of command and uniform compliance. In practice, it risks a fatal disconnect.

Leaders rely on vertical reporting lines rather than the ‘seams’ between departments. Research warns that having strict functional silos in the ‘middle line’ prevents horizontal communication. For quality, this blocks effective and prompt distribution of pedagogical best practices, which is fatal for standardisation.

When interacting with frontline staff from an isolated quality department, the relationship can feel artificial. Quality ceases to be collaborative, real-time coaching and instead devolves into an external, retrospective policing exercise.

This structural divide precipitates a toxic, defensive environment. It exemplifies Goodhart’s Law: when a measure becomes a target, it ceases to be a good measure. By treating quality as a separate compliance metric rather than an integrated teaching practice, providers reduce the complex art of education into a checklist exercise.

A ‘tick-box management’ approach becomes necessary to survive top-down scrutiny, and tutors are pressured to prioritise administrative paperwork over actual learner development. This risks deskilling, high turnover or worse.

In 2018, an investigation into 3aaa found that achievement rates had been artificially inflated by over 20 percentage points through manipulation of learner records. Crucially, evidence emerged that colleagues were aware of the changes and attempted to correct them, only to find them reversed. The structure made honest upward challenge impossible and led to 3aaa’s collapse after its £16.5 million government contracts were terminated.

A structural problem cannot be effectively fixed using superficial cultural interventions. Surviving the rigorous demands of regulatory frameworks requires us to dismantle these isolated pillars.

Training providers should advocate for a shift towards agile, cross-functional delivery pods under a matrix structure which creates formal reporting lines across two dimensions simultaneously – quality expertise and operational delivery. By embedding dedicated IQAs directly within respective delivery teams, the quality team are no longer external auditors ‘policing’ teaching practices, but rather collaborative partners working towards the shared goal of providing high-quality education. Functionally, IQAs retain a strict reporting line to a centralised, independent quality department, which handles summative audits and external regulatory reporting.

This creates a dual reporting line: vertically to quality and horizontally to delivery. It allows IQAs, while sitting within a delivery team, to retain the objectivity to challenge poor practice and to preserve their professional independence. Quality owns the standards, and delivery owns the relationship.

As a result, information pathways are shortened and standardised best practices are shared promptly. Importantly, it replaces tick-box management with the proactive, real-time coaching that functional silos prevent.

A matrix structure does carry risks: dual reporting lines can blur authority, embedded IQAs may feel pressure to soften findings and coordinating standards across multiple pods demands stronger oversight.

However, the alternative is worse. In functional hierarchies intervention often arrives too late, as demonstrated by 3aaa and other similar cases. The risks of coordination do not outweigh the certainty of failure ingrained within the status quo.

This approach ensures delivery has innovative freedom and quality retains its independence, whilst preventing the ‘policing’ dynamic that undermines both. We need to stop treating quality as a separate entity by dismantling these silos and instead building structures that constructively support the individuals delivering education.


Learning doesn’t fit neatly into hours, so why do we force it to?

If you’ve ever delivered the level 3 award in education and training (AET), you’ll know one thing straight away: no two learners are the same.

Some walk in with years of experience. They’ve already been training colleagues, running sessions, or mentoring staff. Others are completely new and need time to build confidence just to stand up and speak.

Both can become great teachers. But they don’t get there in the same way, or in the same time.

That’s where guided learning hours (GLH) start to feel out of touch.

GLH is meant to give structure. It tells awarding organisations and providers roughly how big a qualification is and how long it should take. In theory, that sounds reasonable. In practice, it assumes that learning is predictable.

It isn’t.

Anyone who’s spent time in a classroom will recognise that learning is far messier than that. The American educational theorist David Allen Kolb talked about learning as a cycle – doing something, reflecting on it, improving, then trying again. Some learners move through that quickly, others need time to process and build confidence. There isn’t a fixed pace.

The same applies when you look at Russian theorist Lev Vygotsky’s idea of needing the right level of support to move forward. Some learners just need a light touch, others need more guidance. That support changes the time it takes to learn.

And when you’re working with adults, American theorist Malcolm Knowles’s view is even more relevant. Adults bring experience, want learning to feel useful, and often take ownership of how they develop. That doesn’t sit comfortably with a system that assumes everyone needs the same number of hours.

You see all of this play out when delivering the AET.

Some learners grasp planning and delivery quickly. Their micro-teach is strong from the start. Others need a few attempts, not because they’re not capable, but because teaching is a skill that develops through doing it, getting feedback, and trying again.

That’s the whole point of the course.

But GLH doesn’t really allow for that difference. It quietly assumes everyone is moving at the same pace, which just isn’t the reality.

The bigger issue is how it’s used.

In some cases, there’s a heavy focus from awarding organisations on making sure learners complete a set number of hours. It becomes about ticking off time rather than asking a much more important question: can this person actually teach?

That shift matters. Because once you start prioritising hours, the learning experience changes. You end up shaping delivery around the clock rather than around the learner. People who could move faster are held back. People who need more time can feel like they’re falling behind.

Neither situation reflects what good training should look like.

It becomes even more obvious when you look at how people learn now.

A lot of development doesn’t happen in a classroom or a live session. Learners read articles, watch videos, join webinars, and talk to other professionals. The most valuable learning comes from those moments where something clicks after a conversation or a bit of reflection.

The problem is, none of that fits neatly into GLH.

How do you track the time someone spends reading a journal? Or having a discussion with another trainer online? You can’t, at least not in any meaningful way. So it often gets ignored, even though it’s a big part of how people actually develop.

That’s where the system starts to feel disconnected from reality.

The AET in particular is meant to be about the individual. It’s about helping someone find their own way of teaching, building confidence, and developing real skills they can use in front of a group.

It’s not about completing a set number of hours.

GLH still has a place. It helps give a rough structure and stops qualifications from becoming too loose. But it should be treated as a guide, not a rulebook.

What matters more is whether the learner can do the job at the end of it.

In FE, we talk a lot about learner-centred approaches. The AET is one of the clearest examples of where that should apply. If the focus shifts too far towards hours and away from outcomes, we risk losing what the qualification is actually there to do.

And that’s develop confident, capable teachers.

Because in the end, no learner remembers how many hours they spent on a course. They remember whether they felt ready to stand up and teach.

Engineering’s entry point is disappearing

Between 2017 and 2024 there was a 25 per cent reduction in the number of engineering apprenticeship starts in England. Underneath this headline statistic is the potentially more worrying one that in the same period, level 2 engineering apprenticeship starts fell by over 50 per cent. The High Value Manufacturing Catapult has been exploring this further by speaking to employers, providers, and young people who are considering their next steps. Against the backdrop of policy that is seeking to drive up apprenticeship opportunities for young people, we need to ask whether level 2 apprenticeships remain viable in engineering.

Apprenticeships are a hugely attractive progression route for young people, with employers often receiving many times more applications than they have positions for. However, there are limited numbers of entry level opportunities and employers tell us that these need to be balanced against their experienced workforce, to ensure they can continue with their core business as well as training apprentices. Just because policy is driving more funding towards lower-level apprenticeships, it doesn’t mean that employers will have the capacity to take on more entry level apprentices.

Our research further showed that employers are concerned about the risks posed by entry level apprentices. Government data shows that level 2 apprentices currently have a completion rate of 64 per cent, whereas those on higher level apprenticeships are much more likely to complete successfully. Taking on higher level apprentices is much less risky for employers, and this is shown by the net increase in all levels of engineering apprenticeships other than level 2.

The standards themselves are increasingly becoming an issue for both employers and providers. Our research suggested that the content of level 2 apprenticeships is often not technically complex enough to give apprentices the skills they need for roles in modern engineering workplaces. However, some of the hand skills delivered are still valued. Providers told us that it can often be expensive to deliver level 2 apprenticeships due to increased technical expectations and funding restrictions. One provider we spoke to explained how they roll level 2 and 3 apprenticeships together to provide a balance of hand and higher-level technical skills.

The question remained as to whether the reduction of entry level apprenticeships in engineering is an issue or whether it is symptomatic of changing technical competencies in the workforce. For example, the original battery skills framework, published in 2021, made use of the lean manufacturing operator standard at level 2 for production line staff in gigafactories. But Workforce Foresighting data published in 2025 suggested that these roles were no longer relevant and that the base level of qualification should be level 3.

Having level 3 as an industry entry point would undoubtedly freeze many young people out of engineering opportunities and would be detrimental to the provision of good career entry and progression routes. There needs to be a compromise to address these issues.

One option would be to re-balance engineering apprenticeships in light of changing industry expectations. The use of academic levels has long been unhelpful when describing apprenticeships, and addressing this through newly focused entry, intermediate, advanced and higher engineering apprenticeships would make them more relevant to employers and make progression pathways clearer. Apprenticeships are not good at providing progression, and as they are focused on skilling someone for a role, you could argue they don’t need to be. However, showing clear progression routes from entry to higher level opportunities, with newly aligned capability outcomes, would better enable career development.

Entry level apprenticeship roles in engineering are disappearing for a wide range of reasons and simply forcing funding for the training component back towards them is not enough. Employers need to see the value of entry level opportunities and a rapid route to competence in the workplace, and learners need to see that they have opportunities to gain a foothold in an engineering career.

Statistics alone are too blunt an instrument on which to base these decisions. There needs to be a fundamental and systemic change to secure the future engineering workforce.

‘Experts at hand’ cash must not plug ‘existing gaps’, councils told

Funding for a new scheme aimed at bolstering external support for young people with SEND must not be used to “fill existing gaps or replace current provision”, councils have been warned.

Town halls will also be forbidden from spending the cash on support named in children and young people’s existing education, health and care plans (EHCPs) or wider family support.

Leaders will also be expected to devise an approach that ensures support is not “disproportionately accessed” by the “most proactive schools and settings and includes out of area mainstream further education settings attended by local young people with SEND”.

As part of its white paper reforms, the government announced the creation of a new “experts at hand” service, backed with £1.8 billion in funding over three years.

The service aims to boost availability of external support. Schools and FE colleges can then draw from a pool of education and health professionals to fix the current “inconsistent and limited access” to their services.

The Department for Education has now published guidance on how the funding and an additional £200 million “transformation” pot will be allocated and how it must be spent.

Funding can’t cover support named in existing EHCPs

Councils will split £429 million this financial year and could impact nearly 390,000 16 to 19-year-olds with low prior attainment across the country.

This grant will “first and predominantly” provide cash for councils to work with integrated care boards (ICBs) to “develop and deliver a new EAH offer for mainstream education settings”.

However, the grant will also fund the administrative costs for local authorities associated with “evaluating their existing SEND support services to mainstream settings” and “developing and submitting local SEND reform plans”.

According to the document, at least 80 per cent of the cash “must be spent on EAH direct delivery for all settings, staff and their children and young people”.

No more than 10 per cent can be spent on administration costs for councils’ EAH offers, and no more than 10 per cent can be spent on “local authority transformation costs, including staff or other associated costs”.

But no funding can be used for support named in young people’s existing EHCPs, to make provision “schools can or should make themselves”, or for EHCP assessments.

The guidance states: “This funding is not intended to fill existing gaps or replace current provision, including traded services. The EAH offer should build on and enhance existing local capacity and good practice.”

‘Tilt provision to mainstream’

The guidance further sets out the government’s vision for the service, telling councils to “ensure that the EAH investment benefits all children and young people aged 0 to 25”.

Local areas should “also consider how they will develop this offer over time to ensure there is support and appropriate provision available across early years, primary, secondary, and FE settings”.

And councils have also been told to “start tilting local provision to focus on early support for mainstream education settings, so that staff are able to meet the needs of children more quickly and effectively within the setting”.

This approach “means mainstream education settings having access to expert professionals (both health and specialist education professionals) who can provide whole setting support, tailored guidance and strategic advice, as well as some group level interventions”.

The offer “should be additional to existing statutory and 1:1 support”.

No disproportionate support for out of area mainstream FE

The DfE has also said today that councils’ SEND reform plans must include a “proposed approach to settings accessing support which ensures support is not disproportionately accessed by the most proactive schools and settings and includes out of area mainstream further education settings attended by local young people with SEND”.

The grant also includes funding to establish new speech and language therapist advanced practitioners in every ICB geographical area, the DfE said.

It will also support “local reform and work with universities, education settings and local speech and language services to get more speech and language therapists working directly with children and young people”.

Councils will have to assure the DfE that funding is spent in line with the guidance.

It comes after the government pledged to write off 90 per cent of town hall SEND deficits. It will also take on the cost pressures in the system from 2028.

In its guidance today, the DfE said future support for deficits that arise between 2026 and 2028 “will take into account local authorities’ successful delivery of their approved local SEND reform plan, including appropriate use of investment to establish an EAH offer”.

‘Stepping-stone’ GCSEs risk halting social mobility progress, SMC report warns

Plans to introduce new “stepping-stone” qualifications for young people who fail GCSE English and maths risk creating a “stumbling block” for disadvantaged students, a Social Mobility Commission think-piece has warned.

The paper, authored by former Department for Education resits lead Andrew Otty, claimed the proposed level 1 “preparation” courses for resitters will trap low-attaining students at the same level they already reached at school.

The current condition of funding policy, introduced in 2014, forces 16 to 19-year-olds without a grade 4 in English and maths to continue studying the subjects. It is often criticised by colleges for driving endless resits, but Otty’s report argued the policy is producing social mobility gains.

“The 16 to 19 resit sector is currently the only educational stage where disadvantaged learners are actively catching up to their non-disadvantaged peers,” the report claimed.

Since the policy’s introduction, more than 500,000 students have successfully retaken English and 350,000 maths.

Over 245,000 of those passes came from disadvantaged students.

Data shows that disadvantaged learners have improved at a faster rate than their better-off peers.

The percentage point change between 2016-17 and 2023-24 in students achieving a GCSE in English at age 19, after not doing so at 16, was 0.54 for disadvantaged students, compared with -1.2 for their non-disadvantaged peers. Meanwhile, the change in maths achievement was 1.04 percentage points for disadvantaged students, compared with 0.08 for non-disadvantaged students.

Source: SMC

Otty’s report said this success is in stark contrast to primary and secondary education, where recent years have seen disadvantaged gaps grow.

Low overall success rates do, however, persist. The proportion of learners passing their English and maths resit was on a consistent upward trajectory until the teacher-assessed grades of 2021-22. Pass rates peaked at 35 per cent for English and 31.1 per cent for maths that year due to “more borderline attainers having been awarded grade 4 in the more generous grading” during the Covid period, the report said.

Since then, the percentage of pupils who achieved a grade 4 at age 19, having failed to do so at age 16, has fallen and hit 20.6 per cent for English and 13.5 per cent for maths in 2023-24.

Despite this, the proportion of those achieving grade 4 in English and maths in 2023-24 remains 12.7 and 5.6 percentage points higher, respectively, than in 2013-14 – the year before the policy was introduced.

The government’s plan to increase pass rates and reduce repeated exam failure, outlined in last year’s skills white paper and out for consultation until June 2, is to introduce new level 1 English and maths “preparation” qualifications for those who scored grade 2 or below. But officials have been warned the policy risks doing the opposite.

Under current regulations, students entering post-16 education with a grade 3 must retake the GCSE. Those with a grade 2 and below may work towards either a GCSE or a functional skills qualification.

Because a grade 2 already represents level 1 attainment, lower attaining students would effectively be asked to repeat the same level before being allowed to attempt a GCSE again if the “stepping-stone” courses are introduced.

The report said: “Rather than acting as a helpful ‘stepping stone’, this creates a stumbling block, impeding the progress of the students most in need of support.”

There are also fears of a two-tier system, with higher-attaining students pursuing GCSEs while their peers are channelled into lower-status alternatives.

“While the post-16 sector has successfully narrowed the disadvantage gap, these reforms threaten to institutionalise low expectations and dismantle a decade of hard-won progress for the most vulnerable learners,” the report said.

The report pointed to evidence that students themselves prefer GCSEs, seeing them as more valuable in the labour market and for further study. Diverting them onto alternative qualifications risks damaging motivation as well as outcomes.

Otty, who is also a former further education English resit teacher, told FE Week: “The condition of funding is the only education policy that actually works in closing the disadvantage gap. The proposed new stumbling block qualifications are an act of sabotage from the enemies of social mobility.”

His report said that rather than going through a process of structural overhaul, the government should “instead build on what is already working.

“This includes mandating a minimum number of resit hours, providing more structural investment for 16 to 19 English and maths resits, and capturing and disseminating the teaching practices of the top performing colleges.”

Alun Francis, chair of the Social Mobility Commission and chief executive of Blackpool and the Fylde College, said: “We are publishing this think piece to provoke debate. Many practitioners will welcome the view that a new qualification is not going to answer the problem of English and maths achievement. The FE sector is weary of persistent curriculum reform.

“There is a clear case for breaking the cycle of short-term cramming combined with endless resits. But it is a moot point whether a new qualification will really make a substantial difference.”

The DfE was approached for comment.

For the first time in forever – fund SEND properly in colleges

At Bradford College, 93 per cent of our students come from areas ranked in the two most deprived bands in England according to the Index of Multiple Deprivation, the poorest 40 per cent of communities in the country.

One in five tells us they have a disability. Many more have needs that haven’t been formally diagnosed. We also teach unaccompanied asylum-seeking young people who are learning English while trying to build a life in a new country.

When SEND is discussed nationally, those realities rarely make it into the room. In colleges serving communities like ours, they shape everything.

I’ve worked in further education for over 20 years and my current role as assistant principal for students means overseeing safeguarding, wellbeing, careers and disability services alongside teaching and assessment. Across all of that, one thing has become clear: students are arriving less ready for adulthood than they used to be.

I don’t say that to criticise young people. The world they’re growing up in has changed dramatically. Economic pressure on families plays a role. Many parents are working flat out simply to keep things going. Students entering FE now also experienced the disruption of Covid during critical years of their education. Social media has also changed the pressures young people face before they even walk through our doors.

When I was younger, if you were in your bedroom, you were safe from the outside world. That isn’t the case anymore. A young person can be bullied or see violence in their bedroom. They can experience the weight of the wider world through their phone before they’ve even started the day.

Then they arrive at college and we expect them to be ready to learn.

Quite a lot of our work is about helping students reach that point first. Staff spend time supporting learners to regulate emotions, build confidence and feel that they belong in education. Without that foundation, qualifications won’t change very much.

One thing I say often is that learning support should be built in, not bolted on. The first person supporting a student should always be their teacher. Inclusive teaching makes a huge difference to whether learners feel able to take part. The sooner staff get to know their students, the sooner they can build the relationships that help young people attend and stay engaged.

Independence matters just as much. In the past the sector often relied heavily on placing learning assistants beside students throughout lessons, almost ‘Velcroed’ to their side. That can help in the short term, but it also creates dependence. Students need to develop the skills and confidence to manage their own learning.

At Bradford we have specialists who assess students and train them to use assistive technology such as dictation software, reading tools and applications that help with organisation. Students learn how to use those tools properly so they can rely on something they control themselves, in college and beyond.

Even with strong support in place, the wider system still creates serious obstacles.

High needs place funding hasn’t been updated since 2014. That was the year after Disney’s Frozen came out. Six years later the film got a sequel – high needs funding is still waiting for one.

Meanwhile colleges are welcoming students with increasingly complex needs. Staff across the sector are doing extraordinary work, but the pressure on resources is becoming unsustainable. Something has to give, and too often it’s the students who pay the price.

Then there’s the cliff edge at 19. Our funding is tied to qualifications, not to young people. Just as a student begins to build real confidence and resilience, the money runs out. What we actually need is the flexibility to meet students where they are and support them for as long as they need, rather than rushing them through a qualification and out into a world they aren’t yet ready for.

The current SEND reform consultation contains some genuinely encouraging ideas. Having built-in support within mainstream settings so young people can attend their local college is the right direction.

But warm words aren’t enough. Reform will only succeed if the resources match the ambition.

So, ministers… for the first time in forever, fund SEND properly.


We don’t need new apprenticeship metrics, we need to use the ones we have

The release of the latest apprenticeship performance data should underline a straightforward point for both the sector and employers: training outcomes still vary widely between providers.

Yet much of the conversation in the sector focuses on how we should define and measure apprenticeship success, especially as we expand into areas like AI. Those conversations should continue. But before we redesign how we measure success, we should probably start by making consistent use of the performance data we already publish.

Completion data is one of the clearest indicators of whether apprenticeship training has actually delivered in practice. When learners reach the end of a programme, employers are far more likely to see the skills, retention and long-term workforce value they invested in. In that sense, completion is not just an education outcome; it is a return-on-investment indicator for employers.

In the education and training sector, qualification achievement rates (QAR) remain an industry standard measure of apprenticeship quality because they answer a basic but important question: how many learners actually complete the programme they started?

But when you step outside the echo chamber of our industry, awareness of QAR remains limited. Many employers don’t know where to find QAR data, how to compare it across providers, or how much weight to give it when choosing a training partner. As such, it remains far less visible and usable in the world where apprenticeship decisions are actually being made.

This becomes even more critical against the backdrop of a slower hiring market where businesses often become more selective. In response, apprenticeships are increasingly treated as long-term workforce investments; employers want reliable delivery and to know that the training they back will result in completed programmes, skills and real return.

Completion sits at the heart of that. Enrolments and course starts matter, but if a learner does not reach the end of the programme, the value of the investment looks very different.

This is where the sector needs to be more direct. We already have a baseline measure of quality, and we should be making far better use of it before we rush to dilute the conversation with any new metrics.

DfE already publishes detailed figures each year and has taken welcome steps to improve accessibility through its dashboard. But to date, there has been no simple route for an employer trying to answer an entirely practical and totally understandable question: ‘Which providers consistently get learners through to completion in the area I want to invest in?’

Instead, employers are often left navigating large data tables, interpreting dense terminology and trying to build their own comparisons from raw information. For an SME owner with limited resource trying to make a critical hiring decision, or for an already stretched HR team managing multiple recruitment and retention programmes, that is more friction than there should be around such a basic question, and it has consequences.
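To make that friction concrete, here is a minimal sketch, in Python, of the kind of comparison an employer currently has to build by hand from a downloaded achievement-rate table. It is illustrative only: the file name, column headings and the 50-leaver cohort threshold are hypothetical placeholders, not the DfE’s actual published schema or guidance.

    # Illustrative sketch only: the file name, column headings and cohort
    # threshold below are hypothetical placeholders, not the DfE's schema.
    import csv

    SUBJECT_AREA = "Engineering and Manufacturing Technologies"

    providers = []
    with open("qar_apprenticeships.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Keep providers delivering in the chosen sector subject area,
            # with a cohort large enough for the rate to be meaningful.
            if row["sector_subject_area"] != SUBJECT_AREA:
                continue
            if int(row["leavers"]) < 50:
                continue
            providers.append((row["provider_name"],
                              float(row["achievement_rate_pct"]),
                              int(row["leavers"])))

    # Rank by achievement (completion) rate, highest first.
    providers.sort(key=lambda p: p[1], reverse=True)

    for name, rate, leavers in providers[:10]:
        print(f"{rate:5.1f}%  ({leavers} leavers)  {name}")

Even a rough filter like this assumes the employer knows which file to download, what the columns mean and how big a cohort needs to be before a rate is trustworthy, which is precisely the work a stretched SME owner or HR team does not have time to do.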

This is probably why the conversation in the sector keeps returning to new ways of measuring apprenticeship success. When the most established performance data is difficult for employers to access, interpret and compare, it’s easy to assume the problem is the metric itself rather than how visible and usable it is. But new measures will not help employers make better decisions if the existing ones remain hard to use.

Poor usability makes weak outcomes easier to miss. Employers can choose providers without a clear view of delivery performance. Learners can enter programmes with lower chances of completion. Levy-funded investment can flow without enough practical visibility of likely results.

In response to this problem, we’ve built additional public resources to make this data much easier to compare. The intention is not to create new league tables or to reduce apprenticeship quality to a single number, but to make the information that already exists on QAR more visible and more usable for the employers who are expected to rely on it.

QAR remains one of the few measures that shows, at scale and objectively, whether learners complete the programmes they begin. Before the sector gets too eager to embrace newer or broader measures, it should first make sure that the most established one is visible and usable to the employers who rely on it. Apprenticeship quality data is not the problem; leaving already stretched employers to do too much of the work is.