9 April 2026

Latest news from FE Week

Ofsted’s five-day rule is only the beginning

During Ofsted’s Big Listen, FIN campaigned strongly for a consistent five-day notice period for all provider types prior to inspection, not just for larger or more complex organisations. This reform has reshaped inspection preparation. The extended notice window, now widely referred to as the planning week, is proving to be a positive development. But it has not yet been consistently applied by inspectors.

Because FIN supports members before, during and after inspection, we are observing how different lead inspectors are interpreting and utilising the five days. Providers are typically offered three planning calls.

The first call, held shortly after notification, focuses on confirming provision details, reviewing QAR (qualification achievement rates) and discussing the data held by Ofsted. Increasingly, inspectors reference Further Education and Skills Inspection Tool (FESIT) data at this early stage. The data helps inspectors understand provider performance, shape conversations during pre-inspection calls and identify focus areas for inspection. It also includes the ‘schools disadvantage’ indicator, which primarily identifies students or apprentices who may face socio-economic barriers to learning.

FIN members routinely pass on inspectors’ names to us after the initial meeting. Drawing on our inspection intelligence database, developed through years of direct inspection support, we can provide nominees with contextual insight into inspectors’ previous inspection activity under both the former and revised frameworks.

While recognising every provider is different and Ofsted takes each provider’s context into account, this information can be invaluable in identifying trends and patterns. This enables leaders to anticipate lines of inquiry and understand professional tendencies, thereby strengthening preparation without compromising authenticity.

Our experience across multiple live inspections shows variation in how the second and third planning calls are conducted. Some lead inspectors separate them; others combine them later in the week. This flexibility can work to a provider’s advantage, but only if used strategically. And be clear that the planning week is not administrative breathing space; it is part of the inspection evidence base.

Cross-provider analysis is allowing us to identify patterns early, translate them into practical guidance and support nominees in navigating the nuances of planning week decisions. Several themes are emerging:

Data accuracy matters

FESIT information should be accessed, scrutinised and reviewed regularly before the planning week. Errors do occur, and correcting them takes time. Providers who leave this unchecked until inspection risk discussions shaped around inaccurate assumptions.

Scheduling is strategic

Inspection remains a snapshot. Decisions about which learners, employers and sites are visited influence the evidence inspectors gather. Shift patterns, personal protective equipment requirements and safeguarding considerations require careful coordination. A mid-week planning discussion during the five days often enables providers to confirm availability and propose appropriate alternatives where necessary. FIN has seen stronger inspection weeks where nominees have confidently shaped this dialogue.

Inclusion must be evidenced with rigour

It is now commonly understood that the revised framework places heightened emphasis on inclusion. Many providers deliver exceptional support; however, inspectors are probing beyond narrative. They seek structured identification processes, measurable impact and systematic evaluation. Through inspection support work, FIN has observed that providers who align FESIT indicators, internal tracking and intervention records present a far more coherent story.

Know your USP

In planning week, a provider must be confident about the context in which they operate and their reasons for being there. They should highlight to inspectors the communities and employers they serve.

It is also evident that inspectors themselves are adapting to the secure-fit model. Variability in approach reflects this transition. In this context, nominees should feel professionally confident in asking questions, clarifying expectations and requesting adjustments that safeguard learner experience. Constructive challenge strengthens inspection integrity.

Crucially, providers should reflect well in advance: if the inspection were to take place next week, who would you want inspectors to see, and why? Which curriculum areas best exemplify intent and impact? While the final schedule rests with the lead inspector, many are open to well-reasoned suggestions. Without prior strategic thought, opportunities to showcase strengths may be lost.

The five-day notice period was a hard-won reform. Its value lies not in the extra days alone, but in how effectively they are used. A purposeful, well-informed planning week establishes the conditions for a focused and productive inspection – one that accurately reflects the quality, integrity and ambition of a provider’s work.

Colleges are losing students in the system

Every September, colleges register their enrolment numbers and someone, somewhere, is quietly disappointed. Not catastrophically – just a few dozen short of forecast. The same the year before. And the year before that.

The assumption is usually that demand was not there. But increasingly, I suspect the problem is different: the students applied, and then disappeared.

I have spent the last two years speaking to further education colleges about their student recruitment systems – listening to teams who are tracking hundreds of live applications across shared spreadsheets, chasing responses by hand, trying to piece together a picture of where things stand from inboxes and printouts. What I have found is an uncomfortable truth: FE recruitment is running on infrastructure that was never designed for the volume or complexity it now faces.

And the timing could not be more critical. The 16 to 18 population surge that caught the college system unprepared is expected to continue until 2028, when the number of 16- to 18-year-olds is expected to peak. At the same time, DfE needs T Level enrolments to more than double, with just over 27,000 starts this year already falling short of revised targets, and an ambition of 66,100 by 2029.

The skills white paper spelt out plans that young people who leave school at 16 without an education or training place will be auto-enrolled into provision. I believe this will be challenging to achieve under current registration systems.

FE Week recently found that councils blamed “data recording pressures” for statistics that suggest thousands of teenagers are not being offered a “guaranteed” place in post-16 education each year after leaving school without a plan.

And FE Week also recently highlighted how the lagged funding model means colleges must absorb the cost of surging student numbers upfront, with no guarantee of in-year growth funding to cover it. Every applicant lost to a broken process is one they cannot afford to lose.

Unlike secondary school recruitment – where local authorities coordinate a digital, trackable process – and unlike higher education, which has UCAS as a central nervous system, FE sits awkwardly in between. Colleges are largely on their own, and what that looks like in practice is a patchwork of disconnected systems that were never designed to talk to each other. An email platform here. An events tool that doesn’t talk to the CRM. An application form that feeds into a spreadsheet. Open days promoted through one system, follow-ups and offers sent through another. At every handover between systems – and there are many – there is a gap where a student can quietly disappear. And because no single tool holds the full picture, nobody sees it happen.

This is not a criticism of the people doing the work. Recruitment teams in FE colleges are often small, under-resourced and working incredibly hard. The problem is structural. There is no common standard, no shared visibility, and almost no usable data. If you cannot see where students are dropping out of your recruitment funnel, you cannot fix it. You do not know whether your offer acceptance rate is 60 per cent or 90 per cent. You do not know whether certain programmes are consistently losing students at a higher rate than others. You are, in many cases, flying blind.
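The funnel-visibility point above can be made concrete with a small sketch. This is purely illustrative: the stage names and counts below are invented for the example, not drawn from any college’s data, but they show the kind of stage-by-stage conversion report that becomes possible once applications are tracked in one place.

```python
# Hypothetical recruitment funnel: stage names and counts are illustrative.
funnel = [
    ("Applied", 1200),
    ("Interviewed", 950),
    ("Offered", 800),
    ("Accepted", 560),
    ("Enrolled", 470),
]

def conversion_report(stages):
    """Return (stage, count, % converted from the previous stage) per step."""
    report = []
    prev = None
    for name, count in stages:
        rate = None if prev is None else round(100 * count / prev, 1)
        report.append((name, count, rate))
        prev = count
    return report

for name, count, rate in conversion_report(funnel):
    label = "-" if rate is None else f"{rate}%"
    print(f"{name:<12}{count:>6}  {label}")
```

With the invented figures above, the offer acceptance rate falls straight out of the data (560 of 800 offers, i.e. 70 per cent), and a stage with an unusually low conversion rate stands out immediately – exactly the visibility the article argues is missing when the same information is scattered across spreadsheets and inboxes.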

FE is the missing partner in social prescribing

On Social Prescribing Day today, the national conversation focuses on the vital role of the voluntary and community sector in supporting health and wellbeing. But there’s a critical gap that risks limiting its impact, just as the NHS seeks to redefine prevention.

The government’s health mission aims to create a fairer Britain where people live well for longer, recognising that health is shaped by more than clinical care. Tackling inequalities requires collaboration across sectors, including organisations like WM College, London, embedded in the communities most affected.

Social prescribing reflects this shift.

It recognises that loneliness, low confidence, poor mental health and economic inactivity cannot be solved through medical intervention alone. They require sustained, community-based responses that rebuild confidence, connection and purpose.

FE colleges already deliver this every day, at scale. At WM College, we see this first-hand: adults referred through wellbeing networks rebuilding confidence through creative learning, often taking their first steps back into structured activity after periods of isolation.

So why do colleges remain largely absent from the systems designed to deliver health and wellbeing?

At a time of unprecedented NHS pressure, this is no longer a minor oversight.

The NHS 10-year plan places prevention and neighbourhood health at its core, with Integrated Care Systems (ICSs) tasked with shifting from reactive care to earlier intervention and joined-up provision.

But intent alone will not deliver this shift.

Social prescribing pathways rely heavily on the voluntary sector, which is vital but often constrained by short-term funding. If social prescribing is to operate at scale, the system must draw on existing infrastructure.

Colleges are exactly that.

They are anchor institutions with the space, workforce and reach to deliver high-quality provision – already engaging many of the individuals social prescribing is designed to support.

For many, entering a college is not seen as a health intervention, but as a positive step forward. That distinction reduces stigma and enables re-engagement – particularly important given rising economic inactivity.

Social prescribing helps stabilise wellbeing. But too often it lacks a clear onward pathway, meaning impact can plateau.

FE provides the missing link connecting wellbeing to skills, confidence and employment.

NHS England recognises that people with little or no English are more likely to experience poorer health outcomes. Only 65 per cent of people who could not speak English reported good health in the 2011 census, compared to 88 per cent of those who spoke English well. They’re more likely to face healthcare inequalities, including significant barriers and delays in receiving care.

ESOL classes do more than teach language – they build belonging, reduce isolation and prevent mental ill-health. Yet this impact is rarely funded through social prescribing.

As Integrated Care Boards shift toward strategic commissioning for population health outcomes, excluding FE is a failure to align delivery with policy intent.

Where partnerships do exist, they’re often fragile – reliant on individuals and vulnerable to bureaucracy and staff turnover – rather than embedded in system design.

Integrating FE fully into social prescribing would mean recognising colleges as core partners within neighbourhood health models, embedding them in commissioning frameworks and aligning health, skills and employment funding to support joined-up outcomes.

Funding remains a critical barrier. When patients cannot afford medicines, the NHS subsidises them. The same principle should apply to social prescriptions. If a GP refers someone to a wellbeing activity, colleges are often expected to provide it free, absorbing the cost. A prescription is a prescription – and it should be funded as such.

The need for preventative support is growing, and national conversations – including the ongoing inquiry by the All-Party Parliamentary Group for Further Education and Lifelong Learning – increasingly recognise the role of adult education in driving health, wellbeing and economic resilience.

These agendas are converging. But policy has not yet caught up with practice.

If we fail to act, we risk maintaining a fragmented system where health, education and employment operate in parallel – missing a critical opportunity for individuals, communities and the public purse.

Further education is not simply a pathway to skills. It is part of the UK’s health infrastructure.

And it is time our systems were designed to reflect that.

A night in the cells set students free to play and make connections

When my students arrived at Shrewsbury Prison for our overnight residential, they were singing and dancing on the coach.

An hour later they were in prison jumpsuits, choosing their cells for the night.

Each cell contained nothing but a metal bunk. The heavy doors clanged shut. The reality of the environment settled in.

The aim of the trip was simple: to bring criminology and psychology to life. Students took part in escape-room challenges across A Wing, explored the history of punishment through a guided tour, and debated the realities of prison life in the same corridors where inmates once walked.

But the most interesting learning moment of the trip did not happen during the formal activities. It happened later that evening when the structure faded away.

Learners from different classes, many of whom had barely spoken to one another before, began playing games. Hide and seek echoed down Victorian corridors. Small groups gathered on landings talking and laughing together in what could only be described as impromptu circle time.

At first glance, it looked like chaos. In reality, it was something much more valuable: play.

Resilience and independence

In education we often associate play with early years. Somewhere around the age of 11 we quietly remove it from the learning environment and replace it with seriousness, structure and assessment. But teenagers still need play.

Psychologist Peter Gray has long argued that play is one of the primary ways young people develop social competence, resilience and independence. Through playful interaction, young people practise negotiation, cooperation and emotional regulation in ways that structured activities rarely replicate.

In other words, play is not a distraction from development, it is part of the mechanism that drives it. What I witnessed in that prison wing was exactly that process happening.

Students who normally sit in separate classrooms were suddenly mixing naturally. Conversations were flowing. New friendships were forming. Students who can sometimes be quiet or withdrawn in lessons were confidently participating in group games. The environment had changed, and with it the dynamics between students.

A prison, a space historically designed for isolation and control, had unexpectedly become a place where connection and community flourished.

Of course, the trip had taken weeks of planning, risk assessments and organisation. Experiential learning always does. But moments like that remind me why it matters.

Further education students are navigating one of the most complex periods of their lives. Many are managing anxiety, uncertainty about the future and the pressures of adulthood arriving fast. In that context, opportunities for genuine social connection are not just enjoyable, they are developmentally important.

Play allows teenagers to experiment socially without the pressure of performance or assessment. It gives them a space to negotiate friendships, develop confidence and regulate emotions in ways that traditional classroom environments often struggle to facilitate. And perhaps most importantly, it reminds them that learning environments can be human.

Bringing people together

When we reflected on the trip afterwards, several learners said they would happily do it again.

What stood out was how aware they were of the way the evening had unfolded socially. Some even complimented classmates for starting the games that brought everyone together.

What began as hide and seek soon evolved into sardines, with students squeezing into the same hiding places and encouraging others to join in.

Perhaps most importantly, some of the younger learners said they now felt they had someone in the year above they could approach if they needed support. In a single evening, students who had barely spoken before had created connections that may last far beyond the trip itself.

As educators we often talk about engagement strategies, retention and wellbeing initiatives. Sometimes the solution is simpler than we think.

Give young people a space where they feel safe enough to laugh, explore and play together and learning will follow.

That night in Shrewsbury Prison certainly proved it.

Colleges could pay teachers more but are choosing not to

The pay gap between school and college teachers has reached its widest level in 15 years.

But we rarely hear of similar pay gaps for college CEOs, senior leaders, professional staff or other college employees.

Do we pay teaching staff less simply because we can? And could we do something about it if we wished?

In addition to funding, there are four reasons for the gap:

School teachers are graduates, college teachers often aren’t

Almost every schoolteacher is a graduate, compared to about half of college teachers. The UK graduate earnings premium has fallen sharply but is still about 45 per cent. Colleges sell higher education courses as a way of increasing earnings, so surely we would expect schoolteachers to earn more? 

Sixth form college teaching is a graduate profession and a comparison of staff costs with general FE colleges is instructive. 

The audited finance record for 2023-24 shows median sixth form college teacher pay (including pensions) at £64,128 is 25 per cent higher than for general FE colleges. There is also less variability in sixth form college figures, suggesting they match schoolteacher salaries.

Type of college | Median teacher pay cost/teacher FTE | Upper quartile teacher pay cost/teacher FTE
Sixth form colleges | £64,128 | £66,471
General FE colleges | £51,163 | £55,372
Land-based colleges | £49,185 | £55,943

College teachers have fewer options to increase pay

While sixth form college teachers could all work in schools, that is not true of college teachers. However, it is argued college teachers can always go back to industry. 

But the latest Office for National Statistics data shows FE teaching pays better than equivalent roles in leisure, sport, business, arts, hair and beauty, and even some areas of construction.  Only areas like finance, engineering and IT offer a better financial deal.

Colleges do not discriminate in favour of teachers like schools do

Colleges are scrupulously fair when it comes to pay rises. Our negotiating machinery treats all college staff (other than senior leaders) in the same way.

But the government treats schoolteachers as a special case with their own pay review body.  Unsurprisingly, they get bigger pay awards and better progression.

Colleges divert resources to non-teaching activity

In recent years there’s been an uplift in employer and business development-related activity in colleges, including things like sponsorship of events. Our most expensive staff spend much of their time doing this. 

But the funding model of schools and colleges is identical – funding follows the learner. Trust CEOs, school heads and sixth form college leaders don’t spend as much time on such activities. 

And the latest finance record shows this increase in employer activity has been accompanied by drops in related student numbers in the last three years.

HE numbers are down 30 per cent, 16-18 apprentices fell 3 per cent, adult apprentices dropped 20 per cent and adult students declined by 1 per cent. 

In contrast, the boom in 16-18 student numbers means colleges increasingly resemble schools in terms of their student population and levels of study. This presents real questions about the effectiveness of employer engagement.

To close the gap, here are some suggestions for an FE teacher pay policy:

  • Take teachers out of national negotiations and commit to at least matching the rises given to schoolteachers, and to senior teams, even if that leaves less available for other staff.
  • Use audited finance record data for average FTE teacher cost (which is surely a more important league table than CEO pay) to put pressure on colleges well adrift from the median.
  • Push governing bodies and senior teams to scrutinise their spending on employer engagement and sponsorship activities to ensure it’s worth the investment.

If we aren’t that serious about this issue, then let’s be honest with our teachers and say we’ll let government funding and the labour market dictate salary levels instead.

‘Computer says no’ is a huge concern among young people

AI is increasingly shaping how recruitment works. CharityJob’s latest research shows young people are particularly concerned about its impact on fairness, transparency and access to opportunity.

Somewhere between clicking ‘apply’ and receiving a rejection, a growing number of people are disappearing from the recruitment process altogether. There’s no feedback. No interview. No opportunity to explain who they are or what they could become.

CharityJob’s latest research into AI and recruitment raises some uncomfortable questions about how hiring is changing. Artificial intelligence is often framed as a tool that makes recruitment faster and fairer. In reality, I often wonder whether, for many young jobseekers, it could be doing the opposite.

Young people are the most uneasy about AI’s growing role in recruitment. Those aged 24 and under are consistently more concerned than older candidates about its impact on job opportunities and fairness. They are also the most likely to say they would rather a recruiter reviewed their application than an algorithm.

That matters enormously for young people who are not in education, employment or training (NEET). This group includes 16 to 24-year-olds navigating disrupted education, caring responsibilities, health challenges or periods of unemployment. Many are actively engaging with further education, training providers and employability programmes to get back on track. There is a risk that younger people, particularly those already facing disruption, feel this uncertainty most acutely.

Yet CharityJob’s research suggests even recruiters themselves are uneasy about this approach. Only around three in ten recruiters currently use AI in hiring. And just one in five say they trust AI’s recommendations. The majority oppose its use in final hiring decisions. The research also shows that nearly two thirds of candidates would feel disadvantaged if AI were used to screen their application, rising further among younger respondents. At the same time, almost seven in ten candidates say it has become harder to stand out as more applicants use AI to optimise CVs and cover letters.

For young jobseekers, this creates a double bind. They are encouraged to show motivation, transferable skills and individuality, yet are competing in a system that they perceive to be increasingly rewarding standardisation and pattern-matching.

Recruiters themselves are not convinced that AI improves fairness. Fewer than one in four believe it makes recruitment fairer. And confidence in its ability to reduce bias has fallen sharply over the past year. Concerns about transparency, overreliance and the risk of overlooking strong candidates remain widespread. 

According to our research, AI use appears to be focused on driving efficiencies in things like scheduling interviews and sending bulk responses, rather than at the early screening stage, where decisions matter most.

While AI use is highest among candidates aged 25-49, younger candidates under 24 stand out not for heavier usage, but for significantly higher levels of concern about AI’s impact on fairness and job opportunities. For under-24s, AI already feels like a threat rather than an opportunity: 86 per cent are concerned about its future, 91 per cent want a human recruiter, not an algorithm, reviewing their application, and 67 per cent believe AI is reducing job opportunities. What they are asking for instead is transparency, proportionality and human judgement at the points that matter most.

Yet transparency is exactly where the system is currently falling short. More than nine in ten candidates believe recruiters should be open if they are using AI to assess applications, but most say they have never been told whether this is happening. On the employer side, the picture is just as concerning, with nearly eight in ten recruiters admitting they have no formal guidelines on the use of AI in recruitment and offering little or no guidance to candidates.

In the absence of clear policies, young jobseekers are left guessing. They don’t know when AI is being used, how decisions are made or whether their skills are being assessed fairly. If we are serious about widening access to work for young jobseekers, AI cannot be allowed to become another invisible barrier. Young people don’t need the odds stacked further against them. They need a fair shot and someone, preferably an actual human somewhere in the process, willing to actually look.

‘Meets expectations’ isn’t good enough if old mindset persists

When Ofsted confirmed the removal of the overall effectiveness grade, I welcomed the decision. For years, that single-word judgement dominated headlines, banners, leadership discussions and tender requirements.

It was a blunt instrument that consumed unnecessary deliberation time during inspection and likely skewed decision-making. Its absence should, in theory, take some of the heat out of inspection.

But removing the overall grade does not remove the culture that grew around it.

A new challenge is emerging as inspections under the revised framework take place: a branding problem with the grading scale.

Under the previous framework, the centre of the bell curve of inspection outcomes broadly sat above ‘good’. Sector shorthand evolved accordingly. Leaders aimed for at least ‘good’. Governors asked whether provision was ‘good’. Staff knew where they stood if feedback suggested they were working at a ‘good’ standard.

Under the revised framework, that centre sits above ‘meets expectations’. That is the structural shift Ofsted has made. Even so, more ‘needs attention’ grades are being awarded than ‘requires improvement’ were previously, as Ofsted has highlighted.

Yet in many organisations, I’ve noticed a translation happening. People mentally convert the new scale back into the old one. ‘Strong’ becomes the new ‘good’. ‘Meets expectations’ is interpreted as something less than acceptable.

That is not what the framework intends, and I’d argue it’s within our gift to address this rather than wait for an unlikely change in the framework.

Culture takes years to unravel

This matters because any inspection system brings both intended and unintended consequences. When grades were removed from teaching observations, the intention was to reflect extensive research showing both the damage caused by grading individual lessons and the lack of validity in judging teaching quality from a single observation.

Yet the culture built around grading lessons took years to unravel. Even where providers stopped using grades, teachers still asked the same question at the end of feedback conversations: “But what grade would that have been?”

The culture did not disappear simply because the system changed – something social scientists call cultural lag.

The revised inspection grades risk following the same path. Even without an overall effectiveness judgement, organisations can recreate the same pressure if they interpret the framework through the lens of the old system.

The phrase ‘meets expectations’ can sound modest. In everyday language it may imply adequacy rather than strength. But within the inspection model, it means something quite different.

It describes provision that is working as it should. Learners are benefiting. Systems are functioning. Standards are being delivered. While ambition for improvement is essential, it should not come at the expense of balance.

‘Meeting expectations’ can, and should, be regarded as success. If you have been through an inspection recently, you’ll know it’s not easy to hit every indicator in the toolkit, and I don’t think we should shy away from calling it a checklist with nuance.

Perhaps the issue, therefore, is not the wording itself but the narrative attached to it.

This matters particularly for governing boards or senior leaders further removed from the mechanics of Ofsted inspection, such as employer-providers or universities.

Understanding recalibration

Without clear explanation, some will understandably anchor the new scale to the old one. And that risks recreating the high-stakes pressure that the reforms intended to reduce.

Those furthest from the inspection process, yet accountable for outcomes, need support from leaders to understand the recalibration. The most common outcome now is ‘meets expectations’. That’s where the bell curve sits.

The revised framework’s success may depend less on Ofsted and more on how the sector chooses to use it. We can avoid the public flogging that poorer overall grades once triggered, normalise honest conversations where attention is needed, and celebrate meeting expectations as well as exceeding them.

Leaders have an opportunity to shape the narrative. That means being clear, internally and externally, about what the new grades represent. It means briefing governors and employers carefully.

The framework has shifted. The culture needs time to catch up.

We must measure what matters in this new era of AI upskilling

In the three years since generative AI entered the public consciousness, it has moved faster than any technological shift in our history. In 2025 the skills and education sector had to grapple with how to equip people for that shift, leading to a year of profound tension.

We were caught between an economy sprinting toward an AI-driven future and a regulatory system that moves at a more measured pace.

Skills England in its AI skills for the UK workforce report characterised it as “slow curriculum responsiveness to emerging AI tools and sector-specific needs”. 

The Department for Education and Skills England have now made admirable progress. By launching a dedicated level 4 AI apprenticeship standard, committing to a faster approvals process and signalling the start of shorter ‘apprenticeship units’ from next month, the government is paving the path that employers and providers have been walking for months. 

The key question now is how to measure the quality and impact of these skills programmes.

Existing measures that merely tell us whether a learner passes a programme do not adequately capture the value delivered. They do not tell us whether the government’s aims on AI skills have been met, nor do they demonstrate to employers the return on their investment.

AI’s rapid growth means we must keep pace with best practice in measuring successful outcomes, just as we’ve broadened the scope of what an apprenticeship can be. 

Bridging the innovation-regulation gap

By the time the dedicated AI and automation apprenticeship standard fully enters the market, it will have been 3.5 years since the launch of ChatGPT. 

UK businesses couldn’t wait that long. So providers innovated within the system we had. At Multiverse we integrated AI training into relevant existing standards, like business analyst.

Broadly, it worked: we’ve equipped thousands of people with the skills to harness this powerful technology. Those skills have had real-world impact: bringing down waiting lists in hospitals; offering charity support services to more people; and enabling small businesses to innovate at a fraction of the typical cost.

But businesses didn’t yet know exactly what AI skills they required and for whom, and not all of our assumptions about what would work proved correct.

Measuring what matters

Apprenticeships by nature require skills to be applied on the job. That success is not easy to capture through an assessment at the end alone.

That’s why we measure success in other ways too: things like costs avoided, revenue generated, issues solved for local residents, and better patient outcomes. And at a learner level, we track promotions and pay rises; nearly half of our apprentices secure a promotion.

Yet the qualification achievement rate (QAR) captures none of it. The primary measure of apprenticeship quality is still whether a learner crossed a finish line – not what they built along the way.

QAR is a lagging indicator: it measures against decisions made two or more years ago. In AI, two years may as well be 20.

If a learner gains the skills they need to secure a promotion and then moves into a new role before reaching an end-point assessment, the system records that as a failure of retention. But in reality it’s a triumph of social mobility and economic impact.

Better success metrics exist in other areas of education. The Higher Education Statistics Agency’s graduate outcomes survey, tracking salaries and career paths, is a great example: has your study enabled you to advance in your career and earn more?

We know training pays dividends. The Learning and Work Institute found that those who access training see a 15 per cent salary uplift across their lifetime compared to those who don’t. Why not measure the size of that prize?

The UK has the potential to lead the world in AI adoption, not least because of its world-class education systems. Our regulatory frameworks should incentivise innovation and impact. 

Only then will we move from surviving the AI transition to truly leading it.

Northampton colleges plan to merge next year

Two Northampton colleges are proposing to merge to offer local students a “wider range” of courses and strengthen their finances.

Northampton College and land-based Moulton College – which exited government intervention two years ago – are aiming to merge by January 2027, according to a joint statement today.

The colleges said merging into a single £70 million turnover group will improve local access to courses and open up progression routes that “neither organisation” could deliver alone.

They also promised to become a “more resilient organisation” that can respond to changes in policy, funding and local community needs.

Jason Lancaster, principal of Northampton College, said: “Exploring a merger gives us the opportunity to build an organisation that can meet these expectations and better serve our students and communities.”

An announcement on Moulton College’s website said governors at both colleges have now approved plans to “explore the benefits” of a merger.

It added: “A final decision will be made by the corporations of both colleges once this work is complete and all considerations have been carefully evaluated.

“There is still a long way to go but we are aiming towards January 2027 for completion.”

Public feedback is invited through an online form that allows questions or comments to be submitted.

Moulton College exited seven years of FE Commissioner intervention in 2024, after a turnaround that included selling land and the Department for Education refinancing £13 million in commercial debt.

Its most recent accounts, for 2024-25, show it ended the year with a surplus of £300,000 on a total income of £28 million. The college teaches around 4,000 students and employs 400 staff.

Moulton College principal Oliver Symons, who joined in 2024, said: “This is an exciting opportunity to bring together the strengths and expertise of both colleges.

“Our goal is to offer students more choice, clearer progression routes and improved access to specialist facilities. Employers will also benefit from a single, stronger partner that is responsive to local skills needs.”

Northampton College, a general FE college, currently has about 7,000 students and 640 staff.

The college ended 2024-25 with a surplus of £3.3 million on a total income of £45 million.

Northampton College is located in the eastern suburbs of Northampton, relatively close to Moulton College, which sits on the outskirts of the town.