Constant rule changes are undermining apprenticeship improvement

While reform is necessary, its relentless pace is undermining stability, damaging improvement and eroding trust, argues Lou Doyle

28 May 2025, 15:26

The Department for Education’s annual funding rule changes for apprenticeships have become a defining feature of the skills landscape. Intended to refine the system and raise standards, they now risk creating an unintended problem of their own: instability. 

This isn’t an argument against reform. It’s an argument for better reform. More specifically, recognising that constant change, without sufficient time or support to implement it, risks damaging the very quality we are all trying to protect.

Quality demands stability

In quality management theory, there’s a long-established warning against the risks of frequent system-level change. W. Edwards Deming, the American statistician whose thinking helped shape global manufacturing standards, called it “tampering”: making constant adjustments in the name of improvement without the evidence or systems thinking to back them up. Instead of improving outcomes, this approach introduces variation, confusion and operational noise.

That’s the pattern apprenticeship providers now face each year.

Revised rules demand updates to programme structures, funding claims, evidence requirements and employer and learner guidance. Each new rule might be sensible in isolation, but the cumulative effect is disruption and stress for the people who must implement the changes.

Even the best-prepared providers must divert time and resources away from teaching and learning to rework documentation, retrain staff, reprogram data systems and reassure employers. This reactive culture chips away at capacity for long-term improvement. It fosters risk aversion. It creates compliance habits, not quality habits.

More fundamentally, it raises a question of trust. 

If the rules change every 12 months – sometimes more frequently – how can providers build sustainable delivery models that learners and employers can rely on?

Lessons from other sectors

Other sectors offer cautionary tales about the cost of regular change. 

One of the starkest is the Mid Staffordshire NHS Foundation Trust Inquiry. While the causes were complex, the final Francis Report highlighted how frequent policy shifts, performance targets, and structural changes created a loss of clarity and accountability. Staff were overwhelmed, systems became misaligned, and—ultimately—care quality declined.

The aviation industry offers another instructive example. Over a three-year period, the U.S. Federal Aviation Administration introduced multiple software and procedural updates to its air traffic control systems. These weren’t inherently bad ideas, but because they were introduced too frequently and with limited integration planning, they triggered temporary breakdowns in service and control, as well as operator confusion. A review found that “change fatigue” had set in, even among experienced professionals.

And in the private sector, the 2010 Toyota recall crisis is often cited in business schools as a case study in how a drift away from disciplined, stable improvement practices (like Kaizen) contributed to systemic design issues and reputational damage. Toyota’s rush to innovate – without enough time to embed new processes properly – undermined the company’s long-standing reputation for quality.

These examples show a consistent pattern: change without system-level planning weakens performance.

Reforming the reform process

The current system of annual funding rule changes in apprenticeships increasingly mirrors this pattern. It’s not that the changes themselves are unhelpful—it’s that their frequency, timing, and cumulative effect disrupt the conditions necessary for quality delivery.

In many cases, providers are implementing new rule sets in September that were only finalised in May. There is little room for piloting or evaluating impact. No other publicly funded education programme – school or university – operates with this level of operational fluidity.

If policymakers want providers to plan, invest, and deliver consistently high-quality apprenticeships, then we need to deliver the same level of planning and consistency at policy level. That includes longer lead-in times, clearer rationale for changes, and a more transparent impact review process.

Because right now, it’s not the apprenticeship delivery model that’s broken. It’s the process for changing it.
