AI isn’t just being used by students to cheat. When it comes to apprenticeships, there are many ways the tech can be deployed to boost learning – but are providers embracing it yet?
Imagine you’re an apprentice fully immersed in the latest AI tools. Your day might look like this:
After breakfast, a personalised task manager app tells you what you’ve got on that day, based on the data it has gathered about how you learn effectively.
After doing some research that involves putting prompts into a large language model (LLM) such as ChatGPT, you start an online module with an AI mentor – a chatbot with human-like qualities.
After lunch, you strap on your VR headset and work on a simulation project using AI-powered apps to calculate measurements and analyse data.
Then at 5pm, you return to your AI mentor to reflect on what you’ve learned and struggled with.
For some educators, this scenario represents a dystopian nightmare: one in which robots hijack our ability to think for ourselves and make traditional training redundant.
And there are fears that cash-strapped training providers could use chatbots to do away with human tutors, and that learners’ mental health will suffer from the lack of human interaction.
So far, the government’s lead technical training body, the Institute for Apprenticeships and Technical Education (IfATE), has only produced loose guidance around the use of AI, leaving many providers hesitant about its adoption.
But Jonathan Smith, co-founder of Veraxis, a company that helps training firms embrace AI, believes providers need to “catch up” – with the right policies in place to limit the pitfalls.
“This isn’t the future, these tools are here now,” he says. He explains that many people working with apprentices already use AI, sometimes without realising it, for admin tasks such as attendance tracking and scheduling, and within learning management systems.
Tom Rogers, an early careers practitioner for BAE Systems, says AI is already being used by tutors to build lesson plans, and by candidates applying for apprenticeship vacancies.
Although he acknowledges “Gen Z are on their phones more than we would like”, he believes young people are “often looking for ways to streamline a process to work smarter, and not harder”.
He adds: “AI should be allowed to flourish with trust and confidence.”
Firebrand Training’s Richard Parker, who is chair of The Association of Employment and Learning Providers’ (AELP) IT & digital forum, believes providers are being “overly nervous” and “too slow” to consider AI.
He says: “They were sat there going, ‘we don’t really want to touch it. It’s not allowed, no one’s going to give us permission to use it’.
“IfATE needs to step in and give them proper guidance.”

Personal and accessible AI training
One of the biggest potential benefits of AI is how it can customise training to each learner. Data can be analysed using AI to understand where an apprentice struggles or disengages, and chatbots powered by generative AI can then provide tailored support.
Veraxis co-founder Rebecca Bradley believes using AI in this way has a “really high impact on learner retention”.
She points out that apprentices can use chatbots to answer questions they might feel embarrassed to ask a human.
“A chatbot gathers such rich information from your data that it gives a response that feels human,” she says. “We move away from the disparity of your introverts not getting as much input as your extroverts who feel confident asking questions.”
Apprenticeship and training provider Babington recently launched a pilot programme with AI learning platform Obrizum.
Director of learning design Phillip McMullan said that 94 per cent of learners on the AI learning pathways they created felt the learning was “highly personalised to their specific needs, leading to a 1.5x increase in speed to competency”. Learner satisfaction was also said to have risen 25 per cent.
Multiverse reported this week that its ‘Multiverse Atlas’ AI learning coach, launched in February, had clocked up 40,000 queries from 3,600 apprentices.
CEO Euan Blair said uptake was higher among apprentices aged over 40 and those with additional learning needs, which shows “if you’re thoughtful about how you design AI products, they can support broader access to high-quality training”.
He added: “Crucially, the overwhelming majority of users found Atlas helpful, with ‘usefulness ratings’ above 91 per cent across all demographics, including gender, age, ethnicity, and learning need.”

A human touch
But will chatbots replace human tutors and coaches?
Goldman Sachs economists said last year that up to 300 million full-time jobs worldwide could be affected by the rise of generative AI.
But Darren Coxon, an adviser on AI in education, does not think AI will “replace anyone student-facing any time soon” because “humans are too important to the learning journey”.
The AI platform Aptem, whose customers include training providers and colleges, claims its goal is “not to replace roles such as tutors and skills coaches” but to “allow them more space” to teach.
It claims its AI can “reduce the time taken to gather data in preparation for a review”, and “give tutors more time to pull together meaningful discussion points” as they “aren’t so focused on summarising notes and inputting information”.
It adds: “If AI can notify tutors when it looks like a learner is beginning to struggle with their training, they have the best possible chance of offering support.”
AI can also be used in the recruitment of apprentices, automating parts of the process and helping to find people with the right skills.
In April, Multiverse bought the Californian AI talent software firm Searchlight to help it identify skills gaps within organisations.
Phillip Bryant is head of EPA & apprenticeships at the International Compliance Association, a professional body for the global regulatory and financial crime compliance community. He believes there are opportunities for AI to be used in assessment design and in writing EPA policies or procedures – as a “tool”, but “not necessarily to replace the human input”.

Is AI really cheating?
One of the gravest concerns around generative AI is its ability to help learners cheat their way through assessments.
Because much of the assessment of apprentices is done through observation of work-based activity and oral assessment, the scope for apprentices to misuse AI is more limited than in academic settings. But that doesn’t mean it’s not happening.
Jo Wharton, an assessor at Ixion, posted last year on LinkedIn that she was “starting to notice the use of AI creeping into the evidence that apprentices provide towards their level 6 career development professional apprenticeship”.
Bryant has not noticed a spike in suspected AI plagiarism in recent months, but has seen that “a lot of higher education institutions are shifting their approach now and seem to condone it for the research aspect of assessment writing or generating ideas”.
Assessors are relying on free AI-driven diagnostic tools to spot cheating, but these are of limited reliability.
Bradley warns that “this is where we are in the wild west”, without that “legislative robustness”.
She also points out that if a young person is putting a “very sophisticated prompt” into a chatbot that is “not detectable”, it “shows a level of expertise” which is “entrepreneurial”.
Firebrand Training’s Parker, who is an end-point assessor of data and business analysis assessments, says there are worries about knowing how much of a report written by an apprentice has been created by AI.
“But does it matter?” he asks. “If their employer pays them to have those tools to write their professional reports, should they not be allowed to use it for their assessment reports as well?”

Governance needed
Although governments and regulatory bodies have not caught up with AI in education, Bradley warns providers that they “need to put good practice in place” now, so “when the regulation does come, you’ll have the right pieces in place”.
She says when using AI to analyse an apprentice’s personal performance data, providers should “handle it securely with clear consent”. That means “letting the learners know they’re being monitored in that way” and “being clear about how that data is used”.
She advises providers to write their policies around the data-handling rules of the LLM they are using, bearing in mind their company’s risk register and appetite for risk.
And Bradley explains LLMs are “biased” by nature because they take information from the internet, which is “drawing from extreme views – the left, the right and the centre”.
Those who decide therefore that sharing their apprentices’ data with an LLM is too risky might consider building their own LLM.

Lack of guidance
So far, Ofsted is taking a hands-off approach. It will not directly inspect the quality of AI tools, but said it will “consider a provider’s use” of AI “by the effect it has on the criteria set out” in its existing inspection frameworks, such as safeguarding and the quality of education.
Last year, the IfATE updated its guidance on the use of AI in end-point assessments to clarify that AI must not be used to produce a report or portfolio, and that apprentices must reference any use of AI within a portfolio that underpins a professional discussion or other assessment.
But Parker considers this guidance “open ended and vague”.
“It doesn’t go anywhere near far enough to be completely helpful”, he says, because it does not state what is acceptable during the end-point assessment process.
Parker assesses the level 3 data technician apprenticeship, where apprentices use such tools.
He argues it therefore “doesn’t make sense” if, after 12 months of apprentices using AI tools in their jobs, he as an end-point assessor then has to forbid them from using those tools in their assessment – because “the end point assessment is supposed to assess how competent they are at their job”.

AI for everyone
As AI is embraced across the wider workforce, it is slowly becoming a greater feature of apprenticeship standards beyond the digital and IT subjects where you’d expect to find it.
For example, London Metropolitan College has embedded AI into its level 6 project control apprenticeships.
Derby-based provider DBC Training recently started building the teaching of how to use generative AI into its marketing apprenticeships, which its curriculum lead Daniel Adey says is “giving the learners a greater understanding of [AI] and how it can be used to strengthen their marketing efforts”.
Some experts believe the apprenticeships system needs to be completely transformed through AI.
In a recent podcast, Blair said the recent rise of generative AI tools means that “everyone needs to be taught how to be a co-pilot of AI, how to work with it, the ethics of AI – this is absolutely crucial.”
Bradley believes that we’re “moving to a world” where the prompts put into LLMs are “perhaps going to be what we assess young people on, rather than what comes out of them”.
She points out that societies were “suspicious” of coffee, the printing press and photocopying machines when they first appeared. Now they are “parts of everyday life” – as, she believes, AI will be for apprentices.
“We won’t have this suspicion in future,” she adds. “But for now, there are things that we need to think about and mitigate.”