Ofsted reveals how it will inspect providers’ AI use

Inspectors will not check tech use as a ‘standalone’ part of inspections, but will look at its impact on outcomes

27 Jun 2025, 16:06

Ofsted will not assess FE providers’ use of AI “as a stand-alone part” of inspections – but the tech’s impact on outcomes for learners will be looked at. 

Guidance published by the watchdog this morning reveals how it will evaluate FE providers’ use of such tools during inspections.

The inspectorate has also released new research into so-called AI “early adopter” schools and FE colleges – which revealed that some are developing their own chatbots. 

Ofsted chief Sir Martyn Oliver said: “As the use of AI in education increases, we need to better understand how schools and colleges are using this technology to take advantage of its potential, as well as manage the risks it poses for pupils, learners and staff.”

AI not standalone part of Ofsted checks

The new guidance states inspectors will not look at providers’ use of the technology “as a stand-alone part” of their assessments and won’t “directly evaluate” its use.

Part of the reason for this is that Ofsted does “not have the evidence we would need to define good use of AI for the purposes of inspection or regulation”. 

But Oliver stressed the watchdog “can consider the impact a provider’s use has on the outcomes and experiences of children and learners”.

Any evaluation “of the use of AI will ask whether the provider has made sensible decisions”. 

As part of this, inspectors could ask how leaders ensure any use “supports the best interests of children and learners”. 

Assessing AI’s risks

When any AI is used by learners at an FE provider, inspectors will assess if this is being done in their “best interests”. 

If learners are deemed to be using it “inappropriately, inspectors may evaluate how the provider has responded and addressed the impact of this”. 

The guidance says that while the risks associated with the tech “will not be evaluated separately in our inspections, they will be addressed when they have implications for areas that are already considered”. 

This can include data protection, safeguarding and bias and discrimination. 

“Any evaluation that Ofsted makes is about the provider’s decision-making, what they have considered, and the impacts on children and learners, not about the tool itself,” the guidance says.

“Inspectors only need to consider AI when it is relevant to something specific in their evaluations.”

‘Early adopter’ research

The guidance was informed by research, also released this morning, into how 21 “early adopter” schools, colleges and MATs are integrating AI into teaching, learning and admin.

Research suggested that FE colleges were “more likely” to permit learners to use AI unsupervised due to their age.

In total, four senior college staff spoke to researchers earlier this year – they had all started their “AI journey” in 2022 and 2023. 

Most settings had an “AI champion” charged with getting staff to “embrace” the tech. They typically created a “buzz” and “played a vital role in demystifying” it to address “anxieties and build confidence”. 

In larger settings champions would bring “together their data management teams, IT systems managers and curriculum leads” as AI “requires skills and knowledge across more than one department”.

One champion, an FE college’s director of digital transformation, said a demonstration session was a “turning point” that convinced staff of ChatGPT’s significance in terms of both teacher workload and learning.

He told Ofsted: “One of the first things the principal said was, ‘This is an employability skill I need students to have.’”

A critical lesson for staff was effective use of prompts. He added: “If you put junk in, you’ll get junk out.”

AI’s use was usually divided between those wanting to cut workload and those who wanted it to “directly” support learning. But the researchers found this “often shifted with time”. 

A few of the leaders were already developing and testing “their own AI chatbot, while others were in the process of doing so”. 

Some also highlighted how tools “allowed teachers to personalise and adapt resources, activities and teaching for different groups of pupils”, including young carers and refugees. 

A ‘wild west’

However, most were “at the early stages of developing a longer-term strategy” on how to integrate AI into their curriculums as they had not yet considered how to combine it with pedagogy. 

One of the reasons for this is there are “not many… tools tailored to individual school or college contexts”. Some bosses also had not thought “strategically about what success with AI looked like or how to evaluate its impact”.

One college deputy principal said it was difficult to envision a wider strategy as they were also learning “day by day”.

A principal said: “It’s the Wild West and all we are at the minute is the sheriff. What comes in and what goes out of the town is what we’re managing to deal with at the minute.”

Safe use

Leaders were said to be “clear about the risks of AI around bias, personal data, misinformation and safety”. 

Some had a separate AI policy, while others added it to “relevant existing policies including those for safeguarding, data protection, staff conduct, and teaching and learning”. 

But “the pace of change meant that many leaders were updating their AI policies as often as monthly”. 

Earlier this month, the government published toolkits and guidance on how colleges should plan to use AI.


