AI is moving too fast for colleges to go it alone

In FE, no single teacher, regulator or IT lead can keep pace with AI’s risks and opportunities. But together through communities of practice, we can cut through the noise and turn uncertainty into practical guidance

1 Sep 2025, 6:55

AI is moving faster than any of us can comfortably keep up with. New tools emerge almost daily, each bringing opportunities for teaching and learning but also challenges for assessment, safeguarding and governance. No single college or individual can hope to stay on top of it all. That’s why communities matter. 

In further education we face a uniquely broad set of stakeholders when it comes to AI. Teachers want to know how to use it to save time and support learning. Regulators and awarding bodies wrestle with questions of integrity. Safeguarding teams need to understand risks to learners. IT departments are focused on security, governors are asking about strategic implications, and learners need clear guidance on how to use it effectively and responsibly.

Work-based learning adds further perspectives on the changing skills needed by employers. Each view is legitimate, but without spaces to share and work together, the risk is fragmentation and duplication.

Communities of practice are uniquely placed to respond to this complexity. By bringing diverse voices together, they allow us to cut through the noise and focus on what really helps learners and staff. Over the past year, Jisc’s AI in FE community has demonstrated the power of this approach. Scores of staff from across the sector have been connecting, comparing notes and tackling common problems. 

As an example, we brought together staff from nine colleges to explore how AI is reshaping assessment. We didn’t begin with a fixed outcome, but with shared questions about fairness, integrity and the skills learners need. Over several months, the group considered assessment from different angles – from design to learner AI literacy and wellbeing.  

The result was a set of top tips structured around what staff can do before and after assessment. Before assessment: set clear expectations, design tasks that promote higher-order thinking, and create safe opportunities for learners to practise and reflect on their AI use. After assessment: approach suspected misuse with empathy, check understanding through multiple methods, and build in time to reflect on what worked well.  

It is not a strict set of rules, but practical, adaptable guidance that colleges can tailor to their own context.

The process behind this work is as valuable as the output. Because the guidance was shaped by practitioners across roles and institutions, it is trusted and grounded in reality. Staff can see their own concerns reflected and know it was not written in isolation.  

This collaborative approach also means the guidance will not stand still. As tools evolve, so too will the advice. Communities create the conditions for living guidance: a resource that can be updated, debated and improved as the technology – and our understanding – develops. In a landscape moving as quickly as AI, that agility is vital. 

The benefits go further. Communities reduce duplication by sharing solutions openly, so that every college does not have to reinvent the wheel. They help staff respond at speed without feeling isolated. And they give the sector a more confident, unified voice when engaging with policymakers or technology providers. 

Most importantly, communities show us that we are not navigating these changes alone. At times it can be easy to feel overwhelmed, to believe we are always behind. But working together reveals the collective expertise and creativity that already exists across FE. By pooling that knowledge, we can not only keep pace but also collaboratively shape how AI is used for the benefit of learners.

As AI continues to evolve, the role and importance of communities will only grow. They are how we make sense of change together, how we ensure diverse perspectives are heard, and how we turn uncertainty into practical guidance. If AI is going to reshape education, then communities are how we make sure it does so on our terms. 

If you’d like to get involved with our AI community, you can find more on the Jisc website.
