The AI genie is well and truly out of the bottle.
AI’s influence on further education isn’t a slow burn. It’s quickly reshaping how we teach, assess and think about knowledge itself. In FE, conversations about AI have often centred on practical implementation and staff efficiency – important goals in themselves.
But for AI integration to be effective, students will need support to develop digital judgement and apply these tools with care. The challenge now is to build the confidence and skills needed to make AI a force for better learning, not just faster work.
Confidence isn’t competence
According to the upcoming Pearson College Report 2025, launching next month, 62% of college students feel confident using AI to support their learning. That’s the good news. The twist? Many don’t feel confident choosing appropriate tools, applying them accurately and fairly, or judging the quality of AI-generated content.
One in five say they want to learn how to use AI more accurately and fairly, and nearly as many say they need help understanding how to use it ethically. Meanwhile, most tutors agree the curriculum needs to evolve to embed digital and AI skills, and over half say they need more support themselves.
There’s a gap – a significant one – between how students are using AI and what they’re being taught. Tutors can see it. Many cite the increasing use of AI in teaching, learning and assessment as one of the top challenges they’ll face this year.
While some learners are already confident with AI, many are still experimenting – copying, pasting and refining prompts without clear guidance on how to use these tools well.

Why digital confidence matters
This isn’t about banning AI or policing behaviour. It’s about giving learners the guidance and digital judgement to use these tools well.
As AI becomes woven into everyday life and learning, it is reshaping familiar priorities such as safeguarding, academic integrity and employability, adding new layers to the digital landscape that students must navigate.
Can learners recognise bias in AI outputs? Do they know when a chatbot is bluffing? Can they credit their sources, explain their thinking and use AI as part of their own process rather than instead of it? Those are the skills that turn AI from a shortcut into a genuine support tool.
A practical place to start
At Basingstoke College of Technology (BCoT), staff could see the gap – so they decided to act.
Supported by Pearson, they developed AI Essentials, a short, self-paced course that introduces students to responsible and reflective use of AI. It’s not a qualification or a coding module. It’s a 90-minute foundation designed to build confidence, curiosity and awareness.
Delivered during induction or tutorial sessions, it explores questions such as:
- What exactly is AI, and where do we come across it?
- What makes an AI-generated answer helpful or harmful?
- How can students use these tools without crossing ethical lines?
- What does fairness look like when a chatbot can write your essay?
Richard Harris, a Digital and IT Lecturer at BCoT, saw the impact straight away. “It was fantastic to see students not just getting excited about the topic but really starting to think critically about the content they consume every day. It’s given them up-to-date, practical skills that will be vital for their future.”
The college worked with Pearson to host the course on ActiveHub, making it available across departments under a site licence. It’s designed to flex around different courses and teaching schedules – the aim is to start a conversation, not add another layer of workload.

What’s at stake
A recent report from the Institute for the Future of Work ranks AI literacy among the top priorities for employers across every sector. They’re not just looking for coders – they’re looking for critical thinkers who can use technology thoughtfully and responsibly.
If students aren’t supported to use AI well, we could see a new kind of digital divide – not based on access, but on understanding. That gap could quietly influence learning outcomes, confidence and future opportunities.
With its close ties to both employers and learners, further education is well placed to help close that gap. Not through sweeping reforms or expensive new frameworks, but through small, structured steps that bring AI into everyday learning in a safe, thoughtful way.
Anthony Bravo OBE, Principal of BCoT, added: “This isn’t about being cutting edge. It’s about being responsible. Our job is to get students ready for what’s next – to help them make smart, ethical choices with AI, now and in the future.”
The genie is already out of the bottle. We don’t need to put it back in. We just need to learn how to work with it – and help students do the same.
Find out more about the AI Essentials course developed by BCoT and supported by Pearson: Access your sample pack