ChatGPT: FE institutions need a solid policy to manage the AI revolution

AI tools are on the cusp of transforming classrooms and an effective policy is crucial to ensuring institutions aren’t rocked by this revolution, writes Ruth Sparkes

23 Apr 2023, 5:00

Imagine a world where virtual teaching assistants provide personalised learning experiences for students and administrative tasks are handled with precision and speed by artificial intelligence (AI) tools.

This is no longer a distant dream, thanks to the likes of ChatGPT. As this technology becomes an integral part of the education landscape, it’s crucial that institutions have a comprehensive AI policy in place for staff, for four key reasons.

The first is ethical. Picture a maths lecturer using AI to provide instant feedback on a student’s algebraic equations. While this is a great example of how AI can enhance learning, it also raises questions about ensuring its ethical, responsible use in education, not least with regards to privacy, fairness and transparency.

The second reason to have an AI policy is consistency. Having lecturers at the same college use AI tools in vastly different ways might lead to a disjointed learning experience for some students, and outright disadvantage others depending on who taught them.

Only a consistent application of AI can ensure that all staff and students benefit from the technology while minimising the risks of misuse, over-dependence and unequal provision.

Which leads us to professional development. It is inevitable that staff will need new knowledge and skills to integrate AI into their teaching practice and administrative duties effectively. Early adopters can be viewed as an asset, a threat, or merely as a novelty, but the reality is that these tools raise the bar for digital literacy across the board.

And finally, of course, institutions must continue to navigate data protection laws like GDPR while using AI. This isn’t just about remaining compliant, but about maintaining the trust of students, staff, and external stakeholders.

Four good reasons to have an AI policy, but what should it include?

Purpose and Scope

Outline the policy’s purpose, highlighting the institution’s commitment to responsible AI usage in a way that engages and excites staff and students alike.

Roles and Responsibilities

Be clear about the roles and responsibilities of everyone involved in using AI. Consider creating an AI ethics committee that includes representatives from different departments, ensuring a diverse range of voices contribute to the responsible implementation of the policy.

Data Protection and Privacy

Offer real-life examples of how the institution will comply with data protection regulations, such as GDPR, in relation to AI usage. This will help staff understand the importance of safeguarding student privacy and handling sensitive information responsibly.

Accessibility and Inclusivity

Showcase how the policy will champion accessibility and inclusivity, ensuring that staff and students with disabilities can access and benefit from AI tools on an equal footing.

Training and Support

Design an engaging training programme to teach staff about the responsible use of tools like ChatGPT, and about the policy itself. You could include hands-on workshops and interactive webinars, and develop creative resources to help staff integrate AI into their practice effectively.

Monitoring and Evaluation

Establish a dynamic system for monitoring and evaluating the AI policy’s effectiveness, as well as the impact of AI usage on student outcomes and staff experiences. Encourage staff to contribute feedback, ideas, and suggestions for improvement through various channels, such as ‘town hall’ meetings or online forums. 

Review and Update

Don’t let the policy gather dust! Schedule regular reviews to ensure the AI policy remains relevant and effective in light of emerging AI technologies and educational practices. Adapt and revise the policy as necessary to address new challenges and opportunities in the rapidly evolving world of AI. 

As AI becomes a driving force in shaping the future of FE, it’s important that institutions embrace a comprehensive and engaging policy for staff. By addressing ethical concerns, promoting consistent application of AI tools, ensuring legal compliance, and enhancing digital literacy, these policies will pave the way for a new era.

By fostering a culture of inclusivity, accessibility, and ongoing improvement, FE can harness the power of AI to transform the learning experience for both staff and students.
