ChatGPT: FE institutions need a solid policy to manage the AI revolution

AI tools are on the cusp of transforming classrooms and an effective policy is crucial to ensuring institutions aren’t rocked by this revolution, writes Ruth Sparkes

23 Apr 2023, 5:00

Imagine a world where virtual teaching assistants provide personalised learning experiences for students and administrative tasks are handled with precision and speed by artificial intelligence (AI) tools.

This is no longer a distant dream, thanks to the likes of ChatGPT. As this exciting technology becomes an integral part of the education landscape, it’s crucial that institutions have a comprehensive AI policy in place for staff, for four key reasons.

The first is ethical. Picture a maths lecturer using AI to provide instant feedback on a student’s algebraic equations. While this is a great example of how AI can enhance learning, it also raises questions about ensuring its ethical, responsible use in education, not least with regards to privacy, fairness and transparency.

The second reason to have an AI policy is consistency. If lecturers at the same college use AI tools in vastly different ways, some students may have a disjointed learning experience, and others could be disadvantaged outright depending on who teaches them.

Only a consistent application of AI can ensure that all staff and students benefit from the technology while minimising the risks of misuse, over-dependence and unequal provision.

Which leads us to professional development. Staff will inevitably need new knowledge and skills to integrate AI effectively into their teaching practice and administrative duties. Early adopters can be viewed as an asset, a threat, or merely a novelty, but the reality is that these tools raise the bar for digital literacy across the board.

And finally, of course, institutions must continue to navigate data protection laws like GDPR while using AI. This isn’t just about remaining compliant, but about maintaining the trust of students, staff, and external stakeholders.

Four good reasons to have an AI policy, but what should it include?

Purpose and Scope

Outline the policy’s purpose, highlighting the institution’s commitment to responsible AI usage in a way that engages and excites staff and students alike.

Roles and Responsibilities

Be clear about the roles and responsibilities of everyone involved in using AI. Consider creating an AI ethics committee that includes representatives from different departments, ensuring a diverse range of voices contribute to the responsible implementation of the policy.

Data Protection and Privacy

Offer real-life examples of how the institution will comply with data protection regulations, such as GDPR, in relation to AI usage. This will help staff understand the importance of safeguarding student privacy and handling sensitive information responsibly.

Accessibility and Inclusivity

Showcase how the policy will champion accessibility and inclusivity, ensuring that staff and students with disabilities can access and benefit from AI tools on an equal footing.

Training and Support

Design an engaging training programme to teach staff about the responsible use of tools such as ChatGPT, and about the policy itself. This could include hands-on workshops, interactive webinars and creative resources to help staff integrate AI into their practice effectively.

Monitoring and Evaluation

Establish a dynamic system for monitoring and evaluating the AI policy’s effectiveness, as well as the impact of AI usage on student outcomes and staff experiences. Encourage staff to contribute feedback, ideas, and suggestions for improvement through various channels, such as ‘town hall’ meetings or online forums. 

Review and Update

Don’t let the policy gather dust! Schedule regular reviews to ensure the AI policy remains relevant and effective in light of emerging AI technologies and educational practices. Adapt and revise the policy as necessary to address new challenges and opportunities in the rapidly evolving world of AI. 

As AI becomes a driving force in shaping the future of FE, it’s important that institutions embrace a comprehensive and engaging policy for staff. By addressing ethical concerns, promoting consistent application of AI tools, ensuring legal compliance and enhancing digital literacy, these policies will pave the way for a new era.

By fostering a culture of inclusivity, accessibility, and ongoing improvement, FE can harness the power of AI to transform the learning experience for both staff and students.

