AI can pass almost all level 3 assessments, study finds

OU recommends that institutions should focus on question design rather than detecting AI misuse

Artificial intelligence (AI) tools can pass almost all types of level 3 assessments, a new study has found.

The Open University found that AI performed “particularly highly” at this advanced level across a range of subjects, although its performance was lower at levels 4 and above.

It also found that while markers’ ability to detect generative AI answers increased after training, this was undermined by an increased number of false positives.

Jonquil Lowe, senior lecturer in economics and personal finance, said that rather than focusing on detection, colleges and universities should use AI to design more “robust questions” that focus on the “added value” that humans bring.

She added: “This shifts us away from merely testing knowledge, towards what is often called ‘authentic assessment’ that requires explicit application of what has been learned in order to derive specific conclusions and solutions.”

The study confirms fears raised in a recent FE Week investigation that students can ‘cheat’ their way through almost any non-exam assignment by using generative AI large language models such as ChatGPT.

It also addresses concerns that AI detection tools are unreliable, giving rise to false accusations and a breakdown of trust between educators and students.

The study, funded by the awarding body and education charity NCFE, analysed generative AI’s performance by asking a group of 43 markers to grade almost 1,000 scripts and to flag those they suspected were AI-generated.

A review of the results found that the most robust assessment types were audience-tailored assessments, observation by the learner and reflection on work practice.

While the study found that subject did not affect AI’s performance, AI use in certain disciplines, such as law, was easier to detect.

False positives emerged because “hallmarks” of AI-generated scripts, such as superficial answers or failing to focus on the question, are also common in weaker students’ work.

The study recommends that institutions designing assessments should focus on question design, marking guidance and student skills interventions rather than on detecting AI misuse.

When students are identified as using AI in their assessments, institutions should focus on helping them develop their study skills.

Training for dealing with generative AI in assessments should also be ongoing.

Gray Mytton, assessment innovation manager at NCFE, said: “This report highlights the challenges in detecting genAI misuse in assessments, showing that training markers to spot AI-generated content can lead to an increase in the rate of false positives.

“To address this, educators could help students develop study skills, including genAI use where appropriate, while awarding bodies can focus on creating more authentic assessments, which will also benefit learners as they enter the workforce.”

Read the full report here.
