Data subject access requests (DSARs) are formal requests made by an individual for a copy of the personal data an organisation holds about them. In a college context, the data subject is typically a learner or parent, though a request may also be made by someone authorised to act on their behalf, such as a solicitor.
In many cases, a DSAR is the consequence of a disagreement or a disruption to the relationship: a learner might be unhappy about the consequences of a behaviour issue, a staff member may be contesting a disciplinary process, or parents could be unhappy about the special educational needs provision for their child.
DSARs have long been required under data protection law, having been established well before the General Data Protection Regulation (GDPR) came into force in 2018, so colleges already have systems in place to respond within the statutory one calendar month.
But in recent times, my colleagues and I have seen a rising number of learners and parents use AI to write these requests. AI makes generating DSARs easier, which is likely to increase their volume and strain college resources.
Another noticeable change is the scope of these requests. AI tools often generate broad, sweeping language such as “all information held about me,” even when the individual is only interested in a specific incident or timeframe. Colleges may not realise they can ask the requestor to refine broad requests.
AI-generated DSARs often include references to legislation, case law or regulatory guidance. While this can make a request appear more formal or urgent, the references are not always accurate: they may misquote laws or cite legal cases that do not exist or are unrelated to data protection.
These requests may also demand shorter deadlines, such as seven days, rather than the one calendar month permitted by law.
This legalistic tone can be intimidating for staff. A request filled with jargon may be perceived as a threat or complaint, even when the individual simply wants to understand what data is held about them.
It’s important for colleges to recognise that the use of AI doesn’t necessarily reflect the requestor’s own understanding. They may not appreciate the implications of the language used or the breadth of the request they’ve submitted, which in turn creates unrealistic expectations of what the college will provide in response. This is compounded by the fact that DSARs are frequently submitted during disputes, often after formal complaints have failed. In these situations, requestors typically seek detailed records and are less likely to be flexible or understanding.
To manage these new-style requests, colleges should ensure frontline teams and administrators can recognise a DSAR when one arrives. It’s never helpful for a data protection lead to receive a request from the reception team with only five days left to respond because the team didn’t know who to pass it on to.
Clear communication with the requestor is also vital. If a request is too broad or contains inaccuracies, the data protection lead should explain the issue and offer guidance on how to refine it.
Colleges cannot prohibit the use of AI or mandate that a data subject complete a specific form, but guidance or a template can assist individuals in submitting requests.
Despite the challenges, it’s important to recognise that the rise of AI-generated DSARs is a positive development in many respects. Students and parents may face barriers such as limited knowledge, language difficulties or lack of confidence in formal writing, and AI tools offer a way to overcome these obstacles.
This is especially valuable for vulnerable groups, such as those with SEND, who may otherwise struggle to submit a request. The growing use of AI means that colleges may need to refresh their governance and response processes.