Category Archives: Data Protection

Could the right to erasure in data protection law break AI?

[reposted from my LinkedIn account]

I ask this only partly in jest.

The story of how ChatGPT refused to acknowledge the existence of “David Mayer” and some others, perhaps (probably?) because people with those names had exercised their erasure rights (such as the right at Article 17 of the GDPR and the UK GDPR), raises the interesting question: if a sufficient number of people made such requests, would the LLM begin to fail?

If so, a further question of rights arises. If I, Jon Baines, exercise my erasure right against ChatGPT (or another platform/LLM), and it suppresses any processing of the words “Jon Baines”, what effect might that have on my namesake Jon Baines, and his travel company? Or Jon the Ocean Specialist working on the Ocean Watch program?

Because the words “Jon Baines”, in isolation, are not my personal data. In isolation, they do not relate to me. A crude response to an erasure request, like any of the other crude approaches of which AI is capable (for instance in relation to accuracy), runs the risk of interfering with others’ rights, including the right to operate a business and the right to freedom of expression.
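To see the problem concretely, here is a deliberately crude sketch (in Python, and entirely hypothetical: nobody outside the platform knows how ChatGPT’s name suppression actually works) of what string-level erasure looks like. The name list and filter are invented for illustration.

```python
# Entirely hypothetical sketch of a string-level "erasure" filter, of the kind
# the "David Mayer" behaviour appeared to resemble. It operates on the words
# alone and knows nothing about *which* person a name actually refers to.

ERASED_NAMES = {"jon baines"}  # names subject to (hypothetical) erasure requests

def filter_output(text: str) -> str:
    """Refuse to emit any text containing an erased name, whoever it refers to."""
    lowered = text.lower()
    for name in ERASED_NAMES:
        if name in lowered:
            return "I'm unable to produce a response."
    return text

# The filter suppresses text about me...
print(filter_output("Jon Baines writes about data protection law."))
# ...but equally suppresses text about an entirely different Jon Baines:
print(filter_output("Jon Baines runs a specialist travel company."))
```

The filter fires on the name alone, regardless of which Jon Baines a sentence is actually about: that is the interference with third parties’ rights, in miniature.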

I don’t have an answer, but this is one more potential flaw in AI which will no doubt play out over the coming years.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under AI, Data Protection, erasure

The Data Protection Act 2018 does not “implement” the GDPR

They are separate instruments and the GDPR, pre-Brexit, did not require implementation – as a Regulation of the European Parliament and of the Council of the European Union, it had direct effect.

Since Brexit, by the effect of, among other laws, the European Union (Withdrawal) Act 2018 and the Data Protection, Privacy and Electronic Communications (Amendments Etc.) (EU Exit) Regulations 2019, we now have a retained-and-assimilated domestic version of the GDPR, called the UK GDPR.

Most processing of personal data is subject to the UK GDPR. The Data Protection Act 2018 deals with processing that is not subject to it, such as processing by law enforcement bodies and the intelligence services. It also provides some of the conditions and exemptions in relation to processing under the UK GDPR.

[None of this is new, and none of it will be unknown to genuine practitioners in the field, but I’m posting it here as a convenient sign to tap, at appropriate moments.]


Filed under Data Protection, Data Protection Act 2018, GDPR, UK GDPR

Banks to be required to snoop on customers’ accounts

[reposted from my LinkedIn account]

A recently announced “DWP Fraud, Error and Debt Bill” will propose obligations on banks and financial institutions to “examine their own data sets to highlight where someone may not be eligible for the benefits they are being paid” and share relevant information with the Department for Work and Pensions (DWP).

This appears to be a new approach to the broad powers which would have been conferred on the DWP under clause 131 and schedule 11 of the shelved Data Protection and Digital Information Bill. Under those provisions the DWP would have been able to require banks and financial institutions to give general access to customer accounts (rather than access on a targeted basis) for the purpose of identifying benefit fraud. Although the proposed powers were subject to a good deal of criticism on the grounds of disproportionality, they remained in the final version of the bill, which would almost certainly have been enacted if Mr Sunak had called a later election.

The DWP Fraud, Error and Debt Bill (which has not yet been introduced into Parliament but will be this session – so probably by Spring 2025) will propose an “Eligibility Verification measure” which, in figurative terms, will result in supply-side snooping on accounts (i.e. by the banks themselves) rather than the demand-side snooping the previous bill would have introduced.

We will have to wait for the details, but one thing is certain – this will require a lot of algorithmic automation, no doubt AI-driven, and the potential for errors will need to be addressed and mitigated.
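By way of illustration only (the Bill’s actual eligibility criteria are not yet published, and the threshold and field names below are invented), the kind of bulk data-set check banks might be asked to run is trivially easy to write, and just as easy to get wrong:

```python
# Illustrative only: an invented eligibility check. The real "Eligibility
# Verification measure" has not yet been published; the threshold and field
# names here are assumptions for the purpose of the example.

CAPITAL_LIMIT = 16_000  # e.g. a means-tested benefit capital threshold

def flag_for_referral(account: dict) -> bool:
    """Flag an account if a benefit recipient's balance exceeds the limit."""
    return account["receives_benefit"] and account["balance"] > CAPITAL_LIMIT

accounts = [
    {"holder": "A", "receives_benefit": True, "balance": 20_000},
    # A false positive: a temporarily inflated balance (say, holding a
    # relative's funds, or a compensation payment disregarded under benefit
    # rules) is indistinguishable from ineligibility at the data-set level.
    {"holder": "B", "receives_benefit": True, "balance": 20_000},
]
print([a["holder"] for a in accounts if flag_for_referral(a)])  # ['A', 'B']
```

At data-set level the two accounts look identical; only context the bank does not hold distinguishes the genuine case from the false positive.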

It will also, surely, impose a significant cost burden on banks and financial institutions. Whilst it’s generally hard to muster much sympathy in those circumstances, here we must consider the risk that the lowest-cost, highest-efficiency models adopted may be the least protective of customers’ banking privacy and data protection rights.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, data sharing, DWP, Privacy

ICO Annual Reports 1985 to date

I’ve had to retrieve a lot of these from the National Archives Web Archive.

The sixteenth report looks like it was co-published with the Australian Commissioner.

All reports are published under the Open Government Licence, under which the licensor grants a worldwide, royalty-free, perpetual, non-exclusive licence to use the information, subject to conditions.


Filed under Data Protection, Freedom of Information, Information Commissioner

Can directors and trustees of charities be controllers?

[reposted from LinkedIn]

Savva v Leather Inside Out & Ors [2024] EWHC 2867 (KB); Sam Jacobs of Doughty Street Chambers, instructed by Forsters LLP, acted for the defendants (the applicants in the instant application).

Is it the case that a director or trustee of a charity (the charity itself being a controller) cannot also be a controller? That, in effect, was one of the grounds of an application by two defendants for strike out and summary judgment in a claim arising from alleged failures to comply with subject access requests.

The claim arises from a dispute between the claimant, a former prisoner employed by a subsidiary of a charity (“Leather Inside Out” – currently in administration), and the charity itself. The claim is advanced against the charity, but also against the charity’s founder and two trustees, who are said on the claim form to be controllers of the claimant’s data, in addition to, or jointly with, the charity.

In a solid judgment, Deputy Master Alleyne refused to accept that such natural persons were not capable of being controllers: the term is given a broad definition in Article 4(7) UK GDPR, and “includes a natural or legal person, public authority, agency or other body and that there may be joint controllers. On plain reading of the provisions, it is incorrect to suggest that an allegation of joint controllers is, per se, not a legally recognisable claim” (re Southern Pacific Loans applied).

However, on the specific facts of this case, the pleading of the claimant (the respondent to the strike out application) failed “to allege any decisions or acts in respect of personal data which were outside the authority of the trustees as agents for [the charity]…the Respondent’s submissions demonstrated he wrongly conflated the immutable fact that a legal person must have a natural person through whom its decisions are carried into effect, with his case that the natural person must be assuming the defined status of data controller in their personal capacity”. That was not the case here – the founder and the trustees had not acted other than as agents for the charity.

Accordingly, the strike out application succeeded (notably, though, the Deputy Master said he had reached his conclusion “not without some caution”).

Assuming the claim goes forward to trial, therefore, it can only be advanced against the charity, as sole controller.


The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under charities, controller, Data Protection, judgments, subject access, UK GDPR

Dismissed FE teacher’s data protection, MOPI, HRA claims fail

[reposted from LinkedIn]

Claims in misuse of private information, data protection and for breach of the Human Rights Act, brought by a dismissed further education teacher against Tameside College and three of its employees, are struck out or subject to summary judgment for the defendants.

The claimant was initially suspended after evidence came to light that he had been dismissed from previous roles. The College’s investigation involved the sending of reference requests to two previous employers, and was also informed by disclosures of Facebook and WhatsApp messages which revealed the teacher had, contrary to instruction, communicated with students on social media whilst suspended, and “sent a threatening message to a WhatsApp Group chat comprising members of staff”.

The deputy master found that, in relation to the misuse of private information claims, although the claimant had a reasonable expectation of privacy in the social media messages, “those expectations were greatly outweighed by the need to investigate those messages for the purposes of the disciplinary process”. Those claims were accordingly subject to summary judgment for the defendants.

The data protection and human rights claims against individual employees were bound to fail, as they were neither data controllers nor public authorities.

As to the data protection claim against the college, the deputy master held that a previous determination by the ICO (that the sending of the reference requests was not fair and transparent, because it was contrary to the claimant’s expectations) was wrong: it was “plain that it ought to have been well within the Claimant’s reasonable expectation that, in order to investigate whether he had failed to disclose the fact of his dismissal from those two institutions, each would be contacted and asked about it.”

The college’s processing was lawful under Article 6(1)(b) and (c) of the UK GDPR: “The processing was necessary for the purposes of the contract of employment between the [college] and the Claimant and for the performance of the [college’s] obligations to its other staff, and to safeguard and promote the welfare of its students.” The various safeguarding legal duties and obligations on the college established a clear legal basis for the processing.

Similarly, the human rights claims against the college, which included complaints of unlawful monitoring and surveillance, were bound to fail: “There is no real prospect of establishing a breach of Article 8 for the same reasons that there is no real prospect of establishing misuse of private information. The alleged breaches of Articles 10 and 11 appear to relate to the College’s instructions to the Claimant not to communicate with other staff except with permission. The instruction was plainly a reasonable one made for a legitimate purpose.”

Accordingly, the data protection and Human Rights Act claims were struck out.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, employment, Further education, human rights, Information Commissioner, judgments, LinkedIn Post, misuse of private information

Pacini & Geyer v Dow Jones – at the interface between libel and data protection

[reposted from LinkedIn]

This is an important judgment on preliminary issues (the second preliminary issues judgment in the case – the first was on an unsuccessful strike out application by the defendants) in a data protection claim brought by two businessmen against Dow Jones, in relation to articles in the Wall Street Journal in 2017 and 2018. The claim is for damages and for erasure of personal data which is said to be inaccurate.

It is believed to be the first time that a court has been required to determine the meaning of personal data as a preliminary issue in an accuracy claim under data protection law.

Determination of meaning is, of course, something that is common in defamation claims. The judgment is a fascinating, but complex, analysis of the parallels between determining the meaning of personal data in a publication and determining the meaning of allegedly defamatory statements in a publication. Although the judge is wary of importing rules of defamation law, such as the “single meaning rule” and the “repetition rule”, a key part of the discussion is taken up by them.

The single meaning rule, whereby “the court must identify the single meaning of a publication by reference to the response of the ordinary reader to the entire publication” (NT 1 & NT 2 v Google LLC [2018] EWHC 799 (QB)) is potentially problematic in a data protection claim such as this where the claimants argue that it is not the ordinary reader they are concerned about, but a reader who might be a potential business investor.

Similarly, it is not at all clear that the repetition rule, which broadly seeks to close a loophole by which someone argues “but I’m only reporting what someone else said – their words might be defamatory, but mine merely report the fact that they said them”, should carry over to data protection claims, not least because what will matter in defamation claims is the factual matrix at the time of publication, whereas with data protection claims “a claim for inaccuracy may be made on the basis that personal data are inaccurate at the time of the processing complained of, including because they have become misleading or out of date, regardless of whether they were accurate at the time of original publication. In that event, what matters is the factual matrix at the time when relief is sought” (at 66).

Nonetheless, and in a leap I can’t quite follow on first reading of the judgment, but which seems to be on the basis that the potential problems raised can be addressed at trial when fairness of processing (rather than accuracy) arises, the judge decides to determine meaning on a single meaning/repetition rule basis (at 82-84).

There’s a huge amount to take in though, and the judgment demands close reading (and re-reading). If a full trial and judgment ensue, the case will probably be a landmark one.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under accuracy, Data Protection, Data Protection Act 2018, judgments, UK GDPR

ICO, Clearview AI and Tribunal delays

[reposted from LinkedIn]

On 28 October the Information Commissioner’s Office (ICO) made the following statement in respect of the November 2023 judgment of the First-tier Tribunal allowing Clearview AI’s appeal against the ICO’s £7.5m fine, and posted it in an update to its original announcement about appealing:

The Commissioner has renewed his application for permission to appeal the First-tier Tribunal’s judgment to the Upper Tribunal, having now received notification that the FTT refused permission of the application filed in November 2023.

It is extraordinary that it has taken 11 months to get to this point.

So what does this mean?

If a party (here, the ICO) wishes to appeal a judgment of the First-tier Tribunal (FTT) to the next level, the Upper Tribunal (UT), it must first make an application to the FTT itself, which must decide “as soon as practicable” whether to grant permission to appeal its own judgment (rules 42 and 43 of the Tribunal Procedure (First-tier Tribunal) (General Regulatory Chamber) Rules 2009).

If the FTT refuses permission to appeal (as has happened here), the application may be “renewed” (i.e. made again) directly to the UT itself (rule 21(2) of the Tribunal Procedure (Upper Tribunal) Rules 2008).

So, here, after 11 months (“as soon as practicable”?), the ICO has just had its initial application refused, and is now going to make an application under rule 21(2) of the UT Rules.

The ICO’s wording in its statement is slightly odd though: it talks of “having now received notification” that the FTT “refused” (not, say, “has now refused”) the November 2023 application. The tense used half implies that the refusal happened at the time and they’ve only just been told. If so, something must have gone badly wrong at the Tribunal.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, GDPR, Information Commissioner, Information Tribunal, judgments, Upper Tribunal

Data (Use and Access) Bill – some initial thoughts

By me, on the Mishcon de Reya website.


Filed under Data Protection, Data Protection Bill, Information Commissioner, Open Justice, ROPA, subject access

Harassment of terrorism victims

[reposted from LinkedIn]

It is impossible to imagine claimants with whom one has more sympathy than Martin Hibbert and his daughter Eve, who each suffered grave, life-changing injuries in the 2017 Manchester Arena attack, and who then found themselves targeted by the bizarre and ghoulish actions of Richard Hall, a “conspiracy theorist” who has claimed the attack was in fact a hoax.

Martin and Eve brought claims in harassment and data protection against Hall, and, in a typically meticulous judgment, Mrs Justice Steyn DBE yesterday found comprehensively in their favour on liability in the harassment claim. Further submissions are now invited on remedies.

The data protection claim probably adds nothing, but for those pleading and defending such claims it is worth reading Steyn J’s (mild) criticisms of the flaws, on both sides, at paragraphs 246-261. She has also invited further submissions on the data protection claim, although one wonders if it will be pursued.

Other than that, though, one hopes this case consigns Hall to the dustbin of history.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, judgments, UK GDPR