Tag Archives: UK GDPR

ICO fines: are you certain?

In his inaugural speech as Information Commissioner, in 2022, John Edwards said

my focus is on bringing certainty in what the law requires of you and your organisations, and in how the regulator acts

It’s a message he’s sought to convey on many occasions since. No surprise: it’s one of the Commissioner’s tasks under the Regulators’ Code to

improve confidence in compliance for those they regulate, by providing greater certainty

This isn’t the place or the time for a broad analysis of how well the ICO has measured up to those standards, but I want to look at one particular example of where there appears to be some uncertainty.

In March 2024, the ICO fined the Central YMCA £7,500 for serious contraventions of the UK GDPR. In announcing the fine, the ICO said that it would have been £300,000 but that “this was subsequently reduced in line with the ICO’s public sector approach” (the policy decision whereby “fines for public sector bodies are reduced where appropriate”). When questioned why a charity benefited from the public sector approach, the ICO stated that

Central YMCA is a charity that does a lot of good work, they engaged with us in good faith after the incident happened, recognised their mistake immediately and have made amends to their processing activities…the fine is in line with the spirit of our public sector approach

So the charity sector might have reasonably drawn from this that, in the event that another charity doing a “lot of good work” seriously contravened the UK GDPR, but engaged in good faith with the ICO and made amends to its processing activities, it would also benefit from the public sector approach, with a similar reduction of around 97.5% in any fine.
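Purely by way of illustration, the arithmetic of that reduction, using the figures quoted above from the ICO's announcement:

```python
# Illustrative arithmetic only, using the figures quoted above.
indicated_fine = 300_000  # fine the ICO said it would otherwise have imposed (£)
actual_fine = 7_500       # fine actually imposed on the Central YMCA (£)

reduction = (1 - actual_fine / indicated_fine) * 100
print(f"Reduction under the public sector approach: {reduction:.1f}%")
# Reduction under the public sector approach: 97.5%
```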

However, on 28 July, the Scottish charity Birthlink was fined £18,000 by the ICO for serious contraventions of the UK GDPR, but the ICO did not apply the public sector approach. When I questioned why, the answer merely confirmed that the approach had not been applied, and that the ICO had instead applied its Fining Guidance. Admittedly, Birthlink did not recognise the seriousness of its contraventions for around two years, but that was not mentioned in the ICO’s answer.

I was also referred to the consultation on continuing the public sector approach, which ran earlier this year. That consultation explained that the proposal was not to apply the public sector approach to charities in the future, because the ICO would have regard to the definition of “public authority” and “public body” at section 7 of the Data Protection Act 2018, which, for obvious reasons, doesn’t include charities.

However, the outcome of that consultation has not been announced yet, and the ICO site says

In the meantime, we will continue to apply the approach outlined by the Commissioner in his June 2022 open letter.

As that current approach is the one under which the ICO applied great leniency to the Central YMCA, the question therefore remains – why did Birthlink not also benefit from it?

And there’s a wider question: the definition of a public body/authority at section 7 of the Data Protection Act 2018 has been in effect since 2018. Why did the ICO think, in 2024, that section 7 was not relevant, and that a (wealthy) charity should qualify for the public sector approach, but then decide that another (much less wealthy) charity shouldn’t, when facing a fine only a few months later?

The answers are far from certain.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under consistency, Data Protection Act 2018, fines, Information Commissioner, monetary penalty notice, UK GDPR

What the DUAA 2025 will do

Section 1(2) of the Data Protection Act 2018 tells us that

Most processing of personal data is subject to the UK GDPR

Despite the attention given to the progress of the Data (Use and Access) Act 2025 (and I have certainly given it a lot), now that it has passed, its significance for data protection practitioners is essentially only in how it will amend the three core legislative instruments relevant to their practice area: the UK GDPR, the DPA 2018, and PECR.

The DUAA is (in data protection law terms) mostly an amending statute: once its provisions have commenced, their relevance lies in how they amend those three core texts.

How that amending is done in practice is important to note.

When a piece of legislation is amended, Parliament doesn’t re-enact it, so the “official” printed version remains as originally enacted. In pre-internet days this meant that practitioners had to read the original instrument and the amending instrument side by side, and note what changes applied. This was generally done with the assistance of legal publishers, who might print “consolidated” versions of the original instrument with, effectively, the amendments showing in mark-up.

In the internet age, things haven’t changed in substance, but it’s very much easier to read the consolidated versions. If, for example, you go to the legislation.gov.uk website and look at the DPA 2018, you can view it in the “Original (as enacted)” version and the “Latest available” version. In the latter, for instance, you can see that “GDPR” was amended to “UK GDPR”, with a footnote explaining that this was effected by The Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019.

The DUAA has not been published yet (and remember that many of its provisions won’t come into immediate effect, but will require secondary legislation to “commence” them into effect), but once it is, and once the clever people who maintain the legislation website have done their thing, most practitioners won’t need to refer to the DUAA: they should, instead, refer to the newly amended, consolidated versions of the UK GDPR, the DPA 2018 and PECR.

And also remember, “Most processing of personal data is [still] subject to the UK GDPR”.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data (Use and Access) Act, Data (Use and Access) Bill, Data Protection, Legislation, UK GDPR

Defamation rules are applied to UK GDPR claim

An interesting recent judgment in the High Court considers the extent to which rules in defamation law might also apply to data protection claims.

In July 2024 His Honour Judge Lewis struck out a claim in defamation brought by Dale Vince against Associated Newspapers. The claim arose from a publication in the Daily Mail (and through the Mail+ app). The article reported that the Labour Party had returned a £100,000 donation made by another person, who was said to be “a high-flying City financier accused of sex harassment”, but also said that the claimant had donated £1.5m to the Labour Party, but then caused the Party embarrassment by joining an “eco-protest” in London, which had blocked traffic around Parliament Square. The article had the headline “Labour repays £100,000 to ‘sex harassment’ donor”, followed by eleven paragraphs of text, two photographs of the claimant and the caption “Road blockers: Dale Vince in London yesterday, and circled as he holds up traffic with Just Stop Oil”.

The strike-out succeeded on the basis that a claim in libel “may not be founded on a headline, or on headlines plus photographs and captions, in isolation from the related text, and it is impermissible to carve the readership into different groups, those who read only headlines (or headlines and captions) and those who read the whole article”, following the rule(s) in Charleston v News Group Newspapers Ltd [1995] 2 AC 65 (the wording quoted is from the defendant’s strike-out application). When the full article was read, as the claimant conceded, the ordinary reader would appreciate very quickly that he was not the person being accused of sexual harassment.

A subsequent claim by Mr Vince, in data protection, under the UK GDPR, has now also been struck out (Vince v Associated Newspapers [2025] EWHC 1411 (KB)). This time, the strike-out succeeded on the basis that, although the UK GDPR claim was issued (though not served) prior to the handing down of judgment in the defamation claim, Mr Vince not only could, but should, have brought it earlier:

There was every reason why the UKGDPR and defamation claims should have been brought in the same proceedings. Both claims arose out of the same event – the publication of the article in Mail+ and the Daily Mail. Both claims rely on the same factual circumstances, namely the juxtaposition of the headline, photographs and caption, and the contention that the combination of the headline and the photograph created the misleading impression that Mr Vince had been accused of sexual harassment. In one claim this was said to be defamatory, in the other the misleading impression created was said to comprise unfair processing of personal data

This new claim was, said Mr Justice Swift, an abuse of process – a course which would serve only “to use the court’s process in a way that is unnecessary and is oppressive to Associated Newspapers”.

Additionally, the judge would have granted Associated Newspapers’ application for summary judgment, on the grounds that the rule in Charleston would have applied to the data protection claim as it had to the defamation claim:

in the context of this claim where the processing relied on takes the form of publication, the unfairness relied on is that a headline and photographs gave a misleading impression, and the primary harm caused is said to be reputational damage, the law would be incoherent if the fairness of the processing was assessed other than by considering the entirety of what was published

This last point, although, strictly, obiter, is an important one: where a claim of unfair processing, by way of publication of personal data, is brought in data protection, the courts are likely to demand that the entirety of what was published be considered, and not just personal data (or parts of personal data) in isolation.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, defamation, fairness, judgments, UK GDPR

Good Law Project v Reform

In the run-up to last year’s General Election, the campaigning group The Good Law Project (GLP) actively encouraged people to make subject access requests (under Article 15 of the UK GDPR) to political parties, and they say that they enabled 13,000 people to do so.

The GLP says that the Reform Party “replied to hardly anyone”, and as a result it is bringing the first ever case in the UK under Article 80(1) of the UK GDPR, whereby a data subject (or subjects) mandates a representative organisation to bring an Article 79 claim on their behalf.

Helpfully, the GLP has published both its own particulars of claim, and, now, Reform’s defence to the claim. The latter is particularly interesting, as its initial approach is to threaten to apply to strike out the claim on the grounds that the GLP does not meet the criteria for a representative body, as laid out in section 187 of the Data Protection Act 2018.

Given the nature of the two parties (one a bullish campaign group, the other a bullish political party) it seems quite likely that this will proceed to trial. If so, we should get some helpful clarification on how Article 80(1) should operate.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Article 80, Data Protection Act 2018, political parties, UK GDPR

O’Carroll v Meta – what now for targeted adverts on Facebook

Following the news that claimant Tanya O’Carroll and defendant Meta have settled ahead of what was likely to be a landmark data protection case, what are the implications?

Ms O’Carroll argued that advertising served to her on Facebook, because it was targeted at her, met the definition of “direct marketing” under section 122(5) of the Data Protection Act 2018 (“the communication (by whatever means) of advertising or marketing material which is directed to particular individuals”) and thus the processing of her personal data for the purposes of serving that direct marketing was subject to the absolute right to object under Article 21(2) and (3) UK GDPR.

Meta had disputed that the advertising was direct marketing.

The “mutually agreed statement” from Ms O’Carroll says “In agreeing to conclude the case, Meta Platforms, Inc. has agreed that it will not display any direct marketing ads to me on Facebook, will not process my data for direct marketing purposes and will not undertake such processing (including any profiling) to the extent it is related to such direct marketing”.

One concludes from this that Meta will, at least insofar as the UK GDPR applies to its processing, now comply with any Article 21(2) objection, and, indeed, that is how it is being reported.

But will the upshot of this be that Meta will introduce ad-free services in the UK, but for a charge (because its advertising revenues will be likely to drop if people object to targeted ads)? It is indicating so, with a statement saying “Facebook and Instagram cost a significant amount of money to build and maintain, and these services are free for British consumers because of personalised advertising. Like many internet services, we are exploring the option of offering people based in the UK a subscription and will share further information in due course”.

The ICO intervened in the case and has uploaded a summary of its arguments, which were supportive of Ms O’Carroll’s case, and her lawyers AWO Agency have also posted an article on the news.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Data Protection Act 2018, facebook, Information Commissioner, marketing, Meta, Right to object, UK GDPR

Consent is not the only basis

In 2017 I attended a free event run by a “GDPR consultancy”. The presenter confidently told us that we were going to have to get consent from customers in order to process their personal data. One attendee said they worked at the DWP, so how were they going to get consent from benefits claimants who didn’t want to disclose their income, to which the presenter rather awkwardly said “I think that’s one you’ll have to discuss with your lawyers”. Another attendee, who was now most irritated that he’d taken time out from work for this, could hold his thoughts in no longer, and rudely announced that this was complete nonsense.

That attendee was the – much ruder in those days – 2017 version of me.

I never imagined (although I probably should have done) that eight years on the same nonsense would still be spouted.

Just as the Data Protection Act 2018 did not implement the GDPR in the UK (despite the embarrassing government page that, until recently, and despite people raising it countless times, said so), and just as the GDPR does not limit its protections to “EU citizens”, so the GDPR and the UK GDPR do not require consent for all processing.

Anyone who says so has not applied a smidgeon of thought or research to the question, and is probably taking content from generative AI, which, on the time-honoured principle of garbage-in, garbage-out, has been in part trained on the existing nonsense. To realise why it’s garbage, they should just start with the DWP example above and work outwards from there.

Consent is one of the six lawful bases, any one or more of which can justify processing. No one basis is better than, or takes precedence over, the others.

To those who know this, I apologise for having to write it down, but I want to have a sign to tap for any time I see someone amplifying the garbage on LinkedIn.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Data Protection Act 2018, DWP, GDPR, Let's Blame Data Protection, UK GDPR

Can directors and trustees of charities be controllers?

[reposted from LinkedIn]

Savva v Leather Inside Out & Ors [2024] EWHC 2867 (KB), Sam Jacobs of Doughty Street Chambers, instructed by Forsters LLP for the defendants (the applicants in the instant application)

Is it the case that a director or trustee of a charity (which is a controller) cannot be a controller? That, in effect, was one of the grounds of an application by two defendants for strike out and summary judgment in a claim arising from alleged failures to comply with subject access requests.

The claim arises from a dispute between the claimant, a former prisoner, employed by a subsidiary of a charity (“Leather Inside Out” – currently in administration), and the charity itself. The claim is advanced against the charity, but also against the charity’s founder and two trustees, who are said on the claim form to be controllers of the claimant’s data, in addition to, or jointly with, the charity.

In a solid judgment, Deputy Master Alleyne refused to accept that such natural persons were not capable of being a controller: the term is given a broad definition in Article 4(7) UK GDPR, and “includes a natural or legal person, public authority, agency or other body and that there may be joint controllers. On plain reading of the provisions, it is incorrect to suggest that an allegation of joint controllers is, per se, not a legally recognisable claim” (re Southern Pacific Loans applied).

However, on the specific facts of this case, the pleading of the claimant (the respondent to the strike out application) failed “to allege any decisions or acts in respect of personal data which were outside the authority of the trustees as agents for [the charity]…the Respondent’s submissions demonstrated he wrongly conflated the immutable fact that a legal person must have a natural person through whom its decisions are carried into effect, with his case that the natural person must be assuming the defined status of data controller in their personal capacity”. That was not the case here – the founder and the trustees had not acted other than as agents for the charity.

Accordingly, the strike out application succeeded (notably, though, the Deputy Master said he had reached his conclusion “not without some caution”).

Assuming the claim goes forward to trial, therefore, it can only be advanced against the charity, as sole controller.


The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under charities, controller, Data Protection, judgments, subject access, UK GDPR

Pacini & Geyer v Dow Jones – at the interface between libel and data protection

[reposted from LinkedIn]

This is an important judgment on preliminary issues (the second preliminary issues judgment in the case – the first was on an unsuccessful strike out application by the defendants) in a data protection claim brought by two businessmen against Dow Jones, in relation to articles in the Wall Street Journal in 2017 and 2018. The claim is for damages and for erasure of personal data which is said to be inaccurate.

It is believed to be the first time in a data protection claim that a court has been required to determine the meaning of personal data as a preliminary issue in an accuracy claim.

Determination of meaning is, of course, something that is common in defamation claims. The judgment is a fascinating, but complex, analysis of the parallels between determining the meaning of personal data in a publication and determining the meaning of allegedly defamatory statements in a publication. Although the judge is wary of importing rules of defamation law, such as the “single meaning rule” and the “repetition rule”, a key part of the discussion is taken up by them.

The single meaning rule, whereby “the court must identify the single meaning of a publication by reference to the response of the ordinary reader to the entire publication” (NT 1 & NT 2 v Google LLC [2018] EWHC 799 (QB)) is potentially problematic in a data protection claim such as this where the claimants argue that it is not the ordinary reader they are concerned about, but a reader who might be a potential business investor.

Similarly, it is not at all clear that the repetition rule, which broadly seeks to avoid a defamatory loophole by which someone argues “but I’m only reporting what someone else said – their words might be defamatory, but mine merely report the fact that they said them”, should carry over to data protection claims, not least because what will matter in defamation claims is the factual matrix at the time of publication, whereas with data protection claims “a claim for inaccuracy may be made on the basis that personal data are inaccurate at the time of the processing complained of, including because they have become misleading or out of date, regardless of whether they were accurate at the time of original publication. In that event, what matters is the factual matrix at the time when relief is sought” (at 66).

Nonetheless, and in a leap I can’t quite follow on first reading of the judgment, but which seems to be on the basis that the potential problems raised can be addressed at trial when fairness of processing (rather than accuracy) arises, the judge decides to determine meaning on a single meaning/repetition rule basis (at 82-84).

There’s a huge amount to take in though, and the judgment demands close reading (and re-reading). If a full trial and judgment ensue, the case will probably be a landmark one.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under accuracy, Data Protection, Data Protection Act 2018, judgments, UK GDPR

Harassment of terrorism victims

[reposted from LinkedIn]

It is impossible to imagine claimants with whom one has more sympathy than Martin Hibbert and his daughter Eve, who each suffered grave, life-changing injuries in the 2017 Manchester Arena attack, and who then found themselves targeted by the bizarre and ghoulish actions of Richard Hall, a “conspiracy theorist” who has claimed the attack was in fact a hoax.

Martin and Eve brought claims in harassment and data protection against Hall, and, in a typically meticulous judgment, Mrs Justice Steyn DBE yesterday gave judgment comprehensively in their favour on liability in the harassment claim. Further submissions are now invited on remedies.

The data protection claim probably adds nothing, but for those pleading and defending such claims it is worth reading Steyn J’s (mild) criticisms of the flaws, on both sides, at paragraphs 246-261. She has also invited further submissions on the data protection claim, although one wonders if it will be pursued.

Other than that, though, one hopes this case consigns Hall to the dustbin of history.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, judgments, UK GDPR

CrowdStrike and personal data breaches: loss vs unavailability

I ran a poll on LinkedIn in recent days which asked “If a controller temporarily can’t access personal data on its systems because of the CrowdStrike/MSFT incident is it a personal data breach?”

I worded the question carefully.

50% of the 100-odd people who voted said “no” and 50% said “yes”. The latter group are wrong. I say this with some trepidation because there are people in that group whose opinion I greatly respect. 

But here’s why they, and, indeed, the Information Commissioner’s Office and the European Data Protection Board, are wrong.

Article 4(12) of the GDPR/UK GDPR defines a “personal data breach”. This means that it is a thing in itself. And that is why I try always to use the full term, or abbreviate it, as I will here, to “PDB”. 

This is about the law, and in law, words are important. To refer to a PDB as the single word “breach” is a potential cause of confusion, and both the ICO and the EDPB guidance are infected by and diminished by sloppy conflation of the terms “personal data breach” and “breach”. In English, at least, and in English law, the word “breach” will often be used to refer to a contravention of a legal obligation: a “breach of the law”. (And in information security terminology, a “breach” is generally used to refer to any sort of security breach.) But a “breach” is not coterminous with a “personal data breach”.

And a PDB is not a breach of the law: it is a neutral thing. It is also crucial to note that nowhere do the GDPR/UK GDPR say that there is an obligation on a person (whether controller or processor) not to experience a PDB, and nowhere do GDPR/UK GDPR create liability for failing to prevent one occurring. This does not mean that where a PDB has occurred because of an infringement of other provisions which do create obligations and do confer liability (primarily Article 5(1)(f) and Article 32) there is no potential liability. But not every PDB arises from an infringement of those provisions.

The Article 4(12) definition is “a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed”. Let us break that down:

  • A breach of security…
  • leading to [one or more of]
  • accidental or unlawful…
  • 1. destruction of…
  • 2. loss of…
  • 3. alteration of…
  • 4. unauthorised disclosure of…
  • 5. unauthorised access to…
  • personal data processed.

If an incident is not a breach of security, then it’s not a PDB. And if it is a breach of security but doesn’t involve personal data, it’s not a PDB. But even if it is a breach of security, and involves personal data, it’s only a PDB if one of the eventualities I’ve numbered 1 to 5 occurs.

Note that nowhere in 1 to 5 is there “unavailability of…” or “loss of access to…”. 
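By way of illustration only (the names and structure here are mine, not the legislation's), the breakdown above can be sketched as a simple predicate:

```python
# Purely illustrative: a sketch of the Article 4(12) definition as broken down
# above. The identifiers and structure are mine, not the legislation's.

QUALIFYING_EVENTS = {
    "destruction",              # 1. destruction of personal data
    "loss",                     # 2. loss of personal data
    "alteration",               # 3. alteration of personal data
    "unauthorised_disclosure",  # 4. unauthorised disclosure of personal data
    "unauthorised_access",      # 5. unauthorised access to personal data
}

def is_personal_data_breach(breach_of_security: bool,
                            involves_personal_data: bool,
                            events: set[str]) -> bool:
    """An incident is a PDB only if it is a breach of security, involves
    personal data, and leads to at least one of the five listed events."""
    return (breach_of_security
            and involves_personal_data
            and bool(events & QUALIFYING_EVENTS))

# On the argument in this post, temporary unavailability is not one of the five:
print(is_personal_data_breach(True, True, {"temporary_unavailability"}))  # False
print(is_personal_data_breach(True, True, {"loss"}))                      # True
```

Whether "loss" should be read to include "loss of availability" is, of course, exactly the point in dispute.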

Now, both the ICO and the EDPB read into the words “loss of…personal data…” the meaning, or potential meaning, “loss of availability of personal data”. But in both cases they appear to do so in the context of saying, in terms, “loss of availability is Article 4(12) ‘loss’ because it can cause harm to data subjects”. I don’t dispute, and nor will many millions of people affected by the CrowdStrike incident, that unavailability of personal data can cause harm. But to me, “loss” means loss: I had something, and I no longer have it. I believe that that is how a judge in the courts of England and Wales would read the plain words of Article 4(12), and decide that if the legislator had intended “loss” to mean something more than the plain meaning of “loss” – so that it included a meaning of “temporary lack of access to” – then the legislator would have said so.

Quite frankly, I believe the ICO and EDPB guidance are reading into the plain wording of the law a meaning which they would like to see, and they are straining that plain wording beyond what is permissible.

The reason, of course, that this has some importance is that Article 33 of the GDPR/UK GDPR provides that “in the case of” (note the neutral, “passive” language) a PDB, a controller must in general make a notification to the supervisory authority (which, in the UK, is the ICO), and Article 34 provides that where a PDB is likely to result in a high risk to the rights and freedoms of natural persons, those persons should be notified. If a PDB has not occurred, no obligation to make such notifications arises. That does not mean, of course, that notifications cannot be made, through an exercise of discretion (let’s forget for the time being – because they silently resiled from the point – that the ICO once bizarrely and cruelly suggested that unnecessary Article 33 notifications might be a contravention of the GDPR accountability principle).

It might well be that the actions or omissions leading to a PDB would constitute an infringement of Articles 5(1)(f) and 32, but if an incident does not meet the definition in Article 4(12), then it’s not a PDB, and no notification obligation arises. (Note that this is an analysis of the position under the GDPR/UK GDPR – I am not dealing with whether notification obligations to any other regulator arise.)

I can’t pretend I’m wholly comfortable saying to 50% of the data protection community, and to the ICO and EDPB, that they’re wrong on this point, but I’m comfortable that I have a good arguable position, and that it’s one that a judge would, on balance, agree with.

If I’m right, maybe the legislator of the GDPR/UK GDPR missed something, and maybe availability issues should be contained within the Article 4(12) definition. If so, there’s nothing to stop both the UK and the EU legislators amending Article 4(12) accordingly. And if I’m wrong, there’s nothing to stop them amending it to make it more clear. In the UK, in particular, with a new, energised government, a new Minister for Data Protection, and a legislative agenda that will include bills dealing with data issues, this would be relatively straightforward. Let’s see.

And I would not criticise any controller which decided it was appropriate to make an Article 33 notification. It might, on balance, be the prudent thing for some affected controllers to do. The 50/50 split on my poll indicates the level of uncertainty on the part of the profession. One also suspects that the ICO and the EU supervisory authorities might get a lot of precautionary notifications.

Heck, I’ll say it – if anyone wants to instruct me and my firm to advise, both on law and on legal strategy – we would of course be delighted to do so.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, EDPB, GDPR, Information Commissioner, Let's Blame Data Protection, LinkedIn Post, personal data breach, UK GDPR