Category Archives: UK GDPR

“Consent” must be assessed objectively, says Court of Appeal

The Court of Appeal has handed down an important judgment (RTM v Bonne Terre Ltd & Anor [2026] EWCA Civ 488) on the meaning of “consent” in the context of data protection and ePrivacy law, and overturned what had been a problematic prior judgment of the High Court, which had left many businesses, especially those in the betting and gaming sector, facing an “ineradicable” risk of claims that potentially they could not reasonably have defended. The Court of Appeal’s judgment will no doubt be seen by those businesses as a welcome reversal, providing greater legal certainty.

So what does “consent” mean, in the data protection statutory scheme?

Article 4(11) of the UK GDPR says it means

any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her

And Article 7 puts the onus on the data controller to prove that the standard has been met.

Regulation 2(1) of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”, which deal with the sending of direct electronic marketing to individuals and the use of cookies and similar technologies) adopts the Article 4(11) UK GDPR definition (and applies it to “subscribers” and “users” as opposed to data subjects).

So far, so straightforward. But what happens in the case of someone who purports, by a clear affirmative action such as ticking a box, to give a specific, informed and unambiguous indication of his wishes, signifying agreement to the processing of personal data relating to him and to the receiving of direct electronic marketing, but who later argues that the consent was vitiated by factors of which the data controller/sender of the marketing was unaware, and could not reasonably have been aware? Put another way, is the sending of direct electronic marketing on the basis of the objectively valid consent of someone who was subjectively incapable of giving valid consent to be treated as lawfully sent?

“No”, said the High Court at first instance. RTM was someone who, on his own submission, had gambled in circumstances, and to a degree, he described as “compulsive, out of control and destructive”, and who claimed, in data protection and in misuse of private information, for damages, on the basis that, as he argued, Bonne Terre (operating as Sky Betting and Gaming, or “SBG”)

gathered and used extensive information, generated by his use of its platforms, unlawfully…especially by way of personalised and targeted marketing which he could not handle and which fed his compulsive behaviour

Mrs Justice Collins Rice DBE had held in her judgment (even though RTM had not pleaded in these terms) that even though RTM had not lacked capacity to consent, and “that he wanted the direct marketing material – even perhaps craved it”, he was one of a small subset (an “irreducible minimum”) of “individuals for whom decision-making…was already out of control in relation to gambling, and for whom the consenting mechanisms and information provision meant nothing other than barriers to gambling to be overcome”. This was so even though SBG had adopted controls in line with gambling regulatory requirements and expectations to avoid the risk of marketing to “problem gamblers” (the judge’s words), and even though these controls “can and do help manage and minimise the particular risks of direct marketing to online gamblers…they cannot and do not eliminate them”. The reason was that RTM “lacked subjective consent”; “the autonomous quality of his consenting behaviour was impaired to a real degree”; “the quality of [his] consenting was rather lower than the standard required”; and it was “insufficiently freely given”.

The first instance judgment had presented all businesses, but especially those in the betting and gaming sector, with a problem and a risk: i) how could they establish in each case the subjective aspect of a data subject’s consent? and ii) if they could not establish that subjective aspect, how could they deal with the risk that marketing which would, on the face of it, be lawfully sent would be held not to be, if the recipient was one of the irreducible minimum whose consent was not, subjectively, valid? Perhaps unsatisfactorily, the judge had said that this was

a risk which is ultimately ineradicable. Problem gamblers may not always be easy to recognise, and there will always be relevant information about them which is ultimately unreachable by the provider, and properly so because it is information which is itself in the private domain

The Court of Appeal has now roundly overturned the decision. Giving the main judgment, Lord Justice Warby revisited what a data controller must be able to demonstrate in circumstances where consent is said to be present: the controller must “show that the data subject made a statement or took some other clear affirmative action…that ‘signifies agreement’”, and must also prove that “the data subject’s ‘indication’ met each of the four criteria prescribed by the legislation, namely that it was (i) freely given, (ii) specific, (iii) informed, and (iv) unambiguous”. All of these, he holds, are objective tests: “the data controller does not have to prove what was actually in the mind of the individual data subject at the time of the ‘indication’”.

In a classic example of judicial understatement, Warby LJ noted that the effect of the decision of the judge below was to establish a “principle that decisions deliberately made by a capacitous individual may nonetheless be vitiated for lack of consent” and further noted that it was a “legally novel” principle, whose “contours are not clear to me”.

Recitals 4 and 7 of the UK GDPR are relevant here. The first reminds us that

The right to the protection of personal data is not an absolute right; it must be considered in relation to its function in society and be balanced against other fundamental rights

and the second reminds us that

Legal and practical certainty for natural persons, economic operators and public authorities should be enhanced

As Warby LJ notes, an “inevitable corollary” of the original ruling would be that a business “could not guarantee its ability to ‘demonstrate’ conformity with the consent requirements of data protection law and PECR”, and

the unsatisfactory and ultimately opaque nature of the test for legally effective consent which the judge applied…would create considerable legal and practical uncertainty for economic operators

Absent a further appeal by RTM, which would need to be to the Supreme Court, and which would seem unlikely, the Court of Appeal has now gone a long way towards restoring legal and practical certainty as to the meaning of “consent” in data protection law, and how data controllers should approach the task of gathering and proving consent.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under consent, controller, cookies, Data Protection, judgments, marketing, PECR, UK GDPR

Extension to right to erasure

The Victims and Prisoners Act 2024 (Commencement No. 10) and Data (Use and Access) Act 2025 (Commencement No. 8) Regulations 2026 were made on 31 March 2026, bringing into effect an important amendment to the UK GDPR right to erasure.

In May 2024 Parliament enacted section 31 of the Victims and Prisoners Act 2024. Section 31 inserts a new Article 17(1)(g) into the UK GDPR, which invokes the right to erasure in the case of certain unfounded malicious allegations, where:

the personal data have been processed as a result of an allegation about the data subject—
(i) which was made by a person who is a malicious person in relation to the data subject (whether they became such a person before or after the allegation was made),
(ii) which has been investigated by the controller, and
(iii) in relation to which the controller has decided that no further action is to be taken

New Article 17(4) defines a “malicious person” as one who has been convicted of a specified offence or who is subject to a stalking protection order.

At the same time, paragraph 32 of Schedule 11 to the Data (Use and Access) Act 2025, which extends the same provisions to Scotland and Northern Ireland, is also commenced.

The provisions were introduced to the 2024 Act by way of an amendment by Stella Creasy MP, informed in part by her own experiences of an entirely false and malicious allegation, and difficulties with expunging records of it (see here).


Filed under accuracy, Data (Use and Access) Act, Data Protection, erasure, Legislation, UK GDPR

DUAA commencement – what’s hot and what’s not

I’ve written for the Mishcon de Reya website on the commencement, on 5 February, of the majority of the data protection and ePrivacy provisions of the Data (Use and Access) Act 2025:

https://www.mishcon.com/news/data-protection-and-electronic-privacy-reform-whats-hot-and-whats-not


Filed under charities, Data (Use and Access) Act, Data Protection, Data Protection Act 2018, marketing, PECR, UK GDPR

CoA: County Court is appropriate forum for routine data protection claim

This is a helpful short Court of Appeal judgment on the appropriate forum for a data protection claim of relatively low value and limited complexity (spoiler: it’s the County Court, folks).

The claimant had originally incorrectly issued his claim as a High Court media and communications claim in the Cardiff District Registry (if data protection claims are to be issued in the High Court, they must be issued in the King’s Bench Division at the Royal Courts of Justice). The judge in the High Court in Cardiff transferred the claim to the County Court but his order arguably contained insufficient reasons, and did not explain that either party could apply to have it set aside or varied (as required by CPR 3.3(5)(b)). The claimant tried to make representations, by way of an email, as to why the High Court was the appropriate forum, but this was rejected on the basis that it had been filed in the wrong court. By that stage, the transfer to the County Court had taken effect. Accordingly, the matters arising could only be determined by way of appeal.

In its determination, the CoA found that the case (involving disclosure, in separate proceedings, of medical information by a court security guard to an usher and a solicitor for a third party) did not appear to involve any factual or legal complexity, and the claimed sum of £30,000 was clearly within the ambit of the County Court.

(I interject here to observe that, on the brief facts as recorded in the judgment, there might have been some legal complexity – it seems likely that the disclosure would have been made orally by the security guard, so was there “processing” involved?)

Wysokinski v OCS Security Ltd [2026] EWCA Civ 26 

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Data Protection Act 2018, judgments, UK GDPR

NCND for personal data – a qualified exemption?

[reposted from my LinkedIn Account]

I’ve been known to criticise First-tier Tribunal (FTT) judgments in the freedom of information jurisdiction. By contrast, this one is superb.

In it, the FTT dismantle the argument (and the decision notice) of the Information Commissioner’s Office that Bolton NHS Foundation Trust were entitled to “neither confirm nor deny” (NCND) holding reviews, including a review by PWC, into the Trust’s governance and management. The PWC review was the subject of an article in the Health Service Journal, and the requester was the journalist, Lawrence Dunhill.

Firstly, the FTT noted that the ICO “case begins with an elementary error of fact. It treats the Trust as having given an NCND response to the entirety of the Request when it did no such thing” (the Trust had only applied NCND in respect of the request for a PWC report, but had confirmed it held other reviews). Oddly, the Trust, in its submissions for the appeal, simply ignored this error (the FTT chose not to speculate on “whether that omission was accidental or tactical”).

Secondly, and notably, the FTT found a fundamental error of law in the ICO’s approach (and, by implication, in its guidance) to NCND in the context of personal data. Section 2(3)(fa) of FOIA provides that section 40(2) is an absolute exemption (therefore not subject to a public interest test). But section 2(3) does not include section 40(5B) (the personal data NCND provision) in the list of absolute exemptions. As far as I know, the ICO has always taken the view, however, that it is an absolute exemption – certainly its current guidance says this.

That approach, held the FTT, is “simply wrong…the exemption under FOIA, s40(5B)(a)(i) is qualified and the public interest balancing test applies”. And but for that error, they said, the ICO might have reached a different conclusion.

As it was, the FTT held that the legitimate interests balancing test under Article 6(1)(f) of the UK GDPR was sufficient to determine the issue: merely confirming or denying whether the PWC review was held would not cause unwarranted prejudice to a named individual when balanced against the requester’s legitimate interests.

It will be interesting to see if the ICO appeal this. Given the strength of the criticism it would perhaps be bold to do so, but the only alternative may be to rewrite their guidance on s40(5B), and rethink their long-held view on it.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, FOIA, Freedom of Information, Information Commissioner, Information Tribunal, judgments, NCND, UK GDPR

ICO fines: are you certain?

In his inaugural speech as Information Commissioner, in 2022, John Edwards said

my focus is on bringing certainty in what the law requires of you and your organisations, and in how the regulator acts

It’s a message he’s sought to convey on many occasions since. No surprise: it’s one of the Commissioner’s tasks under the Regulators’ Code to

improve confidence in compliance for those they regulate, by providing greater certainty

This isn’t the place or the time for a broad analysis of how well the ICO has measured up to those standards, but I want to look at one particular example of where there appears to be some uncertainty.

In March 2024, the ICO fined the Central YMCA £7500 for serious contraventions of the UK GDPR. In announcing the fine, the ICO said that it would have been £300,000 but that “this was subsequently reduced in line with the ICO’s public sector approach” (the policy decision whereby “fines for public sector bodies are reduced where appropriate”). When questioned why a charity benefited from the public sector approach, the ICO stated that

Central YMCA is a charity that does a lot of good work, they engaged with us in good faith after the incident happened, recognised their mistake immediately and have made amends to their processing activities…the fine is in line with the spirit of our public sector approach

So the charity sector might have reasonably drawn from this that, in the event that another charity doing a “lot of good work” seriously contravened the UK GDPR, but engaged in good faith with the ICO and made amends to its processing activities, it would also benefit from the public sector approach, with a similar reduction of around 97.5% in any fine.

However, on 28 July, the Scottish charity Birthlink was fined £18,000 by the ICO for serious contraventions of the UK GDPR, but the ICO did not apply the public sector approach. When I questioned why, the answer merely confirmed that it had not been applied, and that they had instead applied their Fining Guidance. Admittedly, Birthlink did not recognise the seriousness of its contraventions for around two years, but that was not mentioned in the ICO’s answer.

I was also referred to the consultation on continuing the public sector approach, which ran earlier this year. That consultation explained that the proposal was not to apply the public sector approach to charities in the future, because the ICO would have regard to the definition of “public authority” and “public body” at section 7 of the Data Protection Act 2018, which, for obvious reasons, doesn’t include charities.

However, the outcome of that consultation has not been announced yet, and the ICO site says

In the meantime, we will continue to apply the approach outlined by the Commissioner in his June 2022 open letter.

As that current approach is the one under which the ICO applied great leniency to the Central YMCA, the question therefore remains – why did Birthlink not also benefit from it?

And there’s a wider question: the definition of a public body/authority at section 7 of the Data Protection Act 2018 has been in effect since 2018. Why did the ICO think, in 2024, that section 7 was not relevant, and that a (wealthy) charity should qualify for the public sector approach, but then decide that another (much less wealthy) charity shouldn’t, when facing a fine only a few months later?

The answers are far from certain.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under consistency, Data Protection Act 2018, fines, Information Commissioner, monetary penalty notice, UK GDPR

Data Protection risks to life: Should more be done?

I’ve written up my thoughts for the Mishcon de Reya website, on the baffling decision by the ICO to take no action in response to the most catastrophic data breach in UK history, which exposed many thousands of people to immediate risk to their lives.

https://www.mishcon.com/news/data-protection-risks-to-life-should-more-be-done


Filed under Data Protection, Data Protection Act 2018, data sharing, Information Commissioner, Ministry of Defence, UK GDPR

What the DUAA 2025 will do

Section 1(2) of the Data Protection Act 2018 tells us that

Most processing of personal data is subject to the UK GDPR

Despite the attention given to the progress of the Data (Use and Access) Act 2025 (and I have certainly given it a lot), now that it has passed, its significance for data protection practitioners is essentially only in how it will amend the three core legislative instruments relevant to their practice area: the UK GDPR, the DPA 2018, and PECR.

The DUAA is, in data protection law terms, mostly an amending statute.

How that amending is done in practice is important to note.

When a piece of legislation is amended, Parliament doesn’t reenact it, so the “official” printed version remains. In pre-internet days this meant that practitioners had to read the original instrument, and the amending instrument, side by side, and note what changes applied. This was generally done with the assistance of legal publishers, who might print “consolidated” versions of the original instrument with, effectively, the amendments showing in mark-up.

In the internet age, things haven’t changed in substance, but it’s very much easier to read the consolidated versions. If, for example, you go to the legislation.gov.uk website and look at the DPA 2018, you can view it in the “Original (as enacted)” version and the “Latest available” version (in the latter, for instance, you can see that “GDPR” was amended to “UK GDPR”, with a footnote explaining that this was effected by the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019).

The DUAA has not been published yet (and remember that many of its provisions won’t come into immediate effect, but will require secondary legislation to “commence” them into effect), but once it is, and once the clever people who maintain the legislation website have done their thing, most practitioners won’t need to refer to the DUAA: they should, instead, refer to the newly amended, consolidated versions of the UK GDPR, the DPA 2018 and PECR.

And also remember, “Most processing of personal data is [still] subject to the UK GDPR”.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data (Use and Access) Act, Data (Use and Access) Bill, Data Protection, Legislation, UK GDPR

Defamation rules are applied to UK GDPR claim

An interesting recent judgment in the High Court considers the extent to which rules in defamation law might also apply to data protection claims.

In July 2024 His Honour Judge Lewis struck out a claim in defamation brought by Dale Vince against Associated Newspapers. The claim arose from a publication in the Daily Mail (and through the Mail+ app). The article reported that the Labour Party had returned a £100,000 donation made by another person, who was said to be “a high-flying City financier accused of sex harassment”, but also said that the claimant had donated £1.5m to the Labour Party, but then caused the Party embarrassment by joining an “eco-protest” in London, which had blocked traffic around Parliament Square. The article had the headline “Labour repays £100,000 to ‘sex harassment’ donor”, followed by eleven paragraphs of text, two photographs of the claimant and the caption “Road blockers: Dale Vince in London yesterday, and circled as he holds up traffic with Just Stop Oil”.

The strike-out succeeded on the basis that a claim in libel “may not be founded on a headline, or on headlines plus photographs and captions, in isolation from the related text, and it is impermissible to carve the readership into different groups, those who read only headlines (or headlines and captions) and those who read the whole article”, following the rule(s) in Charleston v News Group Newspapers Ltd [1995] 2 AC 65 (the wording quoted is from the defendant’s strike-out application). When the full article was read, as the claimant conceded, the ordinary reader would appreciate very quickly that he was not the person being accused of sexual harassment.

A subsequent claim by Mr Vince, in data protection under the UK GDPR, has now also been struck out (Vince v Associated Newspapers [2025] EWHC 1411 (KB)). This time, the strike-out succeeded on the basis that, although the UK GDPR claim was issued (albeit not served) prior to the handing down of judgment in the defamation claim, Mr Vince not only could, but should, have brought it earlier:

There was every reason why the UKGDPR and defamation claims should have been brought in the same proceedings. Both claims arose out of the same event – the publication of the article in Mail+ and the Daily Mail. Both claims rely on the same factual circumstances, namely the juxtaposition of the headline, photographs and caption, and the contention that the combination of the headline and the photograph created the misleading impression that Mr Vince had been accused of sexual harassment. In one claim this was said to be defamatory, in the other the misleading impression created was said to comprise unfair processing of personal data

This new claim was, said Mr Justice Swift, an abuse of process – a course which would serve only “to use the court’s process in a way that is unnecessary and is oppressive to Associated Newspapers”.

Additionally, the judge would have granted Associated Newspapers’ application for summary judgment, on the grounds that the rule in Charleston would have applied to the data protection claim as it had to the defamation claim:

in the context of this claim where the processing relied on takes the form of publication, the unfairness relied on is that a headline and photographs gave a misleading impression, and the primary harm caused is said to be reputational damage, the law would be incoherent if the fairness of the processing was assessed other than by considering the entirety of what was published

This last point, although, strictly, obiter, is an important one: where a claim of unfair processing, by way of publication of personal data, is brought in data protection, the courts are likely to demand that the entirety of what was published be considered, and not just personal data (or parts of personal data) in isolation.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, defamation, fairness, judgments, UK GDPR

Good Law Project v Reform

In the run-up to last year’s General Election, the campaigning group The Good Law Project (GLP) actively encouraged people to make subject access requests (under Article 15 of the UK GDPR) to political parties, and they say that they enabled 13,000 people to do so.

The GLP says that the Reform Party “replied to hardly anyone”, and as a result it is bringing the first ever case in the UK under Article 80(1) of the UK GDPR, whereby a data subject (or subjects) mandates a representative organisation to bring an Article 79 claim on their behalf.

Helpfully, the GLP has published both its own particulars of claim, and, now, Reform’s defence to the claim. The latter is particularly interesting, as its initial approach is to threaten to apply to strike out the claim on the grounds that the GLP does not meet the criteria for a representative body, as laid out in section 187 of the Data Protection Act 2018.

Given the nature of the two parties (one a bullish campaign group, the other a bullish political party) it seems quite likely that this will proceed to trial. If so, we should get some helpful clarification on how Article 80(1) should operate.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Article 80, Data Protection Act 2018, political parties, UK GDPR