Category Archives: Data Protection

“Consent” must be assessed objectively, says Court of Appeal

The Court of Appeal has handed down an important judgment (RTM v Bonne Terre Ltd & Anor [2026] EWCA Civ 488) on the meaning of “consent” in data protection and ePrivacy law, overturning a problematic prior judgment of the High Court which had left many businesses, especially those in the betting and gaming sector, facing an “ineradicable” risk of claims that they could not reasonably have defended. Those businesses will no doubt see the Court of Appeal’s judgment as a welcome reversal, providing greater legal certainty.

So what does “consent” mean, in the data protection statutory scheme?

Article 4(11) of the UK GDPR says it means

any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her

And Article 7 puts the onus on the data controller to prove that the standard has been met.

Section 2(1) of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”, which deal with the sending of direct electronic marketing to individuals and the use of cookies and similar technologies), adopts the Article 4(11) UK GDPR definition (and applies it to “subscribers” and “users” as opposed to data subjects).

So far, so straightforward. But what happens in the case of someone who purports, by a clear affirmative action such as ticking a box, to give a specific, informed and unambiguous indication of his wishes, signifying agreement to the processing of personal data relating to him and to the receipt of direct electronic marketing, but who later argues that the consent was vitiated by factors of which the data controller/sender of the marketing was unaware, and could not reasonably have been aware? Put another way: is direct electronic marketing sent on the basis of the objectively valid consent of someone who was subjectively incapable of giving valid consent to be treated as lawfully sent?

“No”, said the High Court at first instance. RTM was someone who, on his own submission, had gambled in circumstances, and to a degree, that he described as “compulsive, out of control and destructive”, and who claimed damages, in data protection and in misuse of private information, on the basis that, as he argued, Bonne Terre (operating as Sky Betting and Gaming, or “SBG”)

gathered and used extensive information, generated by his use of its platforms, unlawfully…especially by way of personalised and targeted marketing which he could not handle and which fed his compulsive behaviour

Mrs Justice Collins Rice DBE had held in her judgment (even though RTM had not pleaded in these terms) that, although RTM had not lacked capacity to consent, and although “he wanted the direct marketing material – even perhaps craved it”, he was one of a small subset (an “irreducible minimum”) of “individuals for whom decision-making…was already out of control in relation to gambling, and for whom the consenting mechanisms and information provision meant nothing other than barriers to gambling to be overcome”. SBG had adopted controls in line with gambling regulatory requirements and expectations to avoid the risk of marketing to “problem gamblers” (the judge’s words), but although those controls “can and do help manage and minimise the particular risks of direct marketing to online gamblers…they cannot and do not eliminate them”. RTM, the judge held, “lacked subjective consent”; “the autonomous quality of his consenting behaviour was impaired to a real degree”; “the quality of [his] consenting was rather lower than the standard required”; and his consent was “insufficiently freely given”.

The first instance judgment had presented all businesses, but especially those in the betting and gaming sector, with a problem and a risk: i) how could they establish in each case the subjective aspect of a data subject’s consent? and ii) if they could not establish that subjective aspect, how could they deal with the risk that marketing which would on the face of it be lawfully sent, would be held not to be, if the recipient was one of the irreducible minimum whose consent was not, subjectively, valid? Perhaps unsatisfactorily, the judge had said that this was

a risk which is ultimately ineradicable. Problem gamblers may not always be easy to recognise, and there will always be relevant information about them which is ultimately unreachable by the provider, and properly so because it is information which is itself in the private domain

The Court of Appeal has now roundly overturned the decision. Giving the main judgment, Lord Justice Warby revisited what a data controller must be able to demonstrate where consent is said to be present: the controller must “show that the data subject made a statement or took some other clear affirmative action…that ‘signifies agreement’”, and must also prove that “the data subject’s ‘indication’ met each of the four criteria prescribed by the legislation, namely that it was (i) freely given, (ii) specific, (iii) informed, and (iv) unambiguous”. All of these, he holds, are objective tests: “the data controller does not have to prove what was actually in the mind of the individual data subject at the time of the ‘indication’”.

In a classic example of judicial understatement, Warby LJ noted that the effect of the decision of the judge below was to establish a “principle that decisions deliberately made by a capacitous individual may nonetheless be vitiated for lack of consent” and further noted that it was a “legally novel” principle, whose “contours are not clear to me”.

Recitals 4 and 7 of the UK GDPR are relevant here. The first reminds us that

The right to the protection of personal data is not an absolute right; it must be considered in relation to its function in society and be balanced against other fundamental rights

and the second reminds us that

Legal and practical certainty for natural persons, economic operators and public authorities should be enhanced

As Warby LJ notes, an “inevitable corollary” of the original ruling would be that a business “could not guarantee its ability to ‘demonstrate’ conformity with the consent requirements of data protection law and PECR”, and

the unsatisfactory and ultimately opaque nature of the test for legally effective consent which the judge applied…would create considerable legal and practical uncertainty for economic operators

Absent a further appeal by RTM, which would need to be to the Supreme Court, and which would seem unlikely, the Court of Appeal has now gone a long way towards restoring legal and practical certainty as to the meaning of “consent” in data protection law, and how data controllers should approach the task of gathering and proving consent.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under consent, controller, cookies, Data Protection, judgments, marketing, PECR, UK GDPR

Extension to right to erasure

The Victims and Prisoners Act 2024 (Commencement No. 10) and Data (Use and Access) Act 2025 (Commencement No. 8) Regulations 2026 were made on 31 March 2026, bringing into effect an important amendment to the UK GDPR right to erasure.

In May 2024 Parliament enacted section 31 of the Victims and Prisoners Act 2024. Section 31 inserts a new Article 17(1)(g) into the UK GDPR, which extends the right to erasure to cases of certain unfounded malicious allegations, where:

the personal data have been processed as a result of an allegation about the data subject—
(i) which was made by a person who is a malicious person in relation to the data subject (whether they became such a person before or after the allegation was made),
(ii) which has been investigated by the controller, and
(iii) in relation to which the controller has decided that no further action is to be taken

New Article 17(4) defines a “malicious person” as one who has been convicted of a specified offence or who is subject to a stalking protection order.

At the same time, para 32 of Schedule 11 to the Data (Use and Access) Act 2025, which extends the same provisions to Scotland and Northern Ireland, is also commenced.

The provisions were introduced to the 2024 Act by way of an amendment by Stella Creasy MP, informed in part by her own experiences of an entirely false and malicious allegation, and difficulties with expunging records of it (see here).


Filed under accuracy, Data (Use and Access) Act, Data Protection, erasure, Legislation, UK GDPR

Erasure request in Children Act proceedings

[reposted from my LinkedIn account]

This is a rather extraordinary judgment in Children Act proceedings in the Family Court, in which a person who is an unregistered barrister, and who holds herself out as a lawyer, started out as a lay advocate to the mother, then put herself forward to care for all of the children, seeking three times to be joined as a party.

In considering her skeleton argument in support of her final application to be joined, the court established – in a now wearily familiar way – that a number of the case citations and propositions it contained had been included as a result of her using a “widely known publicly available AI tool to assist her in preparing [the argument]”.

She then informed the court that she was unable to continue to offer a home for the children or be part of the proceedings, that she no longer wished to proceed with the assessment to be either a special guardian or foster carer for the children, and sought for her assessment to be “formally withdrawn”. At the same time she asked for her data and that of her family members in the court bundle to be destroyed.

Unsurprisingly, the judge declined to do so (X v The Transcription Agency LLP and another [2024] 1 WLR 33 applied).

But more than that, in a judgment in which all of the parties’ identities (including the applicant public authority) are anonymised, she has been named, with the judge saying that she is “…a person who holds herself out as a lawyer. She offers, or has offered, paid legal work to members of the public. This is an important consideration. I am satisfied having read her written submissions lodged since the hearing that [she] still does not really acknowledge or accept that her actions in not checking the citations and propositions she included in her skeleton argument were serious”.

I think it’s important to note that the judge expresses some (although by no means complete) sympathy for aspects of the person’s position and personal circumstances, and the judgment states that she has self-referred to the Bar Standards Board. For that reason, I’m not naming her in this post itself.

A, B, C, D, Re (Extension of assessment; Use of AI: hallucinations) [2026] EWFC 71 (B)

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, erasure, judgments

Freemasonry, JR and data protection

The High Court has refused to give permission to apply for judicial review upon an application by representative bodies of Freemasons, and by two Freemasons who are serving officers in the Met Police. The respondent was the Commissioner of the same force, and the impugned decisions relate to a new policy under which police officers and staff of the Met who are or have been members of “an organisation that has confidential membership, hierarchical structures and requires members to support and protect each other” must declare that fact, confidentially, to their local professional standards unit.

Among the proposed challenges were claims that the policy was an unlawful interference with officers’ and staff’s qualified Convention rights under Articles 8 (right to respect for private and family life), 10 (freedom of expression) and 11 (freedom of assembly). Assuming, at this permission stage, that the policy involved an interference with these rights, said the judge, the question was whether there was a real prospect that the Court would find any interference with ECHR rights not to be justified, and the “key question” was whether any interference with the rights was proportionate. He could answer that question “confidently…even at this early stage”: the interference was modest, and the factors on the other side of the scales were compelling.

There was also a challenge on data protection grounds, to the effect, in part, that there was no lawful basis identified for the processing, and nor were purposes or limitations identified. Furthermore, special category data was involved. In answer to this, the judge pointed to the Met’s “appropriate policy document” (see paras 5 and 39 of Sch 1 Data Protection Act 2018) which provided “sufficient clarity”.

The judge also – and here I think he fell into minor error – said that any individual claimants had an alternative remedy, by way of complaint to the Information Commissioner’s Office and “if still dissatisfied, an appeal to the First-tier Tribunal”: but as the authorities make clear, there is no right of appeal to the Tribunal under such circumstances (section 166 Data Protection Act 2018 only allows a data subject to apply for a steps Order, where the ICO has failed to take appropriate steps to investigate a complaint – it does not provide a right of appeal). I doubt very much, though, that this apparent slight error has any real substance in the round.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, judgments, judicial review

Data protection complaints – a missed opportunity

Has the Information Commissioner’s Office ducked an opportunity to improve data subjects’ rights and provide regulatory clarity to data controllers?

Section 103 of the Data (Use and Access) Act 2025, which will come into effect on 19 June this year, inserts a new section 164A into the Data Protection Act 2018. It confers a right on data subjects to make a complaint to a data controller, and imposes a duty on controllers to facilitate this, and take appropriate steps to respond to any such complaint.

Perhaps surprisingly, Parliament chose to say that controllers must acknowledge receipt of complaints within 30 days (!), but chose not to specify a time frame for actually responding to them. Instead, controllers must simply “inform the complainant of the outcome…without undue delay”.

Last year the ICO ran a consultation on draft guidance for handling data subject complaints. In their now-published summary of responses to the consultation, the ICO explained that some people who responded questioned whether the ICO should lay down some guidance for how long a controller should take to respond to a complaint. In declining to do so, the ICO says

We recognise that organisations would like us to set out a specific time period within which we expect they should investigate the complaint. The legislation says “without undue delay”, which is context dependent. We’ve therefore provided advice around how to complete the investigation “without undue delay”.

This will vary from one complaint to another, and from one organisation to another. A timeframe that is justifiable for one complaint may be unjustifiable for another.

All this is true, but I don’t really buy it. Legislation will quite often provide a broad framework for a procedure, with regulators or other overseers then producing good practice guidance.

It strikes me that it would have been straightforward for the ICO to say “Complaints must be responded to without undue delay. In most cases we would expect controllers to do so within [say] 40 days. Where this timeframe is exceeded we will expect controllers to explain why this did not constitute an undue delay”.

As it is, I can readily foresee some controllers taking many months to respond. As the ICO generally won’t accept complaints themselves until the data subject has received a response from the controller, this has the potential to build in even greater delay for data subjects.

(And all that is before we get to the issue of delays at the ICO’s end, and their new approach to complaints where, in effect, they will peremptorily dismiss some.)

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data (Use and Access) Act, Data Protection, Data Protection Act 2018, Information Commissioner

Have cookies fines just become a lot more likely?

Short answer: probably not, under the current ICO regime. But the fact that PECR are now enforced under the Data Protection Act 2018, rather than the 1998 Act, makes it in principle much easier for fines for cookie contraventions to happen.

By me, on the Mishcon de Reya website:

https://www.mishcon.com/news/unlawful-cookies-a-new-avenue-for-the-ico-to-issue-fines


Filed under adtech, cookies, Data Protection, Data Protection Act 2018, fines, Information Commissioner, monetary penalty notice, Personal

DUAA commencement – what’s hot and what’s not

I’ve written for the Mishcon de Reya website on the commencement on 5 February of the majority of the data protection and eprivacy provisions of the Data (Use and Access) Act 2025: 

https://www.mishcon.com/news/data-protection-and-electronic-privacy-reform-whats-hot-and-whats-not


Filed under charities, Data (Use and Access) Act, Data Protection, Data Protection Act 2018, marketing, PECR, UK GDPR

CoA: County Court is appropriate forum for routine data protection claim

This is a helpful short Court of Appeal judgment on the appropriate forum for a data protection claim of relatively low value and limited complexity (spoiler: it’s the County Court, folks).

The claimant had originally, and incorrectly, issued his claim as a High Court media and communications claim in the Cardiff District Registry (if data protection claims are to be issued in the High Court, they must be issued in the King’s Bench Division at the Royal Courts of Justice). The judge in the High Court in Cardiff transferred the claim to the County Court, but his order arguably contained insufficient reasons, and did not explain that either party could apply to have it set aside or varied (as required by CPR 3.3(5)(b)). The claimant tried to make representations, by way of an email, as to why the High Court was the appropriate forum, but this was rejected on the basis that it had been filed in the wrong court. By that stage, the transfer to the County Court had taken effect. Accordingly, the matters arising could only be determined by way of appeal.

In its determination, the CoA found that the case (involving disclosure, in separate proceedings, of medical information by a court security guard to an usher and a solicitor for a third party) did not appear to involve any factual or legal complexity, and the claimed sum of £30,000 was clearly within the ambit of the County Court.

(I interject here to observe that, on the brief facts as recorded in the judgment, there might have been some legal complexity – it seems likely that the disclosure would have been made orally by the security guard, so was there “processing” involved?)

Wysokinski v OCS Security Ltd [2026] EWCA Civ 26 

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Data Protection Act 2018, judgments, UK GDPR

A DSAR disclosure horror story

If anyone who deals with data subject access requests, or disclosure exercises in general, wants to read a horror story, they should look at the recent judgment in Forsters LLP v Uddin [2025] EWHC 3255 (KB).

This was an application for an interim injunction for breach of confidence, seeking delivery up by the defendant of confidential and privileged documents. Forsters, a law firm, act for Mr and Mrs Alloatti, who are in a dispute with their neighbour, Mr Uddin. No doubt in an attempt to advance his case, Mr Uddin made a DSAR directly to Forsters. But instead of disclosing Mr Uddin’s personal data to him, Forsters disclosed the entire contents of the file containing information responsive to a systems search for the name “Uddin”. This resulted not only in the disclosure of personal data of people unconnected to the dispute, but also in disclosure of around 95% (3,000+ pages) of the Alloatti client file, much of it confidential and privileged.

Unsurprisingly, Forsters were successful in their application. This was a very clear case of “obvious mistake” (see Fayed v Commissioner of Police of the Metropolis [2002] EWCA Civ 780). And

where a party to litigation discloses documents to the opposing party which are confidential and privileged and the court is satisfied that it is a case of ‘obvious mistake’, which was either known to or ought to have been known to the receiving party, the Court will intervene by injunction to, so far as possible, put the parties back into the position they would have been had the error not occurred. This will usually involve granting an injunction that requires the recipient to deliver up the documents, to destroy any copies he has made of them and which restrains him from making any use of the information contained in the documents.

Further proof that this was a mistake lay in the fact that Mr Uddin, on receiving the disclosure, immediately notified Forsters of the breaches of confidence and GDPR. Although he later sought to row back on this in order to retain and use the information in his dispute with the Alloattis, his argument that the disclosure was lawful as a DSAR response was doomed.

One argument that found greater favour with the judge was that the “erroneous disclosure to him has undermined the confidentiality and privilege in the information he has seen”. But although the judge accepted that Mr Uddin could not “un-know” some of what he had seen he held that

Nonetheless, the court can help the Claimant to regain control over the 3,300 documents themselves and over the way in which information from those documents is deployed in the two claims. In this way, the court can remedy most of the mischief which this inadvertent disclosure has caused

Accordingly, in addition to delivery up and deletion, he was injuncted from using any of the documents, or information from them, in the underlying claim or in a separate claim in harassment against two Forsters employees.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Breach of confidence, Data Protection, judgments, subject access

NCND for personal data – a qualified exemption?

[reposted from my LinkedIn Account]

I’ve been known to criticise First-tier Tribunal (FTT) judgments in the freedom of information jurisdiction. By contrast, this one is superb.

In it, the FTT dismantle the argument (and the decision notice) of the Information Commissioner’s Office that Bolton NHS Foundation Trust were entitled to “neither confirm nor deny” (NCND) holding reviews, including a review by PWC, into the Trust’s governance and management. The PWC review was the subject of an article in the Health Service Journal, and the requester was the journalist, Lawrence Dunhill.

Firstly, the FTT noted that the ICO’s “case begins with an elementary error of fact. It treats the Trust as having given an NCND response to the entirety of the Request when it did no such thing” (the Trust had only applied NCND in respect of the request for a PWC report, but had confirmed it held other reviews). Oddly, the Trust, in its submissions for the appeal, simply ignored this error (the FTT chose not to speculate on “whether that omission was accidental or tactical”).

Secondly, and notably, the FTT found a fundamental error of law in the ICO’s approach (and, by implication, in its guidance) to NCND in the context of personal data. Section 2(3)(fa) of FOIA provides that section 40(2) is an absolute exemption (and therefore not subject to a public interest test). But section 2(3) does not include section 40(5B) (the personal data NCND provision) in the list of absolute exemptions. As far as I know, however, the ICO has always taken the view that it is an absolute exemption – certainly its current guidance says this.

That approach, held the FTT, is “simply wrong…the exemption under FOIA, s40(5B)(a)(i) is qualified and the public interest balancing test applies”. And but for that error, they said, the ICO might have reached a different conclusion.

As it was, the FTT held that the legitimate interests balancing test under Article 6(1)(f) of the UK GDPR was sufficient to determine the issue: merely confirming or denying whether the PWC review was held would not cause unwarranted prejudice to a named individual when balanced against the requester’s legitimate interests.

It will be interesting to see if the ICO appeal this. Given the strength of the criticism it would perhaps be bold to do so, but it might be that the only alternative will be to have to rewrite their guidance on s40(5), and rethink their long-held view on it.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, FOIA, Freedom of Information, Information Commissioner, Information Tribunal, judgments, NCND, UK GDPR