Category Archives: GDPR

Oral disclosure of personal data: a new domestic case

“Pretexting” and “blagging” are forms of social engineering whereby someone attempts to extract information from a source by deception. One (unethical) example is when a journalist purports to be someone else in order to gather information for a story.

A recent misuse of private information and data protection judgment in the High Court deals with a different, and sadly not uncommon, example – where an estranged, abusive partner convinced a third party to disclose information about the partner, enabling him to continue his harassment of her.

The claimant had worked at a JD Wetherspoon pub, but had left a few months previously. She had given her contact details, including her mother’s mobile phone number, to her manager, and the details were kept in a paper file, marked “Strictly Private and Confidential”, in a locked filing cabinet. During her employment she had been the victim of serious offences of violence and harassment by a former partner, which involved subjecting her to many unwanted phone calls. He was ultimately convicted of these offences and sentenced to 2 ½ years in prison. Her employer was aware of the claimant’s concerns about him.

While her abuser was on remand, he rang the pub, pretending to be a police officer who needed to contact the claimant urgently. Although the pub chain had guidance on pretexting, under which such attempts to acquire information should be declined initially and referred to head office, the pub gave out the claimant’s mother’s number to the abuser, who then managed to speak to (and verbally abuse) the claimant, causing understandable distress.

She brought claims in the county court in misuse of private information, breach of confidence and for breach of data protection law. She succeeded at first instance with the first two, but not with the data protection claim. Wetherspoons appealed and she cross-challenged, not by appeal but by way of a respondent’s notice, the rejection of the data protection claim.

In a well-reasoned judgment in Raine v JD Wetherspoon PLC [2025] EWHC 1593 (KB), Mr Justice Bright dismissed the defendant’s appeal. He rejected its argument that the Claimant’s mother’s mobile phone number did not constitute the Claimant’s information, or alternatively that it was not information in which she had a reasonable expectation of privacy. What mattered was neither ownership of the mobile phone nor ownership of the account relating to it, but the information itself: the knowledge of the relevant digits. As between the claimant and the defendant, that was the claimant’s information, which was undoubtedly private when given to the defendant and was intended to remain private, rather than being published to others.

The defendant then argued that there can be no cause of action for misuse of private information if the Claimant is unable to establish a claim under the DPA/GDPR, and, relatedly, that a data security duty could not arise under the scope of the tortious cause of action of misuse of private information. In all honesty I struggle to understand this argument, at least as articulated in the judgment, probably because, as the judge suggests, this was not a data security case involving failure to take measures to secure the information. Rather, it involved a positive act of misuse: the positive disclosure of the information by the defendant to the abuser.

The broadly similar appeal grounds in relation to breach of confidence failed, for broadly similar reasons.

The counter challenge to the prior dismissal of the data protection claim, by contrast, succeeded. At first instance, the recorder had accepted the defendant’s argument that this was a case of purely oral disclosure of information, and that, applying Scott v LGBT Foundation Limited, this was not “processing” of “personal data”. However, as the judge found, in Scott,

the information had only ever been provided to the defendant orally; and…then retained not in electronic or manual form in a filing system, but only in the memory of the individual who had received the original oral disclosure…In that case, there was no record, and no processing. Here, there was a record of the relevant information, and it was processed: the personnel file was accessed by [the defendant’s employee], the relevant information was extracted by her and provided in written form to [another employee], for him to communicate to [the abuser].

This fell “squarely within the definition of ‘processing’ in the GDPR at article 4(2)”. Furthermore, there was judicial authority in Holyoake v Candy that, in some circumstances, oral disclosure will constitute processing (a view supported by the European Court in Endemol Shine Finland Oy).

An award of £4,500 in damages for personal injury, in the form of exacerbation of existing psychological damage, was upheld.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Breach of confidence, Data Protection, data sharing, GDPR, judgments, misuse of private information, Oral disclosure

Recital 63 of the GDPR is nonsensical

[reposted from my LinkedIn account]

I’m sure I’ve mentioned this before (but that sort of thing never stops me banging on about stuff) but whenever I read recital 63 of the GDPR it irritates me, because a comma is in the wrong place. The result is that the clause in question is slightly nonsensical. It reads:

A data subject should have the right of access to personal data which have been collected concerning him or her, and to exercise that right easily and at reasonable intervals, in order to be aware of, and verify, the lawfulness of the processing.

The literal reading of that clause is that the right of access exists in order that a data subject can be “aware of the lawfulness” of processing and “verify the lawfulness” of processing. The latter is fine on its own but what does the former mean? And if one becomes “aware of the lawfulness” of the processing then why should one then “verify” it?

Surely the need is to be aware of the processing, and then verify its lawfulness?

Clearly, the comma should be moved, so it says

…in order to be aware of, and verify the lawfulness of, the processing.

And when I’m Prime Minister a UK GDPR (Recital 63 Correction) Amendment Bill is the first thing I will table.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

2 Comments

Filed under Data Protection, GDPR, nonsense, subject access, UK GDPR

Personal use of work devices – an Irish judgment

A frequent headache for data protection practitioners and lawyers is how to separate (conceptually and actually) professional and personal information on work devices and accounts. It is a rare employer (and an even rarer employee) who doesn’t encounter a mix of the two categories.

But, if I use, say, my work phone to send a couple of text messages (as I did on Saturday after the stupid SIM in my personal phone decided to stop working), who is the controller of the personal data involved in that activity? I’d be minded to say that I am (and that my employer becomes, at most, a processor).

That is also the view taken by the High Court in Ireland, in an interesting recent judgment.

The applicant was an employee of the Health Service Executive (HSE), and did not, in this case, have authority or permission to use his work phone for personal use. He nonetheless did so, and then claimed that a major data breach in 2021 at the HSE led to his personal email account and a cryptocurrency account being hacked, with a resultant loss of €1,400. He complained to the Irish Data Protection Commissioner, who said that, as his personal use was not authorised, the HSE was not the controller in respect of the personal data at issue.

The applicant sought judicial review of the DPC decision. This of course meant the application would only succeed if it met the high bar of showing that the DPC had acted unlawfully or irrationally. That bar was not met, with the judge holding that:

The DPC did not purport to adopt an unorthodox interpretation of the definition of data controller. Instead, against the backdrop of the factual matrix before it, it found that the HSE had not “determined the purposes and means of the processing” of the data relating to the Gmail, Yahoo, Fitbit and Binance accounts accessed by the applicant on his work phone. That finding appears to me to be self-evident, where that use of the phone clearly was not authorised by the HSE.

I think that has to be correct. But I’m not sure I quite accept the full premise, because I think that even if the HSE had authorised personal use, the legal position would be the same (although possibly not quite as unequivocally so).

I’m genuinely interested in others’ thoughts, though.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under controller, Data Protection, employment, GDPR, Ireland, judgments, Uncategorized

A new data protection duty?

I’ve been looking in more detail at the recent subject access judgment in Ashley v HMRC. One key point of general application stands out for me, and that is that it states that in some cases (i.e. where it is necessary for intelligibility purposes) a controller has a duty to provide contextual information in addition to copies of personal data.

As the judge put it

Article 15(1) and 15(3), read with Article 12(1) and (2) of the UK GDPR, did require the Defendant to go beyond providing a copy of the Claimant’s personal data where contextual information was necessary for that personal data to be intelligible in the sense of enabling the data subject to exercise their rights conferred by the UK GDPR effectively. It follows that insofar as the Defendant did not adopt this approach, it was in breach of this duty.

And although she couched the following as “guidance” for the HMRC when reconsidering the request, I feel it has general application:

…it is unlikely that providing an extract that simply comprises the Claimant’s name or his initials or other entirely decontextualised personal data of that sort, will amount to compliance with this obligation.

In arriving at this conclusion the judge drew in part on both pre- and post-Brexit case law of the Court of Justice of the European Union. Most notably she decided to have regard to case C-487/21. Even though this does not bind the domestic courts, the effect of section 6(2) of the European Union (Withdrawal) Act 2018 is that courts may have regard to EU case law where it is relevant to the matter before them.

Of course, there are also times when merely providing a snippet in the form of a name constitutes a failure to provide all of the personal data in scope (omitting the final five words of “Jon Baines works at Mishcon de Reya” would be to omit some of my personal data). But the “context duty” seems to me to go further, and creates, where it is necessary, an obligation to provide information beyond what is in the source documents.

Most of the other points in the judgment, as important as they were to the facts, and as interesting as they are, particularly on the concept of “relating to” in the definition of “personal data”, will not necessarily change things for most data subjects and controllers.

But this “context duty” feels to me to be an advancement of the law. And I suspect controllers can now expect to see data subjects and their lawyers, when making subject access requests (or when challenging responses), begin to argue that the “context duty” applies.

The views in this post (and indeed most posts on blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, GDPR, judgments, subject access, UK GDPR

Consent is not the only basis

In 2017 I attended a free event run by a “GDPR consultancy”. The presenter confidently told us that we were going to have to get consent from customers in order to process their personal data. One attendee said they worked at the DWP, so how were they going to get consent from benefits claimants who didn’t want to disclose their income, to which the presenter rather awkwardly said “I think that’s one you’ll have to discuss with your lawyers”. Another attendee, who was now most irritated that he’d taken time out from work for this, could hold his thoughts in no longer, and rudely announced that this was complete nonsense.

That attendee was the – much ruder in those days – 2017 version of me.

I never imagined (although I probably should have done) that eight years on the same nonsense would still be spouted.

Just as the Data Protection Act 2018 did not implement the GDPR in the UK (despite the embarrassing government page which, until recently – and despite people raising it countless times – said so), and just as the GDPR does not limit its protections to “EU citizens”, so the GDPR and the UK GDPR do not require consent for all processing.

Anyone who says so has not applied a smidgeon of thought or research to the question, and is probably taking content from generative AI, which, on the time-honoured principle of garbage-in, garbage-out, has been in part trained on the existing nonsense. To realise why it’s garbage, they should just start with the DWP example above and work outwards from there.

Consent is one of the six lawful bases, any one or more of which can justify processing. No one basis is better than, or takes precedence over, the others.

To those who know this, I apologise for having to write it down, but I want to have a sign to tap for any time I see someone amplifying the garbage on LinkedIn.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, Data Protection Act 2018, DWP, GDPR, Let's Blame Data Protection, UK GDPR

Cookies, compliance and individuated consent

[reposted from my LinkedIn account]

Much will be written about the recent High Court judgment on cookies, direct marketing and consent, in RTM v Bonne Terre & Anor, but treat it all (including, of course, this) with caution.

This was a damages claim by a person with a gambling disorder. The claim was, in terms, that the defendant’s tracking of his online activities, and associated serving of direct marketing, were unlawful, because they lacked his operative consent, and they led to damage because they caused him to gamble well beyond his means. The judgment was only on liability, and at the time of writing this post there has been no ruling on remedy, or quantum of damages.

The domestic courts are not regulators – they decide individual cases, and where a damages claim is made by an individual any judicial analysis is likely to be highly fact specific. That is certainly the case here, and paragraphs 179-181 are key:

such points of criticism as can be made of [the defendant’s] privacy policies and consenting mechanisms…are not made wholesale or in a vacuum. Nor are they concerned with any broader question about best practice at the time, nor with the wisdom of relying on this evidential base in general for the presence of the consents in turn relied on for the lawfulness of the processing undertaken. Such general matters are the proper domain of the regulators.

In this case, the defendant could not defeat a challenge that in the case of this claimant its policies and consenting mechanisms were insufficient:

If challenged by an individual data subject, a data controller has to be able to demonstrate the consenting it relies on in a particular case. And if that challenge is put in front of a court, a court must decide on the balance of probabilities, and within the full factual matrix placed before it, whether the data controller had a lawful consent basis for processing the data in question or not.

Does this mean that a controller has to get some sort of separate, individuated consent for every data subject? Of course not: but that does not mean that a controller whose policies and consenting mechanisms are adequate in the vast majority of cases is fully insulated from a specific challenge from someone who could not give operative consent:

In the overwhelming majority of cases – perhaps nearly always – a data controller providing careful consenting mechanisms and good quality, accessible, privacy information will not face a consent challenge. Such data controllers will have equipped almost all of their data subjects to make autonomous decisions about the consents they give and to take such control as they wish of their personal data…But all of that is consistent with an ineradicable minimum of cases where the best processes and the most robust evidential provisions do not, in fact, establish the necessary presence of autonomous decision-making, because there is specific evidence to the contrary.

This is, one feels, correct as a matter of law, but it is hardly a happy situation for those tasked with assessing legal risk.

And the judgment should (but of course won’t) silence those who promise, or announce, “full compliance” with data protection and electronic marketing law.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under adtech, consent, cookies, Data Protection, GDPR, judgments, marketing, PECR, Uncategorized

The Data Protection Act 2018 does not “implement” the GDPR

They are separate instruments and the GDPR, pre-Brexit, did not require implementation – as a Regulation of the European Parliament and of the Council of the European Union, it had direct effect.

Since Brexit, by the effect of, among other laws, the European Union (Withdrawal) Act 2018 and the Data Protection, Privacy and Electronic Communications (Amendments Etc.) (EU Exit) Regulations 2019, we now have a retained-and-assimilated domestic version of the GDPR, called the UK GDPR.

Most processing of personal data is subject to the UK GDPR. The Data Protection Act 2018 deals with processing that is not subject to it, such as by law enforcement and security service agencies. It also provides some of the conditions and exemptions in relation to processing under the UK GDPR.

[None of this is new, and none of it will be unknown to genuine practitioners in the field, but I’m posting it here as a convenient sign to tap, at appropriate moments.]

Leave a comment

Filed under Data Protection, Data Protection Act 2018, GDPR, UK GDPR

ICO, Clearview AI and Tribunal delays

[reposted from LinkedIn]

On 28 October the Information Commissioner’s Office (ICO) made the following statement in respect of the November 2023 judgment of the First-tier Tribunal upholding Clearview AI’s appeal against the ICO’s £7.5m fine, and posted it in an update to its original announcement about appealing:

The Commissioner has renewed his application for permission to appeal the First-tier Tribunal’s judgment to the Upper Tribunal, having now received notification that the FTT refused permission of the application filed in November 2023.

It is extraordinary that it has taken 11 months to get to this point.

So what does this mean?

If a party (here, the ICO) wishes to appeal a judgment of the First-tier Tribunal (FTT) to the Upper Tribunal (UT), it must first make an application to the FTT itself, which must decide “as soon as practicable” whether to grant permission to appeal its own judgment (rules 42 and 43 of the Tribunal Procedure (First-tier Tribunal) (General Regulatory Chamber) Rules 2009).

If the FTT refuses permission to appeal (as has happened here), the application may be “renewed” (i.e. made again) directly to the UT itself (rule 21(2) of the Tribunal Procedure (Upper Tribunal) Rules 2008).

So, here, after 11 months (“as soon as practicable”?) the ICO has just had its initial application refused, and is now going to make an application under rule 21(2) of the UT Rules.

The ICO’s wording in its statement is slightly odd though: it talks of “having now received notification” that the FTT “refused” (not, say, “has now refused”) the November 2023 application. The tense used half implies that the refusal happened at the time and they’ve only just been told. If so, something must have gone badly wrong at the Tribunal.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

2 Comments

Filed under Data Protection, GDPR, Information Commissioner, Information Tribunal, judgments, Upper Tribunal

Crowdstrike and personal data breaches: loss vs unavailability

I ran a poll on LinkedIn in recent days which asked “If a controller temporarily can’t access personal data on its systems because of the Crowdstrike/MSFT incident is it a personal data breach?” 

I worded the question carefully.

50% of the 100-odd people who voted said “no” and 50% said “yes”. The latter group are wrong. I say this with some trepidation because there are people in that group whose opinion I greatly respect. 

But here’s why they, and, indeed, the Information Commissioner’s Office and the European Data Protection Board, are wrong.

Article 4(12) of the GDPR/UK GDPR defines a “personal data breach”. This means that it is a thing in itself. And that is why I try always to use the full term, or abbreviate it, as I will here, to “PDB”. 

This is about the law, and in law, words are important. To refer to a PDB as the single word “breach” is a potential cause of confusion, and both the ICO and the EDPB guidance are infected by and diminished by sloppy conflation of the terms “personal data breach” and “breach”. In English, at least, and in English law, the word “breach” will often be used to refer to a contravention of a legal obligation: a “breach of the law”. (And in information security terminology, a “breach” is generally used to refer to any sort of security breach.) But a “breach” is not coterminous with a “personal data breach”.

And a PDB is not a breach of the law: it is a neutral thing. It is also crucial to note that nowhere do the GDPR/UK GDPR say that there is an obligation on a person (whether controller or processor) not to experience a PDB, and nowhere do GDPR/UK GDPR create liability for failing to prevent one occurring. This does not mean that where a PDB has occurred because of an infringement of other provisions which do create obligations and do confer liability (primarily Article 5(1)(f) and Article 32) there is no potential liability. But not every PDB arises from an infringement of those provisions.

The Article 4(12) definition is “a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed”. Let us break that down:

  • A breach of security…
  • leading to [one or more of]
  • accidental or unlawful…
  • 1. destruction of…
  • 2. loss of…
  • 3. alteration of…
  • 4. unauthorised disclosure of…
  • 5. unauthorised access to…
  • personal data processed.

If an incident is not a breach of security, then it’s not a PDB. And if it is a breach of security but doesn’t involve personal data, it’s not a PDB. But even if it is a breach of security, and involves personal data, it’s only a PDB if one of the eventualities I’ve numbered 1 to 5 occurs.

Note that nowhere in 1 to 5 is there “unavailability of…” or “loss of access to…”. 

Now, both the ICO, and the EDPB, read into the words “loss of…personal data…” the meaning, or potential meaning “loss of availability of personal data”. But in both cases they appear to do so in the context of saying, in terms, “loss of availability is Article 4(12) ‘loss’ because it can cause harm to data subjects”. I don’t dispute, and nor will many millions of people affected by the Crowdstrike incident, that unavailability of personal data can cause harm. But to me, “loss” means loss: I had something, and I no longer have it. I believe that that is how a judge in the England and Wales courts would read the plain words of Article 4(12), and decide that if the legislator had intended “loss” to mean something more than the plain meaning of “loss” – so that it included a meaning of “temporary lack of access to” – then the legislator would have said so. 

Quite frankly, I believe the ICO and EDPB guidance are reading into the plain wording of the law a meaning which they would like to see, and they are straining that plain wording beyond what is permissible.

The reason, of course, that this has some importance is that Article 33 of the GDPR/UK GDPR provides that “in the case of” (note the neutral, “passive” language) a PDB, a controller must in general make a notification to the supervisory authority (which, in the UK, is the ICO), and Article 34 provides that where a PDB is likely to result in a high risk to the rights and freedoms of natural persons, those persons should be notified. If a PDB has not occurred, no obligation to make such notifications arises. That does not mean, of course, that notifications cannot be made, through an exercise of discretion (let’s forget for the time being – because they silently resiled from the point – that the ICO once bizarrely and cruelly suggested that unnecessary Article 33 notifications might be a contravention of the GDPR accountability principle).

It might well be that the actions or omissions leading to a PDB would constitute an infringement of Articles 5(1)(f) and 32, but if an incident does not meet the definition in Article 4(12), then it’s not a PDB, and no notification obligation arises. (Note that this is an analysis of the position under the GDPR/UK GDPR – I am not dealing with whether notification obligations to any other regulator arise.)

I can’t pretend I’m wholly comfortable saying to 50% of the data protection community, and to the ICO and EDPB, that they’re wrong on this point, but I’m comfortable that I have a good arguable position, and that it’s one that a judge would, on balance, agree with.

If I’m right, maybe the legislator of the GDPR/UK GDPR missed something, and maybe availability issues should be contained within the Article 4(12) definition. If so, there’s nothing to stop both the UK and the EU legislators amending Article 4(12) accordingly. And if I’m wrong, there’s nothing to stop them amending it to make it more clear. In the UK, in particular, with a new, energised government, a new Minister for Data Protection, and a legislative agenda that will include bills dealing with data issues, this would be relatively straightforward. Let’s see.

And I would not criticise any controller which decided it was appropriate to make an Article 33 notification. It might, on balance, be prudent for some affected controllers to do so. The 50/50 split on my poll indicates the level of uncertainty on the part of the profession. One also suspects that the ICO and the EU supervisory authorities might receive a lot of precautionary notifications.

Heck, I’ll say it – if anyone wants to instruct me and my firm to advise, both on law and on legal strategy – we would of course be delighted to do so.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

1 Comment

Filed under Data Protection, EDPB, GDPR, Information Commissioner, Let's Blame Data Protection, LinkedIn Post, personal data breach, UK GDPR

“Mom, we have discussed this”

A few years ago Gwyneth Paltrow’s daughter Apple took to social media to gently berate her mother for posting an image (not this one) which included her: “You may not post anything without my consent”. I’ve no idea whether Apple has other fine qualities, but I admired her approach here.

I was reminded of it by the – also admirable – approach by the Prime Minister and his wife to their two children’s privacy. Remarkably, it appears that their names and photographs have so far been kept from publication. It’s doubtful that will be able to continue forever (in any case, the children are at or coming to an age where they can take their own decisions) but I like the marked contrast with how many senior politicians co-opt their children into their campaigning platform.

One of the concerns of the legislator, when GDPR was being drafted, was children’s rights: recital 65 specifically addresses the situation of where a child has consented to publication of their data online, but later wants it removed.

Although Gwyneth Paltrow’s publishing of her child’s image would likely have been out of the material scope of GDPR under Article 2(2)(a) (and quite possibly out of its territorial scope) the thrust of recital 38 should apply generally: “Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data”.

[Image licensed under CC BY-NC 4.0, creator not stated. Image altered to obscure children’s faces]

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under children, consent, Data Protection, GDPR, Privacy, UK GDPR