Exceptionally unlikely: ICO and judicial review

[reposted from my LinkedIn account]

Where Parliament has entrusted a specialist body, such as the Serious Fraud Office or the Information Commissioner’s Office (ICO), with bringing prosecutions, it is “only in highly exceptional circumstances” that a court will disturb a decision made by that body (see Lord Bingham in R (Corner House and others) v Director of the Serious Fraud Office [2008] UKHL 60).

Such was the situation faced by the claimant in a recent, unsuccessful application for judicial review of two decisions of the ICO.

The claimant, at the time of the events in question, was a member of the Labour Party and of the Party’s “LGBT+Labour” group. She had been concerned about an apparent disclosure of the identity and trans status of 120 members of a “Trans Forum” within the group, of which she was also a member, and about what she felt was a failure by the LGBT+Labour group to inform members of the Forum of what had happened.

She reported this to the ICO as potential offences under sections 170 and 173 of the Data Protection Act 2018 (it’s not entirely clear what specific offences would have been committed), and she asked whether she was “able to discuss matters relating to potential data breaches with the individuals involved”. The ICO ultimately declined to prosecute, and also informed her that disclosing information to the individuals could in itself “potentially be a section 170 offence”.

The application for judicial review was i) in respect of the “warning” about a potential prosecution in the event she disclosed information to those data subjects, and her subsequent rejected request for a commitment that she would not be prosecuted, and ii) in respect of the decision not to prosecute LGBT+Labour.

Neither application for permission succeeded. In the first case, there was no decision capable of being challenged: it was an uncontroversial statement by the ICO about a hypothetical and fact-sensitive future situation, and in any event she was out of time in bringing the application. In the second case, there were no “highly exceptional circumstances” that would enable the court “to consider there was a realistic prospect of showing that the ICO had acted outside the wide range of its discretion when deciding not to prosecute”.

One often sees suggestions that the ICO should be JRd over its failure to take action (often in a civil context). This case illustrates the deference that the courts will give to its status and expertise both as regulator and prosecutor. Outside the most exceptional of cases, such challenges are highly unlikely to succeed.

Peto v Information Commissioner [2025] EWHC 146 (Admin)

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under crime, Data Protection, Data Protection Act 2018, Information Commissioner, judgments, judicial review

Closed MI5 material in the Information Tribunal

You don’t know what you don’t know.

A recent judgment in the Information Tribunal is a good example of this platitude in the context of access to information held by public authorities.

The applicant had asked MI5, under the Environmental Information Regulations 2004 (EIR), for information on its CO2 emissions (by reference to the Greenhouse Gas Protocol). MI5 refused to disclose, in reliance on the exception to disclosure at regulation 12(5)(a), on the grounds that disclosure would adversely affect national security. This refusal was upheld by the Information Commissioner’s Office.

Perhaps unsurprisingly, the applicant was sceptical. The judgment notes that

she said that MI5 had not demonstrated a causal link between the disclosure of the information and the claimed adverse effect of that disclosure; MI5 had not provided any evidence that the adverse effect of disclosure was more likely than not to occur. She described the position of MI5 to be based on assumptions and that they had overlooked the difficulty of inferring accurate information from emissions data

The Information Tribunal can, though, consider closed material in EIR and FOI proceedings (i.e. information and evidence which the applicant cannot see or hear). And in this case, MI5 adduced closed evidence, in the form of “damage assessments” which

included submissions as to how the emissions data could be used and the nature of the conclusions that could be drawn from those data, whether analysing the data alone, by also using data in the public domain or by using comparators” and “identified stark and very accurate conclusions that could be drawn from the raw data itself with simple calculations

In the face of such evidence, the Tribunal inevitably dismissed the applicant’s appeal.

The judgment is well worth reading as an illustration of how the closed material procedure works.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under access to information, Environmental Information Regulations, Information Tribunal, national security

Consent is not the only basis

In 2017 I attended a free event run by a “GDPR consultancy”. The presenter confidently told us that we were going to have to get consent from customers in order to process their personal data. One attendee said they worked at the DWP, and asked how they were supposed to get consent from benefits claimants who didn’t want to disclose their income; the presenter rather awkwardly said “I think that’s one you’ll have to discuss with your lawyers”. Another attendee, who was by now most irritated that he’d taken time out from work for this, could hold his thoughts in no longer, and rudely announced that this was complete nonsense.

That attendee was the – much ruder in those days – 2017 version of me.

I never imagined (although I probably should have done) that eight years on the same nonsense would still be spouted.

Just as the Data Protection Act 2018 did not implement the GDPR in the UK (despite the embarrassing government page that, until recently and despite people raising it countless times, said so), and just as the GDPR does not limit its protections to “EU citizens”, so the GDPR and the UK GDPR do not require consent for all processing.

Anyone who says so has not applied a smidgeon of thought or research to the question, and is probably taking content from generative AI, which, on the time-honoured principle of garbage-in, garbage-out, has been in part trained on the existing nonsense. To realise why it’s garbage, they should just start with the DWP example above and work outwards from there.

Consent is one of the six lawful bases, any one or more of which can justify processing. No one basis is better than, or takes precedence over, the others.

To those who know this, I apologise for having to write it down, but I want to have a sign to tap for any time I see someone amplifying the garbage on LinkedIn.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Data Protection Act 2018, DWP, GDPR, Let's Blame Data Protection, UK GDPR

Cookies, compliance and individuated consent

[reposted from my LinkedIn account]

Much will be written about the recent High Court judgment on cookies, direct marketing and consent in RTM v Bonne Terre & Anor, but treat it all (including, of course, this) with caution.

This was a damages claim by a person with a gambling disorder. The claim was, in terms, that the defendant’s tracking of his online activities, and associated serving of direct marketing, were unlawful, because they lacked his operative consent, and they led to damage because they caused him to gamble well beyond his means. The judgment was only on liability, and at the time of writing this post there has been no ruling on remedy, or quantum of damages.

The domestic courts are not regulators – they decide individual cases, and where a damages claim is made by an individual any judicial analysis is likely to be highly fact specific. That is certainly the case here, and paragraphs 179-181 are key:

such points of criticism as can be made of [the defendant’s] privacy policies and consenting mechanisms…are not made wholesale or in a vacuum. Nor are they concerned with any broader question about best practice at the time, nor with the wisdom of relying on this evidential base in general for the presence of the consents in turn relied on for the lawfulness of the processing undertaken. Such general matters are the proper domain of the regulators.

In this case, the defendant could not defeat a challenge that in the case of this claimant its policies and consenting mechanisms were insufficient:

If challenged by an individual data subject, a data controller has to be able to demonstrate the consenting it relies on in a particular case. And if that challenge is put in front of a court, a court must decide on the balance of probabilities, and within the full factual matrix placed before it, whether the data controller had a lawful consent basis for processing the data in question or not.

Does this mean that a controller has to get some sort of separate, individuated consent for every data subject? Of course not: but that does not mean that a controller whose policies and consenting mechanisms are adequate in the vast majority of cases is fully insulated from a specific challenge from someone who could not give operative consent:

In the overwhelming majority of cases – perhaps nearly always – a data controller providing careful consenting mechanisms and good quality, accessible, privacy information will not face a consent challenge. Such data controllers will have equipped almost all of their data subjects to make autonomous decisions about the consents they give and to take such control as they wish of their personal data…But all of that is consistent with an ineradicable minimum of cases where the best processes and the most robust evidential provisions do not, in fact, establish the necessary presence of autonomous decision-making, because there is specific evidence to the contrary.

This is, one feels, correct as a matter of law, but it is hardly a happy situation for those tasked with assessing legal risk.

And the judgment should (but of course won’t) silence those who promise, or announce, “full compliance” with data protection and electronic marketing law.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under adtech, consent, cookies, Data Protection, GDPR, judgments, marketing, PECR, Uncategorized

Disclosing details of successful candidates for jobs

[reposted from my LinkedIn account]

Jones v Secretary of State for Health And Social Care [2024] EWCA Civ 1568

A question for data protection advisers. If you are asked by an unsuccessful candidate for a job what the age, gender and ethnic origin of the successful candidate were, do you disclose? (And what is your Article 6 basis and Article 9 UK GDPR condition for doing so?)

These questions are prompted by an interesting employment case in the Court of Appeal.

The appellant, who self-describes as black Caribbean, interviewed for a business development role at Public Health England (PHE) on 28 March 2019 but was not told, despite chasing, until 3 July 2019 that he had been unsuccessful. This was already outside the primary three-month limitation period for bringing a claim in the employment tribunal (ET).

He then asked PHE for the “age, gender and ethnic origin” of the successful candidate, and explained that he needed the information to decide whether or not to make a claim in the ET.

It is not entirely clear what then happened: it’s suggested that PHE initially refused, but told the claimant he could make an FOI request, and there is also a suggestion that he was told that if he provided proof of his identity they would provide the information. In any event, he was not informed until much later in the proceedings that the successful candidate was white British.

His ET claim for discrimination was, therefore, submitted out of time. The ET can only extend the time for such a claim where it is “just and equitable” to do so, and, here, the ET held that it was not: he put off making his claim “because he was on an information gathering exercise. He was looking for the evidence to bolster his claim…Despite the Claimant’s criticisms, the respondent did in fact provide him with information and an explanation of its actions quite early on in the chronology. It gave him enough information to know that there was a claim for him to make if he wanted to present it to the Tribunal”. And, in any case, the ET dismissed the claim on its merits.

On appeal to the Employment Appeal Tribunal (EAT) the claimant submitted that it had been perverse of the ET to refuse to exercise its discretion to extend the time for making the application, but the EAT held that the ET had made no error of law in that regard.

The Court of Appeal felt differently; it was wrong for the ET to have held that the claimant had had, much earlier, the “raw materials” on which to formulate his claim, and although it was correct that he was looking for information to bolster his claim, this ought not to have been held against him. “The information he was seeking about the ethnicity of the successful candidate was an essential part of his claim”.

Accordingly, the ET’s decision not to extend time under the “just and equitable” test was perverse; the order of the EAT upholding that decision was set aside, and the case was remitted on its merits to the EAT.

And I guess my answer to my own questions at the start of this post would be: one or both of Articles 6(1)(c) and 6(1)(f), and Article 9(2)(f). But in all those cases, it’s going to be difficult for the controller to make the appropriate call on whether the request for information means that it’s necessary to make the disclosure, or whether it’s just a frivolous or aimless request.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under access to information, Data Protection, employment, judgments

Data protection claims against persons unknown

[reposted from my LinkedIn account]

Chirkunov v Person(s) Unknown & Ors [2024] EWHC 3177 (KB)

This is an important, and quite withering, judgment from Mr Justice Nicklin, which ends with a suggestion that, from now on, applications for permission to serve a Claim Form on ‘Persons Unknown’ out of the jurisdiction in claims in the Media & Communications List (the “MAC List”) should not be dealt with without a hearing, unless a Master or Judge directs that a hearing is not necessary. The judgment records that, before hand-down, Nicklin J consulted the Judges in charge of the MAC List, and they have endorsed his suggestion as the practice now to be followed in the MAC List.

The judgment is on an application to serve, out of the jurisdiction, a data protection claim on two persons unknown (the publishers of two websites said to infringe the data protection rights of the claimant). The claimant initially applied for the orders to be made without a hearing, but Mrs Justice Steyn gave directions for there to be a hearing.

Nicklin J was clearly unimpressed by the limited efforts the claimant and his lawyers had made to identify/locate the defendants, noting that the Norwich Pharmacal procedures had been available to the claimant, and concluded that “the Claimant has simply chosen not to pursue several avenues of investigation, including applications for Norwich Pharmacal relief. The basis for this decision is unpersuasive and unimpressive. On the evidence that has been provided, I am left with a very clear impression that the Claimant thought that he could avail himself of a simple short-cut – avoiding the cost of further investigations to identify the Defendants – by the expedient of issuing a claim against ‘Persons Unknown’”.

For this and other reasons the judge was also unwilling to give permission to serve out on persons unknown. Although such litigation can serve a purpose in some blackmail/cyber attack cases, for instance to “obtain interim remedies which can be used to counter the defendant’s threat to publish information that forms the basis of the blackmail/extortion threat”, he was not prepared to permit “litigation against someone who cannot be identified other than a description of his/her role, and with no indication of the state in which s/he is domiciled”.

Also notable was the judge’s approach to the part of the application which sought a declaration that the personal data on the website was inaccurate. The claimant was not “entitled” to such a declaration, and, in fact (Cleary v Marston Holdings and Aven v Orbis applied) declarations are not provided for under the data protection legislation and not generally granted in such litigation. The judge had “real difficulty in imagining the circumstances in which the Court would grant a declaration of “inaccuracy” in a data protection claim following a default judgment”.

The application was refused on all grounds.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, judgments, Norwich Pharmacal

Could the right to erasure in data protection law break AI?

[reposted from my LinkedIn account]

I ask this only partly in jest.

The story of how ChatGPT refused to acknowledge the existence of “David Mayer” and some others, perhaps (probably?) because people with those names had exercised their erasure rights (such as the right at Article 17 of the GDPR and the UK GDPR), raises the interesting question: if a sufficient number of people made such requests, would the LLM begin to fail?

If so, a further question of rights arises. If I, Jon Baines, exercise my erasure right against ChatGPT (or another platform/LLM), and it suppresses any processing of the words “Jon Baines”, what effect might that have on my namesake Jon Baines, and his travel company? Or Jon the Ocean Specialist working on the Ocean Watch program?

Because the words “Jon Baines”, in isolation, are not my personal data. In isolation, they do not relate to me. A crude response to an erasure request, just as with any of the other crude approaches which AI is capable of (for instance in relation to accuracy), runs the risk of interfering with others’ rights, including rights to operate a business and rights to freedom of expression.
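
To see why a crude, string-level suppression would be over-broad, here is a purely hypothetical sketch in Python (it does not describe how any real LLM provider actually handles erasure requests): a post-processing filter that refuses to emit any output mentioning a suppressed name will also swallow content about every namesake.

    # Hypothetical illustration only: a naive, string-level "erasure" filter of
    # the kind speculated about above, not any provider's actual implementation.

    SUPPRESSED_NAMES = {"jon baines"}  # names assumed to be subject to erasure requests

    def filter_output(text: str) -> str:
        """Refuse to emit any output that mentions a suppressed name."""
        lowered = text.lower()
        if any(name in lowered for name in SUPPRESSED_NAMES):
            return "I'm unable to produce a response."
        return text

    # The filter blocks content relating to the person who made the request...
    print(filter_output("Jon Baines writes about data protection law."))

    # ...but it equally blocks content about unrelated namesakes, such as a
    # travel company run by another Jon Baines, interfering with their
    # interests and with others' freedom of expression.
    print(filter_output("Jon Baines runs a specialist travel company."))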

I don’t have an answer, but this is just one extra point and possible flaw in AI which will no doubt play out over the coming years.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under AI, Data Protection, erasure

The Data Protection Act 2018 does not “implement” the GDPR

They are separate instruments and the GDPR, pre-Brexit, did not require implementation – as a Regulation of the European Parliament and of the Council of the European Union, it had direct effect.

Since Brexit, by the effect of, among other laws, the European Union (Withdrawal) Act 2018 and the Data Protection, Privacy and Electronic Communications (Amendments Etc.) (EU Exit) Regulations 2019, we now have a retained-and-assimilated domestic version of the GDPR, called the UK GDPR.

Most processing of personal data is subject to the UK GDPR. The Data Protection Act 2018 deals with processing that is not subject to it, such as by law enforcement and security service agencies. It also provides some of the conditions and exemptions in relation to processing under the UK GDPR.

[None of this is new, and none of it will be unknown to genuine practitioners in the field, but I’m posting it here as a convenient sign to tap, at appropriate moments.]

Filed under Data Protection, Data Protection Act 2018, GDPR, UK GDPR

Banks to be required to snoop on customers’ accounts

[reposted from my LinkedIn account]

A recently announced “DWP Fraud, Error and Debt Bill” will propose obligations on banks and financial institutions to “examine their own data sets to highlight where someone may not be eligible for the benefits they are being paid” and share relevant information with the Department for Work and Pensions (DWP).

This appears to be a new approach to the broad powers which would have been conferred on the DWP under clause 131 and schedule 11 of the shelved Data Protection and Digital Information Bill. Under those provisions the DWP would have been able to require banks and financial institutions to give general access to customer accounts (rather than on a targeted basis) for the purpose of identifying benefit fraud. Although the proposed powers were subject to a fair deal of criticism on the grounds of disproportionality, they remained in the final version of the bill which would almost certainly have been enacted if Mr Sunak had called a later election.

The DWP Fraud, Error and Debt Bill (which has not yet been introduced into Parliament but will be this session – so probably by Spring 2025) will propose an “Eligibility Verification measure” which, in figurative terms, will result in server-side snooping on accounts (i.e. by banks themselves) rather than the demand-side snooping the previous bill would have introduced.

We will have to wait for the details, but one thing is certain – this will require a lot of algorithmic automation, no doubt AI-driven, and the potential for errors will need to be addressed and mitigated.

It will also, surely, be a significant cost burden on banks and financial institutions. Whilst it’s generally hard to muster much sympathy in those circumstances, here we must consider the risk that the lowest-cost, highest-efficiency models which will be adopted may be the least protective of customers’ banking privacy and data protection rights.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, data sharing, DWP, Privacy

FOIA costs decision against applicant for failing to withdraw contempt application

A freedom of information requester is facing costs in what seems to have been a bit of a shambles before the First-tier Tribunal (FTT). I think this is rather concerning, albeit slightly convoluted, and, frankly, the whole thing is not assisted by a judgment that is strewn with errors and lacks coherence. In what follows I’ve had to piece together some of the information missing, or unclear, from the judgment.

It appears that the requester (AHB) had made a Freedom of Information Act 2000 (FOIA) request to the Royal Mint on 19 June (not July, as the FTT judgment says) 2021 for information about its “Garbled Coin Policy” in relation to repatriated UK currency. On 16 July 2021 the Royal Mint replied with what appears to have been a short narrative response. AHB complained to the Information Commissioner (ICO) on 28 September 2021, and ten months later the ICO held (very peremptorily, and rather oddly, I would say) that the Royal Mint held no information in relation to the original request.

AHB then appealed to the FTT and, in a judgment of 3 October 2023 (the “2023 judgment”), the FTT held that the ICO had erred in law, or in the exercise of his discretion, or both, because the Royal Mint held further information in relation to the request. It issued a judgment constituting a substitute decision notice (SDN), under which the Royal Mint was ordered to issue a fresh response within 35 days of the date on which the SDN was promulgated. The judgment specifically says “Failure to comply with this decision may result in the Tribunal making written certification of this fact pursuant to section 61 of the Freedom of Information Act 2000 and may be dealt with as a contempt of court”. The Royal Mint had chosen not to join itself to those proceedings, and neither AHB nor the ICO had applied for it to be joined.

It is not at all clear, from the judgment, what happened next, but it appears that the SDN, with its Order that the Royal Mint issue a fresh response, was not served on the Royal Mint itself (presumably this error arose from its not having been a party, although it was aware of the proceedings). Then, on 9 December 2023, having received no fresh response, and no doubt taking his cue from the SDN, AHB made an application to the FTT under section 61(4) of FOIA for the Royal Mint to be certified to the Upper Tribunal for contempt of court.

It appears that the FTT finally served the SDN on the Royal Mint on 22 December 2023 (the judgment at several points has this as the obviously impossible “22 December 2024”).

One assumes, at this point, that, although the SDN was not served on the Royal Mint until the 35 days from 3 October 2023 had already expired, the Order in the SDN still had effect. That being the case, it appears to have been incumbent on the Royal Mint’s lawyers to make an urgent application – for instance, for compliance with the Order to be waived, for relief from sanctions and for a new date for compliance to be set. Instead, they did not take action until 3 January 2024, when they wrote to the FTT suggesting that a response would be provided within a further 35 days. However, this was just correspondence – no actual application was made.

Eventually, a response was issued by the Royal Mint in relation to the SDN, on 5 February 2024, more than two-and-a-half years after AHB made his request.

AHB’s application for a contempt certification was still live though, and here I pause to observe that, on the information available, I am not surprised he took no action to withdraw it. He had been vindicated by the FTT’s SDN of 3 October 2023, and he was unaware that the SDN had erroneously not been served on the Royal Mint (in fact, it is not at all clear at what point he did become aware of this). In any case, as no application was made by the Royal Mint for further time, the Order in the SDN must still have been in effect. In fact the judgment alludes to this when it notes that AHB was “indicating” in his contempt application that the final Royal Mint response “was provided 125 days after the Substituted Decision Notice was issued and 90 days later than directed”.

In any event, the FTT declined to certify the failure to comply on time as contempt, because

whilst the Tribunal does consider that the Respondent could have acted more diligently on becoming aware of the Substituted Decision Notice, by applying for an extension of time and requesting permission to extend the time set out in the SDN, the Tribunal does not consider that [the Royal Mint’s lawyer] wilfully avoided complying with the order. The Tribunal accepts that he was simply not aware of the appropriate course of action to take in circumstances where a Court or Tribunal imposed a deadline that had already been missed. In any event, the approach taken is not sufficiently serious to warrant certification to the Upper Tribunal for contempt and the application is refused. [emphasis added]

I will pause here to say that it’s unusual, to say the least, for a court to accept a submission that a solicitor was not aware of what to do when in receipt of an order of a court. Most judges would be quite intolerant of such an argument.

But the story does not end there. In submissions dated 17 July 2024 the Royal Mint then “indicated an intention to pursue an application for the costs ‘of and associated with’ the [contempt] application”. Under rule 10 of The Tribunal Procedure (First-tier Tribunal) (General Regulatory Chamber) Rules 2009 the FTT may make an order in respect of costs but only if it considers that a party has acted unreasonably in bringing, defending or conducting the proceedings.

And, remarkably, the FTT acceded to the costs application, on the grounds that AHB did not withdraw his application for the FTT to certify the Royal Mint’s (undoubted) failure to comply with the 3 October 2023 Order, after he had finally received the fresh response of 5 February 2024. The FTT also took into account AHB’s reference to pursuing a “campaign” to encourage greater transparency.

But does this mean AHB has “acted unreasonably in…conducting the proceedings”? I’m far from convinced (in fact, I’m not convinced). The FTT says

The Tribunal does not consider that it is reasonable (or that any other reasonable person would consider it reasonable) for an application for a party to be certified to the Upper Tribunal for contempt of court to be used as part of a campaign to encourage greater transparency…The Tribunal considers that the obligation to deal with cases fairly, justly, and proportionately in circumstances where the Applicant accepts that he was in appropriately [sic] pursuing a “campaign” for other purposes and where the chances of success in relation to the Tribunal actually certifying the contempt may be limited may justify the making of a costs order against the Applicant.

Well, if I’m to be considered a reasonable person, then I do not think it unreasonable for a person to decide not to withdraw such an application where they have waited more than two-and-a-half years for an answer from a public authority to a simple FOIA request, and where the public authority has failed to comply with an Order, because its lawyer chose not to acquaint himself with procedural rules. Unreasonableness imposes a very high threshold, and this is shown by the fact that costs awards are extraordinarily rare in FOIA cases in the FTT (from my research I have only found two, in the twenty-odd years FOIA has been in effect, and one of those was overturned on appeal). AHB may have been tenacious, perhaps overly so, and he may have had ancillary reasons for (some of) his conduct, but – again – that does not connote unreasonableness.

Costs have not yet been awarded, as the FTT has adjourned for submissions on AHB’s means, and a breakdown of the Royal Mint’s costs.

I should end by saying there may be other material not in the public domain which provides a gloss on AHB’s conduct of the proceedings, but one can (and must) only go on what is in the public domain.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under access to information, contempt, costs, FOIA, Freedom of Information, Information Commissioner, Information Tribunal, judgments