Category Archives: Data Protection

Why is the ICO so quiet about prosecutions?

Not infrequently, I get contacted (personally and professionally) by individuals who are concerned that their personal data has been compromised in circumstances that may constitute the criminal offences of unlawfully “obtaining” or “retaining” personal data under section 170 of the Data Protection Act 2018.

In many cases, there is not much I can bring to the table. If an offence has been committed then this is a matter for the prosecutor. Normally, for data protection offences, this is the Information Commissioner’s Office.

But what strikes me is that there appears to be no information on the ICO website for anyone who wants to report an alleged or potential offence. Their “For the public” pages don’t cover the scenario, and all of the data protection complaints information there is predicated on the assumption that the individual will be complaining about the data controller’s compliance (whereas, in a section 170 offence, the controller is more in the position of “victim”).

In fact, the best I can find is one brief reference (at page 61) in a lengthy guide to the DPA 2018, aimed at “organisations and individuals who are already familiar with data protection law”, which doesn’t actually explain that the offences described can be prosecuted by the ICO.

Dr David Erdos has recently highlighted both the low number of ICO prosecutions, and the rather slapdash way in which the ICO appears to be handling information about them. But the section 170 provisions are criminal ones for a reason: they will sometimes involve the most distressing and serious interferences with people’s data protection and privacy rights.

Surely the ICO should pay more attention to such incidents, and assist concerned data subjects (or others) who might want to report potential offences?

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Data Protection Act 2018, Information Commissioner, offences

Concerns over the Public Authorities (Fraud, Error and Recovery) Bill

When it comes to proposed legislation, most data protection commentary has understandably been on the Data (Use and Access) Bill, but it’s important also to note some of the provisions of the Public Authorities (Fraud, Error and Recovery) Bill, introduced in the Commons on 25 January.

The abandoned Tory Data Protection and Digital Information Bill would have conferred powers on the DWP to inspect bank accounts for evidence of fraud. To his credit, the Information Commissioner John Edwards, in evidence given on that earlier Bill, had warned of the “significant intrusion” those powers would have created, and said that he had not seen evidence to assure him that they were proportionate. This may be a key reason why they didn’t reappear in the DUA Bill.

The Public Authorities (Fraud, Error and Recovery) Bill does, however, at clause 74 and schedule 3, propose that the DWP will be able to require banks to search their own data to identify whether recipients of Universal Credit, ESA and Pension Credit meet criteria for investigation for potential fraud.

But such investigative powers are only as good as the data, and the data governance, in place. And as the redoubtable John Pring of Disability News Service reports, many disabled activists are rightly concerned about the potential for damaging errors. In evidence to the Bill Committee one activist noted that “even if there was an error rate of just 0.1 per cent during this process, that would still mean thousands of people showing up as ‘false positives’, even if it just examined those on means-tested benefits”.
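To give a rough sense of the scale the witness was describing, here is a minimal sketch of the arithmetic in Python. The caseload figures are illustrative assumptions of my own (order-of-magnitude only), not numbers taken from the Bill, the Committee evidence or DWP statistics.

def false_positives(caseload: int, error_rate: float) -> int:
    # Expected number of people wrongly flagged at a given error rate
    return round(caseload * error_rate)

# Assumed caseloads, purely for illustration (not official figures)
assumed_caseloads = {
    "Universal Credit": 6_500_000,
    "ESA": 1_500_000,
    "Pension Credit": 1_400_000,
}

error_rate = 0.001  # the 0.1% figure cited in evidence to the Bill Committee

for benefit, caseload in assumed_caseloads.items():
    print(f"{benefit}: roughly {false_positives(caseload, error_rate):,} false positives")

Even on these assumed figures, a 0.1% error rate would wrongly flag several thousand people across the three benefits, which is the point the witness was making.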

The Bill does not appear to confer any specific role on the Information Commissioner in this regard, although there will be an independent reviewer, and – again, creditably – the Commissioner has said that although he could not be the reviewer himself, he would expect to be involved.

It is also worth reading the concerns of the Public Law Project, contained in written evidence to the Bill Committee.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under accuracy, Data Protection, data sharing, Information Commissioner

Clarity needed on NHS publication of reports into homicides

[reposted from my LinkedIn account]

Does the law need clarifying on the publication of reviews into homicides by those receiving mental health services from the NHS?

The Times recently led with stories that NHS England was refusing to publish the full independent report into the health care and treatment of Valdo Calocane prior to his manslaughter of three people in Nottingham in 2023. NHSE apparently argued that data protection and patient confidentiality concerns prevented it from publishing anything but a summary. Under pressure from victims’ families and the media, NHSE about-turned, and the full report is reported to contain damning details of failings in Calocane’s treatment which were not in the summary version.

Now The Times reports that this is part of a pattern, since last year, of failure to publish full reviews of homicides by mental health patients, contrary to previous practice. It says that NHSE received legal advice that the practice “could breach data protection rules and the killers’ right to patient confidentiality”. The charity Hundred Families talks of cases where the names of victims are not published, or even the identity of the NHS Trust involved.

Of course, without seeing the advice, it is difficult to comment with any conviction, but I did write in recent days about how the law can justify publication where it is “necessary for a protective function”, such as exposing malpractice or failures in services. And it’s important to note that, in many cases, such reports show failings which mean that the killers themselves have been let down by inadequate treatment: publication can surely, in some cases, cast light on this so that similar failings don’t happen in the future. In any case, guidance says that those preparing reports should do so with a view to their being published, and so confidentiality concerns should be taken into account in the drafting.

However, if NHSE remains concerned about the legality of publication, and if its legal advice continues to say that data protection and medical confidentiality law militate against disclosure, it strikes me that this might call for Parliament to legislate. It would also be welcome, I think, if the Information Commissioner’s Office issued a statement on the legal issues arising.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under access to information, Confidentiality, Data Protection, Information Commissioner, NHS

Is the legal sector really suffering a flood of data breaches?

[reposted from my LinkedIn account]

There have been various articles in the media recently, reporting a significant rise in personal data breaches reported by the legal sector to the Information Commissioner’s Office. I have some real doubts about the figures.

An example article says

A new analysis of data from the Information Commissioner’s Office (ICO) by NetDocuments has revealed a sharp increase in data breaches across the UK legal sector. In the period between Q3 2023 and Q2 2024, the number of identified data breaches in the UK legal sector rose by 39% (2,284 cases were reported to the ICO, compared to 1,633 the previous year)

But something didn’t seem right about those numbers. The ICO say that they have received 60,607 personal data breach reports since their current reporting methods began in Q2 2019 (see their business intelligence visualised database), so it seemed remarkable to suggest that the legal sector was scoring so highly. And, indeed, when I look at the ICO BI data for self-reported personal data breaches, filtered for the legal sector, I see only 197 reported in Q3 2023, and, coincidentally, 197 in Q2 2024 (see attached visuals) – an increase from one relatively low number to another relatively low number of precisely 0%.
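For anyone who wants to check the arithmetic, here is a minimal sketch in Python. The figures are simply those quoted above: the annual totals reported in the article, and the quarterly legal-sector figures I found in the ICO’s BI data.

def percent_change(old: float, new: float) -> float:
    # Percentage change from old to new
    return (new - old) / old * 100

# Annual totals quoted in the NetDocuments analysis
print(percent_change(1_633, 2_284))  # ~39.9%, consistent with the reported 39% rise

# Quarterly legal-sector figures from the ICO's BI data
print(percent_change(197, 197))      # 0.0%

The 39% figure is internally consistent with the article’s own numbers; the puzzle is how those numbers relate to what the ICO’s published data actually shows.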

A serious question to those more proficient with data than I am – am I missing something?

If I’m not, I really think the ICO should issue some sort of corrective statement.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, data security, Information Commissioner, personal data breach

NHS England and publication of the Calocane report

[reposted from my LinkedIn account]

[Edited to add: the day following the upload of this post, NHS England did an about-turn and published the report in full, saying “The NHS has taken the decision to publish the report in full in line with the wishes of the families and given the level of detail already in the public domain”]

NHS England is reported to be refusing, partly on data protection grounds, to publish the full independent review report into the care and treatment of Valdo Calocane prior to his manslaughter of three people in Nottingham in 2023.

The report is said to be over 200 pages long, and although a summary will be published, families of the victims are calling for the full report (which they only saw after pressure from their lawyers) to be published on public interest grounds, saying “we have grave concerns about the conduct of the NHS”.

So does data protection law prevent disclosure?

The report will clearly contain details of Calocane’s health, and as such it constitutes a special category of personal data, requiring a condition for processing from Article 9 of the UK GDPR. The most likely candidate would be Article 9(2)(g):

processing is necessary for reasons of substantial public interest, on the basis of domestic law….

The domestic law provisions referred to are contained in schedule 1 to the Data Protection Act 2018. And at first glance, it is not straightforward to identify a provision which would permit disclosure.

However, paragraph 11 potentially does. It covers processing which is necessary for the exercise of a “protective function”, which must be carried out without the consent of the data subject so as not to prejudice that function, and which is necessary for reasons of substantial public interest. A “protective function” includes a function which is intended to protect members of the public against failures in services provided by a body or association.

Reports into homicides by patients in receipt of mental health care are commissioned by NHS England under the Serious Incident Framework “Supporting learning to prevent recurrence”, and this says that “publication of serious incident investigation reports and action plans is considered best practice”, although “reports should not contain confidential personal information unless…there is an overriding public interest”.

I’m not saying that whether the report can be published is a straightforward legal question, but an argument can be made that there is a substantial, overriding public interest in disclosure, in order that the public can be aware of any failings and understand what actions are being taken to address them. No doubt, though, NHS England’s argument would be that this is achieved by publication of the summary report.

I imagine, in any case, that freedom of information requests will be made for the full report, so ultimately we may see the Information Commissioner’s Office, and maybe the courts, rule on this.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under access to information, Data Protection, Data Protection Act 2018, NHS, UK GDPR

Exceptionally unlikely: ICO and judicial review

[reposted from my LinkedIn account]

Where Parliament has entrusted a specialist body with bringing prosecutions, such as the Serious Fraud Office, or the Information Commissioner’s Office (ICO), it is “only in highly exceptional circumstances” that a court will disturb a decision made by that body (see Lord Bingham in R (Corner House and others) v Director of the Serious Fraud Office [2008] UKHL 60).

Such was the situation faced by the claimant in an unsuccessful recent application for judicial review of two decisions of the ICO.

The claimant, at the time of the events in question, was a member of the Labour Party and of the Party’s “LGBT+Labour” group. She had been concerned about an apparent disclosure of the identity and trans status of 120 members of a “Trans Forum” within the group, of which she was also a member, and about what she felt was a failure by the LGBT+Labour group to inform members of the Forum of what had happened.

She reported this to the ICO as potential offences under sections 170 and 173 of the Data Protection Act 2018 (it’s not entirely clear what specific offences would have been committed), and she asked whether she was “able to discuss matters relating to potential data breaches with the individuals involved”. The ICO ultimately declined to prosecute, and also informed her that disclosing information to the individuals could in itself “potentially be a section 170 offence”.

The application for judicial review was i) in respect of the “warning” about a potential prosecution in the event she disclosed information to those data subjects, and her subsequent rejected request for a commitment that she would not be prosecuted, and ii) in respect of the decision not to prosecute LGBT+Labour.

Neither application for permission succeeded. In the first case, there was no decision capable of being challenged: it was an uncontroversial statement by the ICO about a hypothetical and fact-sensitive future situation, and in any event she was out of time in bringing the application. In the second case, there were no “highly exceptional circumstances” that would enable the court “to consider there was a realistic prospect of showing that the ICO had acted outside the wide range of its discretion when deciding not to prosecute”.

One often sees suggestions that the ICO should be JR’d over its failure to take action (often in a civil context). This case illustrates the deference that the courts will give to its status and expertise both as regulator and as prosecutor. Outside the most exceptional of cases, such challenges are highly unlikely to succeed.

Peto v Information Commissioner [2025] EWHC 146 (Admin)

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under crime, Data Protection, Data Protection Act 2018, Information Commissioner, judgments, judicial review

Consent is not the only basis

In 2017 I attended a free event run by a “GDPR consultancy”. The presenter confidently told us that we were going to have to get consent from customers in order to process their personal data. One attendee said they worked at the DWP, and asked how they were supposed to get consent from benefits claimants who didn’t want to disclose their income, to which the presenter rather awkwardly said “I think that’s one you’ll have to discuss with your lawyers”. Another attendee, by now most irritated that he’d taken time out from work for this, could hold his thoughts in no longer, and rudely announced that this was complete nonsense.

That attendee was the – much ruder in those days – 2017 version of me.

I never imagined (although I probably should have done) that eight years on the same nonsense would still be spouted.

Just as the Data Protection Act 2018 did not implement the GDPR in the UK (despite the embarrassing government page which, until recently, and despite people raising it countless times, said that it did), and just as the GDPR does not limit its protections to “EU citizens”, so the GDPR and the UK GDPR do not require consent for all processing.

Anyone who says so has not applied a smidgeon of thought or research to the question, and is probably taking content from generative AI, which, on the time-honoured principle of garbage-in, garbage-out, has been in part trained on the existing nonsense. To realise why it’s garbage, they should just start with the DWP example above and work outwards from there.

Consent is one of the six lawful bases, any one or more of which can justify processing. No one basis is better than, or takes precedence over, the others.

To those who know this, I apologise for having to write it down, but I want to have a sign to tap for any time I see someone amplifying the garbage on LinkedIn.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Data Protection Act 2018, DWP, GDPR, Let's Blame Data Protection, UK GDPR

Cookies, compliance and individuated consent

[reposted from my LinkedIn account]

Much will be written about the recent High Court judgment on cookies, direct marketing and consent in RTM v Bonne Terre & Anor, but treat it all (including, of course, this) with caution.

This was a damages claim by a person with a gambling disorder. The claim was, in terms, that the defendant’s tracking of his online activities, and associated serving of direct marketing, were unlawful, because they lacked his operative consent, and they led to damage because they caused him to gamble well beyond his means. The judgment was only on liability, and at the time of writing this post there has been no ruling on remedy, or quantum of damages.

The domestic courts are not regulators – they decide individual cases, and where a damages claim is made by an individual any judicial analysis is likely to be highly fact specific. That is certainly the case here, and paragraphs 179-181 are key:

such points of criticism as can be made of [the defendant’s] privacy policies and consenting mechanisms…are not made wholesale or in a vacuum. Nor are they concerned with any broader question about best practice at the time, nor with the wisdom of relying on this evidential base in general for the presence of the consents in turn relied on for the lawfulness of the processing undertaken. Such general matters are the proper domain of the regulators.

In this case, the defendant could not defeat a challenge that in the case of this claimant its policies and consenting mechanisms were insufficient:

If challenged by an individual data subject, a data controller has to be able to demonstrate the consenting it relies on in a particular case. And if that challenge is put in front of a court, a court must decide on the balance of probabilities, and within the full factual matrix placed before it, whether the data controller had a lawful consent basis for processing the data in question or not.

Does this mean that a controller has to get some sort of separate, individuated consent for every data subject? Of course not: but that does not mean that a controller whose policies and consenting mechanisms are adequate in the vast majority of cases is fully insulated from a specific challenge from someone who could not give operative consent:

In the overwhelming majority of cases – perhaps nearly always – a data controller providing careful consenting mechanisms and good quality, accessible, privacy information will not face a consent challenge. Such data controllers will have equipped almost all of their data subjects to make autonomous decisions about the consents they give and to take such control as they wish of their personal data…But all of that is consistent with an ineradicable minimum of cases where the best processes and the most robust evidential provisions do not, in fact, establish the necessary presence of autonomous decision-making, because there is specific evidence to the contrary.

This is, one feels, correct as a matter of law, but it is hardly a happy situation for those tasked with assessing legal risk.

And the judgment should (but of course won’t) silence those who promise, or announce, “full compliance” with data protection and electronic marketing law.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under adtech, consent, cookies, Data Protection, GDPR, judgments, marketing, PECR, Uncategorized

Disclosing details of successful job candidates

[reposted from my LinkedIn account]

Jones v Secretary of State for Health And Social Care [2024] EWCA Civ 1568

A question for data protection advisers. If you are asked by an unsuccessful candidate for a job what the age, gender and ethnic origin of the successful candidate were, do you disclose? (And what are your Article 6 basis and Article 9 UK GDPR condition for doing so?)

These questions are prompted by an interesting employment case in the Court of Appeal.

The appellant, who self-describes as black Caribbean, interviewed for a business development role at Public Health England (PHE) on 28 March 2019 but was not told, despite chasing, until 3 July 2019 that he had been unsuccessful. This was already outside the primary three-month limitation period for bringing a claim in the employment tribunal (ET).

He then asked PHE for the “age, gender and ethnic origin” of the successful candidate, and explained that he needed the information to decide whether or not to make a claim in the ET.

It is not entirely clear what then happened: it’s suggested that PHE initially refused, but told the claimant he could make an FOI request, and there is also a suggestion that he was told that if he provided proof of his identity they would provide the information. In any event, he was not informed until much later in the proceedings that the successful candidate was white British.

His ET claim for discrimination was, therefore, submitted out of time. The ET can only extend the time for such a claim where it is “just and equitable” to do so, and, here, the ET held that it was not: he put off making his claim “because he was on an information gathering exercise. He was looking for the evidence to bolster his claim…Despite the Claimant’s criticisms, the respondent did in fact provide him with information and an explanation of its actions quite early on in the chronology. It gave him enough information to know that there was a claim for him to make if he wanted to present it to the Tribunal”. And, in any case, the ET dismissed the claim on its merits.

On appeal to the Employment Appeal Tribunal (EAT) the claimant submitted that it had been perverse of the ET to refuse to exercise its discretion to extend the time for making the application, but the EAT held that the ET had made no error of law in that regard.

The Court of Appeal felt differently: it was wrong for the ET to have held that the claimant had had, much earlier, the “raw materials” on which to formulate his claim, and although it was correct that he was looking for information to bolster his claim, this ought not to have been held against him. “The information he was seeking about the ethnicity of the successful candidate was an essential part of his claim”.

Accordingly, the ET’s decision not to extend time under the “just and equitable” test was perverse, the order of the EAT upholding that decision was set aside, and the case was remitted to the EAT on its merits.

And I guess my answer to my own questions at the start of this post would be: one or both of Articles 6(1)(c) and 6(1)(f), and Article 9(2)(f). But in all those cases, it’s going to be difficult for the controller to make the appropriate call on whether the request for information means that it’s necessary to make the disclosure, or whether it’s just a frivolous or aimless request.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under access to information, Data Protection, employment, judgments

Data protection claims against persons unknown

[reposted from my LinkedIn account]

Chirkunov v Person(s) Unknown & Ors [2024] EWHC 3177 (KB)

This is an important, and quite withering, judgment from Mr Justice Nicklin, which ends with a suggestion that, from now on, applications for permission to serve a Claim Form on ‘Persons Unknown’ out of the jurisdiction in claims in the Media & Communications List should not be dealt with without a hearing, unless a Master or Judge directs that a hearing is not necessary. The judgment records that, before hand-down, Nicklin J consulted the Judges in charge of the MAC List and that they have endorsed his suggestion as the practice now to be followed in the MAC List.

The judgment is on an application to serve, out of the jurisdiction, a data protection claim on two persons unknown (the publishers of two websites said to infringe the data protection rights of the claimant). The claimant initially applied for the orders to be made without a hearing, but Mrs Justice Steyn gave directions for there to be a hearing.

Nicklin J was clearly unimpressed by the limited efforts the claimant and his lawyers had made to identify/locate the defendants, noting that the Norwich Pharmacal procedures had been available to the claimant, and concluded that “the Claimant has simply chosen not to pursue several avenues of investigation, including applications for Norwich Pharmacal relief. The basis for this decision is unpersuasive and unimpressive. On the evidence that has been provided, I am left with a very clear impression that the Claimant thought that he could avail himself of a simple short-cut – avoiding the cost of further investigations to identify the Defendants – by the expedient of issuing a claim against ‘Persons Unknown’”.

For this and other reasons the judge was also unwilling to give permission to serve out on persons unknown. Although such litigation can serve a purpose in some blackmail/cyber attack cases, for instance to “obtain interim remedies which can be used to counter the defendant’s threat to publish information that forms the basis of the blackmail/extortion threat”, he was not prepared to permit “litigation against someone who cannot be identified other than a description of his/her role, and with no indication of the state in which s/he is domiciled”.

Also notable was the judge’s approach to the part of the application which sought a declaration that the personal data on the website was inaccurate. The claimant was not “entitled” to such a declaration, and, in fact (Cleary v Marston Holdings and Aven v Orbis applied) declarations are not provided for under the data protection legislation and not generally granted in such litigation. The judge had “real difficulty in imagining the circumstances in which the Court would grant a declaration of “inaccuracy” in a data protection claim following a default judgment”.

The application was refused on all grounds.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, judgments, Norwich Pharmacal