Category Archives: data sharing

Data Protection risks to life: Should more be done?

I’ve written up my thoughts for the Mishcon de Reya website, on the baffling decision by the ICO to take no action in response to the most catastrophic data breach in UK history, which exposed many thousands of people to immediate risk to their lives.

https://www.mishcon.com/news/data-protection-risks-to-life-should-more-be-done


Filed under Data Protection, Data Protection Act 2018, data sharing, Information Commissioner, Ministry of Defence, UK GDPR

Oral disclosure of personal data: a new domestic case

“Pretexting” and “blagging” are forms of social engineering whereby someone attempts to extract information from a source by deception. One (unethical) example is when a journalist purports to be someone else in order to gather information for a story.

A recent misuse of private information and data protection judgment in the High Court deals with a different, and sadly not uncommon, example – where an estranged, abusive partner convinced a third party to give information about their partner so they can continue their harassment of them.

The claimant had worked at a JD Wetherspoon pub, but had left a few months previously. She had given her contact details, including her mother’s mobile phone number, to her manager, and the details were kept in a paper file, marked “Strictly Private and Confidential”, in a locked filing cabinet. While employed there she had been the victim of serious violence and harassment by a former partner, offences which included subjecting her to many unwanted phone calls. He was ultimately convicted of these offences and sentenced to 2 ½ years in prison. Her employer was aware of the claimant’s concerns about him.

While her abuser was on remand, he rang the pub, pretending to be a police officer who needed to contact the claimant urgently. Although the pub chain had guidance on pretexting, under which such attempts to acquire information should be declined initially and referred to head office, the pub gave out the claimant’s mother’s number to the abuser, who then managed to speak to (and verbally abuse) the claimant, causing understandable distress.

She brought claims in the county court in misuse of private information, breach of confidence and for breach of data protection law. She succeeded at first instance with the first two, but not with the data protection claim. Wetherspoons appealed and she cross-challenged, not by appeal but by way of a respondent’s notice, the rejection of the data protection claim.

In a well-reasoned judgment in Raine v JD Wetherspoon PLC [2025] EWHC 1593 (KB), Mr Justice Bright dismissed the defendant’s appeals. He rejected their argument that the claimant’s mother’s mobile phone number did not constitute the claimant’s information, or alternatively that it was not information in which she had a reasonable expectation of privacy: it was not ownership of the mobile phone that mattered, nor ownership of the account relating to it – what mattered was the information itself: knowledge of the relevant digits. As between the claimant and the defendant, that was the claimant’s information, which was undoubtedly private when given to the defendant and was intended to remain private, rather than being published to others.

The defendant then argued that there can be no cause of action for misuse of private information if the Claimant is unable to establish a claim under the DPA/GDPR, and, relatedly, that a data security duty could not arise under the scope of the tortious cause of action of misuse of private information. In all honesty I struggle to understand this argument, at least as articulated in the judgment, probably because, as the judge suggests, this was not a data security case involving failure to take measures to secure the information. Rather, it involved a positive act of misuse: the positive disclosure of the information by the defendant to the abuser.

The broadly similar appeal grounds in relation to breach of confidence failed, for broadly similar reasons.

The cross-challenge to the prior dismissal of the data protection claim, by contrast, succeeded. At first instance, the recorder had accepted the defendant’s argument that this was a case of purely oral disclosure of information and that, applying Scott v LGBT Foundation Limited, this was not “processing” of “personal data”. However, as the judge found, in Scott,

the information had only ever been provided to the defendant orally; and…then retained not in electronic or manual form in a filing system, but only in the memory of the individual who had received the original oral disclosure…In that case, there was no record, and no processing. Here, there was a record of the relevant information, and it was processed: the personnel file was accessed by [the defendant’s employee], the relevant information was extracted by her and provided in written form to [another employee], for him to communicate to [the abuser].

This fell “squarely within the definition of ‘processing’ in the GDPR at article 4(2)”. Furthermore, there was judicial authority in Holyoake v Candy that, in some circumstances, oral disclosure will constitute processing (a view supported by the European Court in Endemol Shine Finland Oy).

The award of £4,500 in damages for personal injury, in the form of an exacerbation of existing psychological injury, was upheld.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Breach of confidence, Data Protection, data sharing, GDPR, judgments, misuse of private information, Oral disclosure

Concerns over the Public Authorities (Fraud, Error and Recovery) Bill

When it comes to proposed legislation, most data protection commentary has understandably been on the Data (Use and Access) Bill, but it’s important also to note some of the provisions of the Public Authorities (Fraud, Error and Recovery) Bill, introduced in the Commons on 25 January.

The abandoned Tory Data Protection and Digital Information Bill would have conferred powers on the DWP to inspect bank accounts for evidence of fraud. To his credit, the Information Commissioner John Edwards, in evidence given on that earlier Bill, had warned about the “significant intrusion” those powers would have created, and that he had not seen evidence to assure him that they were proportionate. This may be a key reason why they didn’t reappear in the DUA Bill.

The Public Authorities (Fraud, Error and Recovery) Bill does, however, at clause 74 and schedule 3, propose that the DWP will be able to require banks to search their own data to identify whether recipients of Universal Credit, ESA and Pension Credit meet criteria for investigation for potential fraud.

But such investigative powers are only as good as the data, and the data governance, in place. And as the redoubtable John Pring of Disability News Service reports, many disabled activists are rightly concerned about the potential for damaging errors. In evidence to the Bill Committee one activist noted that “even if there was an error rate of just 0.1 per cent during this process, that would still mean thousands of people showing up as ‘false positives’, even if it just examined those on means-tested benefits”.

The Bill does not appear to confer any specific role on the Information Commissioner in this regard, although there will be an independent reviewer, and – again, creditably – the Commissioner has said that although he could not be the reviewer himself, he would expect to be involved.

It is worth also reading the concerns of the Public Law Project, contained in written evidence to the Bill Committee.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under accuracy, Data Protection, data sharing, Information Commissioner

Banks to be required to snoop on customers’ accounts

[reposted from my LinkedIn account]

A recently announced “DWP Fraud, Error and Debt Bill” will propose obligations on banks and financial institutions to “examine their own data sets to highlight where someone may not be eligible for the benefits they are being paid” and share relevant information with the Department for Work and Pensions (DWP).

This appears to be a new approach to the broad powers which would have been conferred on the DWP under clause 131 and schedule 11 of the shelved Data Protection and Digital Information Bill. Under those provisions the DWP would have been able to require banks and financial institutions to give general access to customer accounts (rather than access on a targeted basis) for the purpose of identifying benefit fraud. Although the proposed powers were subject to a fair amount of criticism on the grounds of disproportionality, they remained in the final version of the bill, which would almost certainly have been enacted if Mr Sunak had called a later election.

The DWP Fraud, Error and Debt Bill (which has not yet been introduced into Parliament but will be this session – so probably by Spring 2025) will propose an “Eligibility Verification measure” which, in figurative terms, will result in supply-side snooping on accounts (i.e. by banks themselves) rather than the demand-side snooping the previous bill would have introduced.

We will have to wait for the details, but one thing is certain – this will require a lot of algorithmic automation, no doubt AI-driven, and the potential for errors will need to be addressed and mitigated.

It will also, surely, be a significant cost burden on banks and financial institutions. Whilst it’s generally hard to muster much sympathy in those circumstances, here we must consider the risk that the lowest-cost, highest-efficiency models which will be adopted may be the least protective of customers’ banking privacy and data protection rights.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, data sharing, DWP, Privacy

Regulators’ powers to get information from public authorities

[reposted from LinkedIn]

“The contents of this judgment stand as a salutary warning to local authorities and to other public bodies concerned with fitness to practise in occupations concerned with or touching on the welfare of children…I hope what took place in this case will not happen again.”

These are the concluding words of General Dental Council v KK & Anor [2024] EWHC 3053 (Fam), a stinging judgment of Mrs Justice Knowles in the Family Court.

During the course of a fitness to practise (FTP) investigation involving “KK” the General Dental Council (GDC) were informed by the police that a local authority, Stockport Council (SMBC), held information relating to public law proceedings involving the care of children. Under s12(1)(a) of the Administration of Justice Act 1960 (AJA) it is criminal contempt to disclose much of the information in relation to such proceedings. Notwithstanding this (and apparently ignorant of it) the GDC made a request for information to SMBC, alluding to its powers to require supply of information under s33B(2) of the Dentists Act 1984 (DA). SMBC subsequently disclosed a raft of information to the GDC, apparently under the belief that s33B(2) compelled them to do so.

However, s33B(3) of the DA states that “nothing in this section shall require or permit any disclosure of information which is prohibited by any relevant enactment”: section 12(1)(a) of the AJA clearly was a “relevant enactment” prohibiting the disclosure.

The upshot was an unholy mess: the information disclosed was used by the GDC in the FTP proceedings and it was almost four years into those protracted proceedings before the issue of the unlawfully disclosed information came to light. An application by GDC to the Family Court for an order for (lawful) disclosure was made, and Knowles J indicated during the initial hearing that contempt proceedings would not be necessary or proportionate “subject to the Court being satisfied that all unauthorised material disclosed to the GDC had been deleted from its server and was no longer in its possession”. By that point though, in the FTP proceedings, there had been (deep breath) “11 interim order hearings; 4 extensions…in the High Court; 3 substantive hearings before the Professional Conduct Committee and 6 preliminary hearings before the Practice Committee…and at least 149 individuals needed to be contacted to ascertain whether they held unauthorised disclosure arising from their work on behalf of the GDC.” The deletion process took approximately six months.

Ultimately, appropriate, restricted and lawful disclosure by SMBC was ordered. Contempt proceedings against GDC and SMBC (and individuals) were not necessary.

GDC and SMBC jointly have to meet KK’s costs, and – although the judgment records that both have since initiated training/protocols etc to prevent any recurrence – in the words of the judge, “both have been shamed by what occurred”.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under access to information, data sharing, judgments, regulatory law

John Edwards’s evidence to the Angiolini inquiry

On 29 February Lady Elish Angiolini published the first report from her inquiry into how off-duty Metropolitan police officer Wayne Couzens was able to abduct, rape and murder Sarah Everard.

Information Commissioner John Edwards contributed to the inquiry, and his evidence is cited at 4.320 (the paragraph is quoted below). It deals with the profoundly important (and perennially misunderstood) issue of data-sharing within and between police forces.

Although for obvious reasons the identity and content of some witness evidence to the inquiry is being kept anonymous, there is no obvious reason for Mr Edwards’s evidence to be, and I hope that the Information Commissioner’s Office will, in addition to publishing his press statement, also publish any written evidence he submitted. It would also be good to know the details of the work Mr Edwards says his office is doing, and continuing, with the police in this context.

In discussions with senior leaders of relevant organisations, the Inquiry was told that gaps in information-sharing between human resources, recruitment, professional standards and vetting teams – and, indeed, between forces themselves – were a significant barrier to capturing a clear picture of officers. The Inquiry heard from different sources, including senior leaders, that there are significant barriers to information-sharing. Some cite data privacy and protection laws as a reason not to share information. However, in a discussion with the Information Commissioner, John Edwards, the Inquiry was assured that data protection law recognises that there are legitimate reasons for information-sharing, particularly given the powers attributed to police officers. Indeed, Mr Edwards suggested that data protection law is widely misunderstood and misconstrued, and highlighted a failure of training in this regard.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under access to information, Data Protection, data sharing, Information Commissioner, police

How did George Galloway come to send different canvassing info to different electors?

As electors went to the polls in the Rochdale by-election on 29 February, a few posts were made on social media showing the disparity between letters sent to different electors by candidate George Galloway. An example is here.

On the face of it, Galloway appears to have hoped to persuade Muslim voters to vote for him based on his views on a topic or topics he felt would appeal to them, and others to vote for him based on his views on different topics.

It should be stressed that there is nothing at all wrong with that in principle.

What interests me is how Galloway identified which elector to send which letter to.

It is quite possible that a candidate might identify specific roads which were likely to contain properties with Muslim residents. And that, also, would not be wrong.

But an alternative possibility is that a candidate with access to the full electoral register might seek to identify individual electors, and infer their ethnicity and religion from their names. A candidate who did this would be processing special categories of personal data, and (to the extent any form of automated processing was involved) profiling them on that basis.

Article 9(1) of the UK GDPR introduces a general prohibition on the processing of special categories of personal data, which can only be set aside if one of the conditions in Article 9(2) is met. None of these immediately would seem available to a candidate who processes religious and/or ethnic origin data for the purposes of sending targeted electoral post. Article 9(2)(g) provides a condition for processing necessary for reasons of substantial public interest, and Schedule One to the Data Protection Act 2018 gives specific examples, but, again, none of these would seem to be available: paragraph 22 of the Schedule permits such processing by a candidate where it is of “personal data revealing political opinions”, but there is no similar condition dealing with religious or ethnic origin personal data.

If such processing took place in contravention of the prohibition in Article 9, it would be likely to be a serious infringement of a candidate’s obligations under data protection law, potentially attracting regulatory enforcement from the Information Commissioner, and exposure to the risk of complaints or legal claims from electors.

To be clear, I am not saying that I know how Galloway came to send different letters to different electors, and I’m not accusing him of contravening data protection law. But it strikes me as an issue the Information Commissioner might want to look into.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under access to information, Data Protection, Data Protection Act 2018, data sharing, Information Commissioner, political parties, UK GDPR

New Model Clauses – a Mishcon podcast

My colleagues, partners Adam Rose and Ashley Winton, discuss the new European Commission Standard Contractual Clauses announced on 4 June 2021. I honestly can’t think of two better people to discuss what they mean.

Initial Reactions: New Standard Contractual Clauses (mishcon.com)


Filed under adequacy, Brexit, consistency, Data Protection, data sharing, EDPB, Europe, GDPR, international transfers, Schrems II

ICO not compliant with post-Schrems II data protection law?

In which I finally receive a reply to my complaint about ICO’s Facebook page.

The issue of the transfer of personal data to the US has been the subject of much debate and much litigation. In 2015 the Court of Justice of the European Union (CJEU) struck down one of the then key legal mechanisms (“Safe Harbor”) for doing so. And in 2020 the CJEU did so with its successor, “Privacy Shield”. Both cases were initiated by complaints by lawyer and activist Max Schrems, and focused on the transfer of data from the EU to the US by Facebook.

Put simply, European data protection law, in the form of the GDPR, and (as we must now talk about the UK in separate terms) UK data protection law, in the form of the UK GDPR, outlaw the transfer of personal data to the US (or any other third country) unless the level of protection the data would receive in the EU, or the UK, is “not undermined” (see Chapter V of, and recital 101 to, the GDPR/UK GDPR).

In “Schrems II” – the 2020 case – the CJEU not only struck down Privacy Shield, it also effectively laid down rules which needed to be followed if alternative mechanisms, for instance “standard contractual clauses”, were to be used for transfers of personal data. Following the judgment, the European Data Protection Board (EDPB) issued guidance in the form of FAQs, which recommended an “assessment, taking into account the circumstances of the transfers, and supplementary measures you could put in place”. The EDPB guidance was subsequently endorsed by the UK’s own Information Commissioner’s Office (ICO):

The EDPB has recommended that you must conduct a risk assessment as to whether SCCs provide enough protection within the local legal framework, whether the transfer is to the US or elsewhere

What struck me as odd in all this is that the ICO themselves have a Facebook page. Given that Facebook’s own data governance arrangements involve the transfer of EU and UK users’ data to the US, and given that the ICO don’t just operate their page as a newsletter, but actively encourage users to comment and interact on it, it seemed to me that the ICO were enabling the transfer of personal data by Facebook to the US. Further, a previous CJEU judgment has made clear that operators of corporate Facebook pages may well function as controllers under the GDPR/UK GDPR where they set parameters on the page. The Wirtschaftsakademie case held that – in the case of someone operating a “fan page” –

While the mere fact of making use of a social network such as Facebook does not make a Facebook user a controller jointly responsible for the processing of personal data by that network, it must be stated, on the other hand, that the administrator of a fan page hosted on Facebook, by creating such a page, gives Facebook the opportunity to place cookies on the computer or other device of a person visiting its fan page, whether or not that person has a Facebook account.

By extension, it seemed to me, the ICO were in this position with their page.

So I put the point to them. After four months, and some chasing, I received a reply which not only confirmed my understanding that they are, and accept that they are, a controller, but also revealed that, nearly a year on from the Schrems II decision, they have not finished reviewing their position and have not updated their privacy notice to reflect their controller status in respect of their Facebook processing. (They also say that their legal basis for processing is “Article 6 (1) (e) of UK GDPR, public task” because “as a regulator we have a responsibility to promote good practice and engage with the public at large about data protection issues via commonly used platforms”, but I’d observe that they fail to give any attention to the proportionality test that reliance on this condition requires, and fail to point to the justification in domestic law, as required by Article 6.)

What the ICO response doesn’t do is actually respond to me as a data subject in respect of my complaint, or explain how they are complying with the international data transfer provisions of Chapter V of the GDPR/UK GDPR, and whether they have conducted any sort of transfer impact assessment (one presumes not).

As I said in my original complaint to ICO, I am aware that I might be seen as being mischievous, and I’m also aware I might be seen as having walked ICO into a trap. Maybe I am, and maybe I have, but there’s also a very serious point to be made. The cost to UK business of the Schrems II decision has been enormous, in terms of the legal advice sought, the internal governance reviews and risk assessments undertaken, and the negotiation or novation of contracts. At the same time the business and legal uncertainty is significant, with many wondering about their exposure to legal claims but also (and especially) to regulatory enforcement. If, though, the regulator is not complying with the relevant law, ten months on from the judgment (and five months on from my raising it with them as a concern) then what are controllers meant to do? And where do they turn to for guidance on the regulatory approach?

THE ICO RESPONSE

Firstly, it may be helpful to explain that following the findings of the CJEU in Wirtschaftsakademie, we started a review of the transparency information we provide to visitors of the page. The review was delayed when the Schrems II decision was issued as we needed to consider the impact of the judgment on any transfer element to the US.

We agree that as the Facebook page administrator, we are processing personal data of the visitors of our page and therefore we are controllers for this information. We process the names of the users as they appear on their Facebook profiles and any personal data they may share through their comments on our posts or via messages to us. We process this information in reliance on Article 6 (1) (e) of UK GDPR, public task. We consider that, as a regulator we have a responsibility to promote good practice and engage with the public at large about data protection issues via commonly used platforms.

For the cookies and similar technologies, Facebook is responsible for setting the cookies, when you visit our Facebook page.

We also receive anonymous information from Facebook in the form of aggregate statistics of all those who visit our page, regardless of whether they have a Facebook account or not. In line with the findings of the CJEU in Wirtschaftsakademie we are joint controllers with Facebook for this information. We process this information under Article 6 (1) (e) as well. The Insights include information on page viewings, likes, sharing of posts, age range, the device used and how it was accessed and breakdown of demographics. All Insights are received from Facebook by the ICO in aggregate format. Our PN will be updated shortly to reflect the above information.

Like other regulators, the ICO is currently reviewing its position on international transfers following the judgment in Schrems II. As part of that review, it will, amongst other things, consider the questions that you have raised about the ICO’s use of Facebook. The ICO intends to publish its guidance on how UK organisations should address the question of international transfers, in due course, and will act in accordance with its guidance. That work is still in progress, and it will be published in due course.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under adequacy, data sharing, EDPB, facebook, GDPR, Information Commissioner, international transfers, privacy notice, privacy shield, safe harbor, Schrems II, UK GDPR

An Uber-reaction in The Times

“Uber gives police private data on drivers and passengers” announces The Times(£) this morning.

In this post, much to my surprise (I have never taken an Uber, and don’t intend to – I don’t like their business model), I come to the defence of Uber.

A closer read of the Times piece reveals that what is being referred to, in documents filed with the High Court in proceedings regarding TfL’s refusal to renew Uber’s licence, are requests from the police to Uber to disclose personal data for the purposes of the prevention and detection of crime or the apprehension or prosecution of offenders.

Such requests are commonly made to thousands of public authorities and private companies. They used to be known in data protection and police circles as “section 29 requests”, after the relevant section of the now-repealed Data Protection Act 1998. The term was a bit misleading: section 29, now effectively replaced by paragraph 2 of Schedule 2 to the Data Protection Act 2018, disapplies the provisions of data protection law which would otherwise prevent the disclosure of personal data to the police (or others), where not disclosing would be likely to prejudice the purposes of the prevention and detection of crime or the apprehension or prosecution of offenders. This is a necessary provision of data protection law, and provided that (as with all provisions) it is applied correctly and proportionately, it works very well: it gives controllers the power to disclose personal data to the police where it is necessary for criminal justice.

If Uber are dealing with police requests appropriately, it is for the public good that personal data which assists the police to investigate drug transporting and human trafficking is made available to them.

In fact, I strongly suspect that The Times will receive such requests from the police. When the requests are related to the paper’s journalistic activities they are probably, and probably rightfully, refused, but they may well get requests in respect of their employees’ data, and I would be very surprised if they don’t sometimes – as a responsible company – comply with these.

Transport for London certainly receives such requests. Indeed, as a public authority, under its transparency measures, it has habitually made statistics on this public. The most recent publication I can find shows that from 2012 to 2017 TfL received an average of approximately 10,000 requests each year.

Will The Times now report that TfL is handing over to the police thousands of pieces of intelligence on members of the public each year?


Filed under Data Protection, Data Protection Act 2018, data sharing, police