Tag Archives: GDPR

Oral disclosure of personal data: a new domestic case

“Pretexting” and “blagging” are forms of social engineering whereby someone attempts to extract information from a source by deception. One (unethical) example is when a journalist purports to be someone else in order to gather information for a story.

A recent misuse of private information and data protection judgment in the High Court deals with a different, and sadly not uncommon, example – where an estranged, abusive partner convinces a third party to give out information about the victim so that the harassment can continue.

The claimant had worked at a JD Wetherspoon pub, but had left a few months previously. She had given her contact details, including her mother’s mobile phone number, to her manager, and the details were kept in a paper file, marked “Strictly Private and Confidential”, in a locked filing cabinet. During her employment she had been the victim of offences of serious violence and harassment by a former partner, which involved subjecting her to many unwanted phone calls. He was ultimately convicted of these offences and sentenced to 2 ½ years in prison. Her employer was aware of the claimant’s concerns about him.

While her abuser was on remand, he rang the pub, pretending to be a police officer who needed to contact the claimant urgently. Although the pub chain had guidance on pretexting, under which such attempts to acquire information should be declined initially and referred to head office, the pub gave out the claimant’s mother’s number to the abuser, who then managed to speak to (and verbally abuse) the claimant, causing understandable distress.

She brought claims in the county court in misuse of private information, breach of confidence and for breach of data protection law. She succeeded at first instance with the first two, but not with the data protection claim. Wetherspoons appealed and she cross-challenged, not by appeal but by way of a respondent’s notice, the rejection of the data protection claim.

In a well-reasoned judgment in Raine v JD Wetherspoon PLC [2025] EWHC 1593 (KB), Mr Justice Bright dismissed the defendant’s appeals. He rejected their argument that the Claimant’s mother’s mobile phone number did not constitute the Claimant’s information or alternatively that it was not information in which she had a reasonable expectation of privacy: it was not ownership of the mobile phone that mattered, nor ownership of the account relating to it – what was relevant was information: the knowledge of the relevant digits. As between the claimant and the defendant, that was the claimant’s information, which was undoubtedly private when given to the defendants and was intended to remain private, rather than being published to others.

The defendant then argued that there can be no cause of action for misuse of private information if the Claimant is unable to establish a claim under the DPA/GDPR, and, relatedly, that a data security duty could not arise under the scope of the tortious cause of action of misuse of private information. In all honesty I struggle to understand this argument, at least as articulated in the judgment, probably because, as the judge suggests, this was not a data security case involving failure to take measures to secure the information. Rather, it involved a positive act of misuse: the positive disclosure of the information by the defendant to the abuser.

The broadly similar appeal grounds in relation to breach of confidence failed, for broadly similar reasons.

The counter challenge to the prior dismissal of the data protection claim, by contrast, succeeded. At first instance, the recorder had accepted the defendant’s argument that this was a case of purely oral disclosure of information, and that, applying Scott v LGBT Foundation Limited, this was not “processing” of “personal data”. However, as the judge found, in Scott,

the information had only ever been provided to the defendant orally; and…then retained not in electronic or manual form in a filing system, but only in the memory of the individual who had received the original oral disclosure…In that case, there was no record, and no processing. Here, there was a record of the relevant information, and it was processed: the personnel file was accessed by [the defendant’s employee], the relevant information was extracted by her and provided in written form to [another employee], for him to communicate to [the abuser].

This fell “squarely within the definition of ‘processing’ in the GDPR at article 4(2)”. Furthermore, there was judicial authority in Holyoake v Candy that, in some circumstances, oral disclosure will constitute processing (a view supported by the European Court in Endemol Shine Finland Oy).

Damages for personal injury, in the form of exacerbation of existing psychological damage, of £4500 were upheld.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Breach of confidence, Data Protection, data sharing, GDPR, judgments, misuse of private information, Oral disclosure

Consent is not the only basis

In 2017 I attended a free event run by a “GDPR consultancy”. The presenter confidently told us that we were going to have to get consent from customers in order to process their personal data. One attendee said they worked at the DWP, so how were they going to get consent from benefits claimants who didn’t want to disclose their income, to which the presenter rather awkwardly said “I think that’s one you’ll have to discuss with your lawyers”. Another attendee, who was now most irritated that he’d taken time out from work for this, could hold his thoughts in no longer, and rudely announced that this was complete nonsense.

That attendee was the – much ruder in those days – 2017 version of me.

I never imagined (although I probably should have done) that eight years on the same nonsense would still be spouted.

Just as the Data Protection Act 2018 did not implement the GDPR in the UK (despite the embarrassing government page that, until recently, and despite people raising it countless times, said so), and just as the GDPR does not limit its protections to “EU citizens”, so the GDPR and the UK GDPR do not require consent for all processing.

Anyone who says so has not applied a smidgeon of thought or research to the question, and is probably taking content from generative AI, which, on the time-honoured principle of garbage-in, garbage-out, has been in part trained on the existing nonsense. To realise why it’s garbage, they should just start with the DWP example above and work outwards from there.

Consent is one of the six lawful bases, any one or more of which can justify processing. No one basis is better than, or takes precedence over, the others.

To those who know this, I apologise for having to write it down, but I want to have a sign to tap for any time I see someone amplifying the garbage on LinkedIn.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Data Protection Act 2018, DWP, GDPR, Let's Blame Data Protection, UK GDPR

Cookies, compliance and individuated consent

[reposted from my LinkedIn account]

Much will be written about the recent High Court judgment on cookies, direct marketing and consent, in RTM v Bonne Terre & Anor, but treat it all (including, of course, this) with caution.

This was a damages claim by a person with a gambling disorder. The claim was, in terms, that the defendant’s tracking of his online activities, and associated serving of direct marketing, were unlawful, because they lacked his operative consent, and they led to damage because they caused him to gamble well beyond his means. The judgment was only on liability, and at the time of writing this post there has been no ruling on remedy, or quantum of damages.

The domestic courts are not regulators – they decide individual cases, and where a damages claim is made by an individual any judicial analysis is likely to be highly fact specific. That is certainly the case here, and paragraphs 179-181 are key:

such points of criticism as can be made of [the defendant’s] privacy policies and consenting mechanisms…are not made wholesale or in a vacuum. Nor are they concerned with any broader question about best practice at the time, nor with the wisdom of relying on this evidential base in general for the presence of the consents in turn relied on for the lawfulness of the processing undertaken. Such general matters are the proper domain of the regulators.

In this case, the defendant could not defeat a challenge that in the case of this claimant its policies and consenting mechanisms were insufficient:

If challenged by an individual data subject, a data controller has to be able to demonstrate the consenting it relies on in a particular case. And if that challenge is put in front of a court, a court must decide on the balance of probabilities, and within the full factual matrix placed before it, whether the data controller had a lawful consent basis for processing the data in question or not.

Does this mean that a controller has to get some sort of separate, individuated consent for every data subject? Of course not: but that does not mean that a controller whose policies and consenting mechanisms are adequate in the vast majority of cases is fully insulated from a specific challenge from someone who could not give operative consent:

In the overwhelming majority of cases – perhaps nearly always – a data controller providing careful consenting mechanisms and good quality, accessible, privacy information will not face a consent challenge. Such data controllers will have equipped almost all of their data subjects to make autonomous decisions about the consents they give and to take such control as they wish of their personal data…But all of that is consistent with an ineradicable minimum of cases where the best processes and the most robust evidential provisions do not, in fact, establish the necessary presence of autonomous decision-making, because there is specific evidence to the contrary.

This is, one feels, correct as a matter of law, but it is hardly a happy situation for those tasked with assessing legal risk.

And the judgment should (but of course won’t) silence those who promise, or announce, “full compliance” with data protection and electronic marketing law.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under adtech, consent, cookies, Data Protection, GDPR, judgments, marketing, PECR, Uncategorized

Crowdstrike and personal data breaches: loss vs unavailability

I ran a poll on LinkedIn in recent days which asked “If a controller temporarily can’t access personal data on its systems because of the Crowdstrike/MSFT incident is it a personal data breach?” 

I worded the question carefully.

50% of the 100-odd people who voted said “no” and 50% said “yes”. The latter group are wrong. I say this with some trepidation because there are people in that group whose opinion I greatly respect. 

But here’s why they, and, indeed, the Information Commissioner’s Office and the European Data Protection Board, are wrong.

Article 4(12) of the GDPR/UK GDPR defines a “personal data breach”. This means that it is a thing in itself. And that is why I try always to use the full term, or abbreviate it, as I will here, to “PDB”. 

This is about the law, and in law, words are important. To refer to a PDB as the single word “breach” is a potential cause of confusion, and both the ICO and the EDPB guidance are infected by and diminished by sloppy conflation of the terms “personal data breach” and “breach”. In English, at least, and in English law, the word “breach” will often be used to refer to a contravention of a legal obligation: a “breach of the law”. (And in information security terminology, a “breach” is generally used to refer to any sort of security breach.) But a “breach” is not coterminous with a “personal data breach”.

And a PDB is not a breach of the law: it is a neutral thing. It is also crucial to note that nowhere do the GDPR/UK GDPR say that there is an obligation on a person (whether controller or processor) not to experience a PDB, and nowhere do GDPR/UK GDPR create liability for failing to prevent one occurring. This does not mean that where a PDB has occurred because of an infringement of other provisions which do create obligations and do confer liability (primarily Article 5(1)(f) and Article 32) there is no potential liability. But not every PDB arises from an infringement of those provisions.

The Article 4(12) definition is “a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed”. Let us break that down:

  • A breach of security…
  • leading to [one or more of]
  • accidental or unlawful…
  • 1. destruction of…
  • 2. loss of…
  • 3. alteration of…
  • 4. unauthorised disclosure of…
  • 5. unauthorised access to…
  • personal data processed.

If an incident is not a breach of security, then it’s not a PDB. And if it is a breach of security but doesn’t involve personal data, it’s not a PDB. But even if it is a breach of security, and involves personal data, it’s only a PDB if one of the eventualities I’ve numbered 1 to 5 occurs.

Note that nowhere in 1 to 5 is there “unavailability of…” or “loss of access to…”. 
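Purely by way of illustration (and emphatically not as legal advice), the three-limb structure set out above can be sketched as a toy decision function. The names and structure here are mine, not the GDPR’s:

```python
# Illustrative sketch only: a toy encoding of the Article 4(12) structure
# discussed above. All names here are my own, not taken from the GDPR text.

# The five eventualities listed in the definition; note that
# "unavailability" is (deliberately, on my reading) absent.
PDB_EVENTUALITIES = {
    "destruction",
    "loss",
    "alteration",
    "unauthorised_disclosure",
    "unauthorised_access",
}

def is_personal_data_breach(breach_of_security: bool,
                            involves_personal_data: bool,
                            eventualities: set[str]) -> bool:
    """Return True only if all three limbs of the definition are met."""
    if not breach_of_security:
        return False  # not a breach of security: not a PDB
    if not involves_personal_data:
        return False  # no personal data involved: not a PDB
    # At least one of the five listed eventualities must occur
    return bool(eventualities & PDB_EVENTUALITIES)

# On the argument in this post, temporary unavailability is not "loss":
print(is_personal_data_breach(True, True, {"unavailability"}))  # False
print(is_personal_data_breach(True, True, {"loss"}))            # True
```

The point of the sketch is simply that the definition is conjunctive: fail any limb and, whatever harm has occurred, the incident is not a PDB.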

Now, both the ICO and the EDPB read into the words “loss of…personal data…” the meaning, or potential meaning, “loss of availability of personal data”. But in both cases they appear to do so in the context of saying, in terms, “loss of availability is Article 4(12) ‘loss’ because it can cause harm to data subjects”. I don’t dispute, and nor will many millions of people affected by the Crowdstrike incident, that unavailability of personal data can cause harm. But to me, “loss” means loss: I had something, and I no longer have it. I believe that that is how a judge in the courts of England and Wales would read the plain words of Article 4(12), and decide that if the legislator had intended “loss” to mean something more than the plain meaning of “loss” – so that it included a meaning of “temporary lack of access to” – then the legislator would have said so.

Quite frankly, I believe the ICO and EDPB guidance are reading into the plain wording of the law a meaning which they would like to see, and they are straining that plain wording beyond what is permissible.

The reason, of course, that this has some importance is that Article 33 of the GDPR/UK GDPR provides that “in the case of” (note the neutral, “passive” language) a PDB, a controller must in general make a notification to the supervisory authority (which, in the UK, is the ICO), and Article 34 provides that where a PDB is likely to result in a high risk to the rights and freedoms of natural persons, those persons should be notified. If a PDB has not occurred, no obligation to make such notifications arises. That does not mean, of course, that notifications cannot be made, through an exercise of discretion (let’s forget for the time being – because they silently resiled from the point – that the ICO once bizarrely and cruelly suggested that unnecessary Article 33 notifications might be a contravention of the GDPR accountability principle).
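Again purely as an illustration of the logic (not legal advice, and the naming is mine), the notification consequences can be expressed as a short conditional chain:

```python
# Illustrative sketch only: the Article 33/34 notification logic described
# above, in my own toy encoding. Simplified: it ignores, for instance, the
# Article 33 carve-out where a PDB is "unlikely to result in a risk".

def notification_obligations(is_pdb: bool,
                             high_risk_to_individuals: bool) -> list[str]:
    """Which notification duties are engaged under the GDPR/UK GDPR?"""
    duties = []
    if not is_pdb:
        # No personal data breach: neither Article 33 nor Article 34 bites
        # (though a controller may still choose to notify as a matter of
        # discretion).
        return duties
    duties.append("Article 33: notify the supervisory authority")
    if high_risk_to_individuals:
        duties.append("Article 34: notify the affected data subjects")
    return duties

print(notification_obligations(False, True))  # []
print(notification_obligations(True, True))
```

Everything hangs off the first condition: if the incident does not meet the Article 4(12) definition, the notification questions simply never arise.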

It might well be that the actions or omissions leading to a PDB would constitute an infringement of Articles 5(1)(f) and 32, but if an incident does not meet the definition in Article 4(12), then it’s not a PDB, and no notification obligation arises. (Note that this is an analysis of the position under the GDPR/UK GDPR – I am not dealing with whether notification obligations to any other regulator arise.)

I can’t pretend I’m wholly comfortable saying to 50% of the data protection community, and to the ICO and EDPB, that they’re wrong on this point, but I’m comfortable that I have a good arguable position, and that it’s one that a judge would, on balance, agree with.

If I’m right, maybe the legislator of the GDPR/UK GDPR missed something, and maybe availability issues should be contained within the Article 4(12) definition. If so, there’s nothing to stop both the UK and the EU legislators amending Article 4(12) accordingly. And if I’m wrong, there’s nothing to stop them amending it to make it more clear. In the UK, in particular, with a new, energised government, a new Minister for Data Protection, and a legislative agenda that will include bills dealing with data issues, this would be relatively straightforward. Let’s see.

And I would not criticise any controller which decided it was appropriate to make an Article 33 notification. It might, on balance, be prudent for some affected controllers to do so. The 50/50 split on my poll indicates the level of uncertainty on the part of the profession. One also suspects that the ICO and the EU supervisory authorities might get a lot of precautionary notifications.

Heck, I’ll say it – if anyone wants to instruct me and my firm to advise, both on law and on legal strategy – we would of course be delighted to do so.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, EDPB, GDPR, Information Commissioner, Let's Blame Data Protection, LinkedIn Post, personal data breach, UK GDPR

Disastrous data protection advice in child protection proceedings

I am only going to link to the recent Family Court judgment at the foot of this post: it is long, contains distressing and graphic references to alleged sexual offences (and to how a school and a local authority dealt with the allegations), and deals only in passing with the issue I raise in this post. Please be aware of that.

However, the issue is of real importance.

The reason for referring to it is the extraordinary, and extraordinarily worrying, references in the judgment to a discussion a deputy head teacher had with the nine year old child in question. The judgment records the teacher’s evidence that, although

she took notes of the discussion she destroyed any notes that she had made. This appeared to be in accordance with a school-wide misunderstanding of data protection guidance. She fairly admitted that after a year she could only guess at those notes now

The judge stresses that she

“[does] not criticise GG – she was a caring and conscientious teacher who was doing her best and believed she was following advice and good practice. She lacked specialist training and some of the advice was unhelpful. I have carefully considered the problems with her record of this discussion, and I am mindful that these challenges add to the difficulty of appraising the reliability of what she recorded.”

[nb, this was said not solely in the context of the destruction of the notes]

The London Borough involved recognised, during the course of the proceedings, “the importance of addressing a wide range of gaps and concerns that emerged during the course of this hearing”, and the judge invited the parties to draw up an agreed list of issues for the Council to consider and provide a response to as a positive problem-solving exercise. Among these agreed issues was this

“Contemporaneous notes need to be taken when a child makes any allegation of physical, sexual or emotional abuse against a third party…. It needs to be made clear within the policy that contemporaneous notes ought to be kept and stored securely (electronically if possible). This includes any handwritten notes even if, only key words are noted down and later entered onto any electronic system. THIS DOES NOT INFRINGE GDPR.”

Those final words resound, even if they shouldn’t need saying.

Prior to GDPR, there were certainly a multitude of misunderstandings about data protection, but the idea that personal data should not be recorded, or should be quickly destroyed, is one of the most pernicious of misunderstandings that seems to have emerged since GDPR – in part from terrible advice and training given by people who shouldn’t have ever been engaged to train the public sector. I implore those involved in training and advising in these complex areas of social care and education to consider the import and impact of the advice they give.

Finally, the importance and meaning of the first word of the third data protection principle is often overlooked. Yes, it’s the “data minimisation” principle, but personal data must still be adequate.

This is the judgment.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, GDPR, local government, retention, UK GDPR

UK GDPR amended

Three years ago, at the end of the Brexit Implementation Period, I helped prepare a version of the UK GDPR for the Mishcon de Reya website. At the time, it was difficult to find a consolidated version of the instrument, and the idea was to offer a user-friendly version showing the changes made to the retained version of the GDPR, as modified by the Data Protection, Privacy and Electronic Communications (Amendments Etc.) (EU Exit) Regulations 2019, and the Data Protection, Privacy and Electronic Communications (Amendments Etc.) (EU Exit) Regulations 2020.

Since then, the main legislation.gov.uk site has offered a version. However, with respect to that site, it’s not always the easiest to use.

The burden of updating our pages as and when the UK GDPR itself gets amended now falls, though, to me and Mishcon. Major changes are likely to be made when the Data Protection and Digital Information Bill gets enacted, but, first, we have the minor amendments (minor in number, if not in significance) effected by The Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023 (which came into force at 23:59:59 on 31.12.23).

The changes have been made to Articles 1, 4, 9, 50, 85 and 86.

The Mishcon pages have been very well used, and we’ve had some great feedback on them. They don’t profess to be an authoritative version (and certainly should not be relied on as such) but we hope they’ll continue to be a useful resource.


Filed under Data Protection, GDPR, UK GDPR

I was stupid

I was stupid, I was naive: I thought that recent statements from senior people at the Information Commissioner’s Office (ICO) indicated a willingness to enforce against non-compliance in the use of cookies and cookie banners.

I was wrong. My recent complaint, published as an open letter to John Edwards, the Commissioner, not only took ten weeks to be allocated to a case worker, but, now, that case worker has told me, in terms, that they’re not interested:

we do not respond to cookie complaints individually…Our approach is to focus on sites that are doing nothing to raise awareness of cookies, or get their users’ consent, particularly those visited most in the UK. When consumers raise their complaints with us, we either conduct our own compliance check or write to the organisation.

This leaves three things hanging: 1) the site I complained about is one of the most visited in the UK; 2) the website in question arguably “raises awareness” of cookies, but only insofar as it confounds, frustrates and obstructs the user, in a manner which, in my submission, contravenes ePrivacy and data protection law; and 3) it fails to get users’ consent (as it is defined in those laws).

MLex(£) have now written about this, and have secured a quote from the ICO, which is more than I got, really:

It is an ICO priority to influence changes to online tracking practices to create a more privacy-oriented internet. Where users want personalized adverts they should have the choice to receive them. But where websites don’t give people fair choices over how their data is used we will take action to safeguard their rights.

Try as I might, I can’t square that, and the ICO’s previous public statements about taking firm action, with an approach which fails in any real way to engage with people who take the time and effort to make complaints. But, as I say, I was stupid and naive to think it might have been different.

I’ve now complained, in turn, about the ICO’s handling of my complaint (and made an FOI request), in these terms:

1. I made a complaint under Article 77 UK GDPR. You have not investigated that at all, let alone “to the extent appropriate” as you are required to do under Article 57(1)(f). 

2. My letter was addressed to John Edwards. Has he seen it? 

3. You say, “When consumers raise their complaints with us, we either conduct our own compliance check or write to the organisation.” Which have you done here? Please disclose information either in respect of the compliance check you undertook, or of the correspondence you sent to Associated Newspapers Ltd.

4. Frankly, your response is discourteous. I went to some effort to assist the ICO in its stated intention to investigate poor compliance with PECR, but your response gives no indication that you’ve even read the substance of my complaint.

5. Your letter contains no apology or explanation for the extensive delay in handling it, which falls outside your own service standards.

In seriousness, I find this all really disheartening. The gulf between what the ICO says and what it does is sometimes huge, and not necessarily appreciated by those who don’t work in the field.

But I will get back in my stupid box.

+++

For completeness’ sake, the full response from the caseworker was:

Thank you for your correspondence in which you have complained about Associated Newspapers Ltd and its use of cookies.

Complaints regarding cookies can be submitted to us through the following link: Cookies | ICO

In this case, I have forwarded the information you have provided to the appropriate department. Although we do not respond to cookie complaints individually, we use the information you send us to help us identify, investigate and take action against organisations causing you complaint. To do this, we work alongside other organisations and website owners.

Our approach is to focus on sites that are doing nothing to raise awareness of cookies, or get their users’ consent, particularly those visited most in the UK. When consumers raise their complaints with us, we either conduct our own compliance check or write to the organisation. Our website provides further information about the action we’re taking on cookies.

Yours sincerely

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under adtech, consent, cookies, Information Commissioner, PECR, UK GDPR

Has the Information Commissioner’s Office lost its FOI purposes?

When Parliament passed the Data Protection Act 1984 it created the role of a regulator for that new data protection law. Section 3(1)(a) said that

For the purposes of this Act there shall be…an officer known as the Data Protection Registrar

The office remained in this form until the passing of the Data Protection Act 1998, section 6(1) of which provided that

The office originally established by section 3(1)(a) of the Data Protection Act 1984 as the office of Data Protection Registrar shall continue to exist for the purposes of this Act but shall be known as the office of Data Protection Commissioner

The advent of the Freedom of Information Act 2000 necessitated a change, so as to create a role of regulator for that Act. Paragraph 13(2) of Schedule 2 to the Freedom of Information Act 2000 amended section 6(1) of the Data Protection Act 1998 so it read

For the purposes of this Act and of the Freedom of Information Act 2000 there shall be an officer known as the Information Commissioner

So, at this point, and indeed, until 25 May 2018, there was an Information Commissioner “for the purposes of” the Data Protection Act 1998, and “for the purposes of” the Freedom of Information Act 2000.

25 May 2018 marked, of course, the date from which (by effect of its Article 99) Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, or “GDPR”, applied.

Also on 25 May 2018, by effect of the Data Protection Act 2018 (Commencement No. 1 and Transitional and Saving Provisions) Regulations 2018, section 114 of the Data Protection Act 2018 commenced. This provided (and provides)

There is to continue to be an Information Commissioner.

However, paragraph 44 of schedule 19 to the Data Protection Act 2018 (commenced also by effect of the Data Protection Act 2018 (Commencement No. 1 and Transitional and Saving Provisions) Regulations 2018) repealed the “FOIA purpose” provisions of section 6(1) of the Data Protection Act 1998 (which, to recall, said that “for the purposes of…the Freedom of Information Act 2000 there shall be an officer known as the Information Commissioner”). At the same time, paragraph 59 of schedule 19 to the Data Protection Act 2018 repealed section 18(1) (which had provided that “The Data Protection Commissioner shall be known instead as the Information Commissioner”).

So, the Information Commissioner is no longer described, in statute, as an officer which shall be for the purposes of the Freedom of Information Act 2000.

Probably nothing turns on this. Elsewhere in the Freedom of Information Act 2000 it is clear that the Information Commissioner has various functions, powers and duties, which are not removed by the repeal (and subsequent absence) of the “FOIA purpose” provisions. However, the repeal (and absence) do raise some interesting questions. If Parliament thought it right previously to say that, for the purposes of the Freedom of Information Act 2000, there should be an Information Commissioner, why does it now think it right not to? No such questions arise when it comes to the data protection laws, because section 114 and schedule 12 of the Data Protection Act 2018, and Articles 57 and 58 of the UK GDPR, clearly define the purposes (for those laws) of the Information Commissioner.

Maybe all of this rather painful crashing through the thickets of the information rights laws is just an excuse for me to build up to a punchline of “what’s the purpose of the Information Commissioner?” But I don’t think that is solely what I’m getting at: the implied uncoupling of the office from its purposes seems odd, and something that could easily have been avoided (or could easily be remedied). If I’m wrong, or am missing something – and I very much invite comment and correction – then I’ll happily withdraw/update this post.

Please note that links to statutes here on the legislation.gov.uk website are generally to versions as they were originally enacted.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Data Protection Act 2018, Freedom of Information, GDPR, Information Commissioner

Has ICO “no fines” policy been introduced without proper debate?

At the NADPO annual conference last year Information Commissioner John Edwards discussed his policy of reserving fines under the UK GDPR against public bodies for only the most egregious cases. The policy had been announced a few months earlier in an open letter (interestingly addressed to “public sector colleagues”).

Since then, it seems that fines (other than for Privacy and Electronic Communications Regulations (PECR) matters) are – in general – almost off the Information Commissioner’s agenda. Just this week a reprimand – only – was issued to a video sharing platform (the contents of which tend towards the conspiratorial, and the users of which might have particular concerns about exposure) which suffered an exfiltration attack involving 345,000 user names, email addresses and passwords.

Earlier this year I made a Freedom of Information request for the evidential basis for Edwards’ policy. The response placed primary focus on a paper entitled “An Introduction to Outcome Based Cooperative Regulation (OBCR)” by Christopher Hodges, from the Centre for Socio-Legal Studies at Oxford. Hodges is also Chair of the government’s Regulatory Horizons Council.

The paper does not present empirical evidence of the effects of fines (or the effects of not-fining) but proposes a staged model (OBCR) of cooperation between businesses (not, one notes, public bodies) and regulators to achieve common purposes and outcomes. OBCR, it says, enables organisations to “opt for basing their activities around demonstrating they can be trusted”. The stages proposed involve agreement amongst all stakeholders of purposes, objectives and desired outcomes, as well as evidence and metrics to identify those outcomes.

But what was notable about Edwards’ policy was that it arrived without fanfare, and – apparently – without consultation or indeed any involvement of stakeholders. If the aim of OBCR is cooperation, one might reasonably question whether such a failure to consult vitiates, or at least hobbles, the policy from the start.

And, to the extent that the judiciary is one of those stakeholders, it would appear from the judgment of Upper Tribunal Judge Mitchell, in the first GDPR/UK GDPR fining case (concerning the very first GDPR fine in the UK) to reach the appellate courts, that there is not a consensus on the lack of utility of fines. At paragraph 178, when discussing fines (which are, by section 155 of the Data Protection Act 2018, imposed by way of “penalty” notices), the judge says

There is clearly also a dissuasive aspect to [monetary penalty notices]. I do not think it can be sensibly disputed that, in general, the prospect of significant financial penalties for breach of data protection requirements makes a controller or processor more likely to eschew a lackadaisical approach to data protection compliance and less likely to take deliberate action in breach of data protection requirements.

This is a statement which should carry some weight, and, to the extent that it is an expression of regulatory theory (which I think it is), it illustrates why a policy such as that adopted by John Edwards requires (indeed, required) more of a public debate than it appears to have had.

As the issuing of fines inevitably involves an exercise of discretion, it is essentially impossible to say how many fines have not been issued which would have been, but for the Edwards policy (although it might be possible to look at whether there has – which I suspect there has – been a corresponding increase in “reprimands”, and draw conclusions from that). Nonetheless, some recipients of fines from before the policy was introduced might well reasonably ask themselves whether, had Edwards’ policy been in place at the time, they would have escaped the penalty, and why, through an accident of timing, they were financially punished when others are not. Similarly, those companies which may still receive fines, including under the PECR regime, yet which can convincingly argue that they wish to, and can, demonstrate they can be trusted, might also reasonably ask why they are not being given the opportunity to do so.



Filed under Data Protection, Data Protection Act 2018, fines, GDPR, Information Commissioner, monetary penalty notice, PECR, rule of law, UK GDPR

ICO guidance on domestic CCTV – more hindrance than help

An article in the Mail on the use of connected doorbells has led me again to one of the oddest pages on the ICO’s website, on the use of domestic CCTV. Odd, because (beholden to the outdated, and frankly somewhat silly, decision of the CJEU in the 2014 Ryneš case) it approaches the issue on the basis that if a camera captures footage outside the curtilage of one’s home, then the home owner cannot avail themselves of the carve-out from the UK GDPR (at Article 2(2)) for “processing of personal data by an individual in the course of a purely personal or household activity”. But the law says nothing at all about the location or visual range of cameras – it is all about the processing purposes.

Also odd is that the ICO goes on to say that people operating CCTV that captures footage beyond their home’s curtilage will be required to comply with data subject rights (such as providing a privacy notice, and responding to access/erasure/stop requests). But, says the ICO, “we probably won’t do anything if people ignore us”:

You can complain to us when a user of domestic CCTV doesn’t follow the rules. We can send a letter asking them to resolve things, eg put up the appropriate signage or respond to data protection requests. 

There is a limited amount of action the ICO can take after this point to make the person comply. It is highly unlikely the ICO will consider it fair or balanced to take enforcement action against a domestic CCTV user.

But oddest of all, the ICO says:

“These rules only apply to fixed cameras. They do not cover roaming cameras, such as drones or dashboard cameras (dashcams) as long as the drone or dashcam is used only for your domestic or household purposes”

I simply don’t understand this distinction between fixed cameras and “roaming” cameras, despite the fact that the ICO states that “data protection law” says this. I’m unaware of any law that provides a basis for the assertion (if anyone knows, please let me know). I would, in fact, be prepared to mount an argument that “roaming” cameras are more, or have the potential to be more, intrusive on others’ rights than fixed cameras.

The Article 2(2) “purely personal or household activity” carve-out is a complex provision, and one that has got the ICO into choppy waters in the past (see the trenchant criticism of Tugendhat J in the “Solicitors from Hell” litigation, at paras 93-101, which considered the similar carve-out under the prior law). There are some very interesting questions and arguments to be considered (especially when the gloss provided by recital 18 is taken into account, with its reference to online personal or household activities also being outwith the material scope of the law). However, the ICO’s guidance here will likely serve only to confuse most householders, and – I suspect – has the potential in some cases to escalate private disputes.



Filed under CCTV, GDPR, Information Commissioner, material scope, privacy notice, surveillance, UK GDPR