ICO failing to inform complainants of investigation outcomes

I’d like you to imagine two people (Person A and Person B). Both receive an unsolicited direct marketing call to their personal mobile phone, in which the caller says the recipient’s name (e.g. “am I speaking to Jon Baines?”). Both are registered with the Telephone Preference Service. Both are aggrieved at receiving the unlawful call.

Person A knows nothing much about electronic marketing laws, and nothing much about data protection law. But, to them, quite reasonably, the call would seem to offend their data protection rights (the caller has their name, and their number). They do know that the Information Commissioner enforces the data protection laws.

Person B knows a lot about electronic marketing and data protection law. They know that the unsolicited direct marketing call was not just an infringement of the Privacy and Electronic Communications (EC Directive) Regulations 2003, but also involved the processing of their personal data, thus engaging the UK GDPR.

Both decide to complain to the Information Commissioner’s Office (ICO). On the ICO website, they find the page for reporting Nuisance calls and messages, and, so, fill in the form on that page.

And never hear anything more.

Why? Because, as the subsequent page says “We will use the information you provide to help us investigate and take action against those responsible. We don’t respond to complaints individually” (emphasis added).

But isn’t this a problem? If Person A’s and Person B’s complaints are (as they seem to be) “hybrid” PECR and UK GDPR complaints, then Article 57(1)(f) of the latter requires the ICO to

handle complaints lodged by a data subject…and investigate, to the extent appropriate, the subject matter of the complaint and inform the complainant of the progress and the outcome of the investigation within a reasonable period (emphasis added)

What Article 57(1)(f), and the words “investigate, to the extent appropriate”, mean has been the subject of quite a bit of litigation in recent years. The basic summary is that the ICO has broad discretion as to how to investigate, and even a mere decision to cease handling a complaint is likely to suffice (see Killock & Veale & others v Information Commissioner (GI/113/2021 & others)).

But nowhere has anyone suggested that the ICO can simply decide not to “inform the complainant of the progress and the outcome of the investigation” in hybrid complaints like Person A’s and Person B’s.

Yet that is what undoubtedly happens in many cases. And – it strikes me – it has happened to me countless times (I have complained about many, many unsolicited calls over the years, but never heard anything of the progress and outcome). You might say that I (who, after all, have found time to think about and write this post) can’t play the innocent. But I strongly believe that there are lots of Person As (and a fair few Person Bs) who would, if they knew that – to the extent theirs is a UK GDPR complaint – the law obliges the ICO to investigate and inform them of the progress and the outcome of that investigation, rightly feel aggrieved to have heard nothing.

This isn’t just academic: unsolicited direct marketing is the one area that the ICO still sees as worthy of fines (all but two of the twenty-three fines in the last year have been under that regime). So a complaint about such a practice is potentially a serious matter. Sometimes, a single complaint about such marketing has resulted in a large fine for the miscreant, yet – to the extent that the issue is also a UK GDPR one – the complainant themselves often hears nothing further about it.

In addition to the Killock & Veale case, there have been a number of cases looking at the limits of (and the discretion involved in) the ICO’s investigation of complaints. But, as far as I know, no one has yet raised what seems to be a plain failure to investigate and inform in these “hybrid” PECR and UK GDPR cases.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Information Commissioner, PECR, UK GDPR

Has the Information Commissioner’s Office lost its FOI purposes?

When Parliament passed the Data Protection Act 1984 it created the role of a regulator for that new data protection law. Section 3(1)(a) said that

For the purposes of this Act there shall be…an officer known as the Data Protection Registrar

The office remained in this form until the passing of the Data Protection Act 1998, section 6(1) of which provided that

The office originally established by section 3(1)(a) of the Data Protection Act 1984 as the office of Data Protection Registrar shall continue to exist for the purposes of this Act but shall be known as the office of Data Protection Commissioner

The advent of the Freedom of Information Act 2000 necessitated a change, so as to create a role of regulator for that Act. Paragraph 13(2) of Schedule 2 to the Freedom of Information Act 2000 amended section 6(1) of the Data Protection Act 1998 so it read

For the purposes of this Act and of the Freedom of Information Act 2000 there shall be an officer known as the Information Commissioner

So, at this point, and indeed, until 25 May 2018, there was an Information Commissioner “for the purposes of” the Data Protection Act 1998, and “for the purposes of” the Freedom of Information Act 2000.

25 May 2018 marked, of course, the date from which (by effect of its Article 99) Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, or “GDPR”, applied.

Also on 25 May 2018, by effect of the Data Protection Act 2018 (Commencement No. 1 and Transitional and Saving Provisions) Regulations 2018, section 114 of the Data Protection Act 2018 commenced. This provided (and provides)

There is to continue to be an Information Commissioner.

However, paragraph 44 of schedule 19 to the Data Protection Act 2018 (also commenced by effect of the Data Protection Act 2018 (Commencement No. 1 and Transitional and Saving Provisions) Regulations 2018) repealed the “FOIA purpose” provisions of section 6(1) of the Data Protection Act 1998 (which, to recall, said that “for the purposes of…the Freedom of Information Act 2000 there shall be an officer known as the Information Commissioner”). At the same time, paragraph 59 of schedule 19 to the Data Protection Act 2018 repealed section 18(1) of the Freedom of Information Act 2000 (which had provided that “The Data Protection Commissioner shall be known instead as the Information Commissioner”).

So the Information Commissioner is no longer described, in statute, as an officer existing for the purposes of the Freedom of Information Act 2000.

Probably nothing turns on this. Elsewhere in the Freedom of Information Act 2000 it is clear that the Information Commissioner has various functions, powers and duties, which are not removed by the repeal (and subsequent absence) of the “FOIA purpose” provisions. However, the repeal (and absence) do raise some interesting questions. If Parliament previously thought it right to say that, for the purposes of the Freedom of Information Act 2000, there should be an Information Commissioner, why does it now think it right not to? No such questions arise when it comes to the data protection laws, because section 114 and schedule 12 of the Data Protection Act 2018, and Articles 57 and 58 of the UK GDPR, clearly define the purposes (for those laws) of the Information Commissioner.

Maybe all of this rather painful crashing through the thickets of the information rights laws is just an excuse for me to build up to a punchline of “what’s the purpose of the Information Commissioner?” But I don’t think that is solely what I’m getting at: the implied uncoupling of the office from its purposes seems odd, and something that could easily have been avoided (or could easily be remedied). If I’m wrong, or am missing something – and I very much invite comment and correction – then I’ll happily withdraw/update this post.

Please note that links here to statutes on the legislation.gov.uk website are generally to the versions as originally enacted.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Data Protection Act 2018, Freedom of Information, GDPR, Information Commissioner

Has ICO “no fines” policy been introduced without proper debate?

At the NADPO annual conference last year, Information Commissioner John Edwards discussed his policy of reserving UK GDPR fines on public bodies for only the most egregious cases. The policy had been announced a few months earlier in an open letter (interestingly, addressed to “public sector colleagues”).

Since then, it seems that fines (other than for Privacy and Electronic Communications Regulations (PECR) matters) are – in general – almost off the Information Commissioner’s agenda. Just this week a reprimand – only – was issued to a video-sharing platform (the contents of which tend towards the conspiratorial, and the users of which might have particular concerns about exposure) which suffered an exfiltration attack involving 345,000 user names, email addresses and passwords.

Earlier this year I made a Freedom of Information request for the evidential basis for Edwards’ policy. The response placed primary focus on a paper entitled “An Introduction to Outcome Based Cooperative Regulation (OBCR)” by Christopher Hodges, from the Centre for Socio-Legal Studies at Oxford. Hodges is also Chair of the government’s Regulatory Horizons Council.

The paper does not present empirical evidence of the effects of fines (or the effects of not-fining) but proposes a staged model (OBCR) of cooperation between businesses (not, one notes, public bodies) and regulators to achieve common purposes and outcomes. OBCR, it says, enables organisations to “opt for basing their activities around demonstrating they can be trusted”. The stages proposed involve agreement amongst all stakeholders of purposes, objectives and desired outcomes, as well as evidence and metrics to identify those outcomes.

But what was notable about Edwards’ policy was that it arrived without fanfare, and – apparently – without consultation or indeed any involvement of stakeholders. If the aim of OBCR is cooperation, one might reasonably question whether such a failure to consult vitiates, or at least hobbles, the policy from the start.

And, to the extent that the judiciary is one of those stakeholders, it would appear from the judgment of Upper Tribunal Judge Mitchell, in the first GDPR/UK GDPR fining case (concerning the very first GDPR fine in the UK) to reach the appellate courts, that there is not a consensus on the lack of utility of fines. At paragraph 178, when discussing fines (which, by section 155 of the Data Protection Act 2018, take the form of “penalty notices”), the judge says

There is clearly also a dissuasive aspect to [monetary penalty notices]. I do not think it can be sensibly disputed that, in general, the prospect of significant financial penalties for breach of data protection requirements makes a controller or processor more likely to eschew a lackadaisical approach to data protection compliance and less likely to take deliberate action in breach of data protection requirements.

This is a statement which should carry some weight, and, to the extent that it is an expression of regulatory theory (which I think it is), it illustrates why a policy such as the one John Edwards has adopted requires (indeed, required) more of a public debate than it appears to have had.

As the issuing of fines inevitably involves an exercise of discretion, it is essentially impossible to say how many fines have not been issued which would have been, but for the Edwards policy (although it might be possible to look at whether there has – as I suspect – been a corresponding increase in “reprimands”, and draw conclusions from that). Nonetheless, some recipients of fines from before the policy was introduced might well reasonably ask themselves whether, had Edwards’ policy been in place at the time, they would have escaped the penalty, and why, through an accident of timing, they were financially punished when others are not. Similarly, those companies which may still receive fines, including under the PECR regime, yet which can convincingly argue that they wish to, and can, demonstrate they can be trusted, might also reasonably ask why they are not being given the opportunity to do so.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Data Protection Act 2018, fines, GDPR, Information Commissioner, monetary penalty notice, PECR, rule of law, UK GDPR

ICO guidance on domestic CCTV – more hindrance than help

An article in the Mail on the use of connected doorbells has led me again to one of the oddest pages on the ICO’s website, on the use of domestic CCTV. Odd, because (beholden to the outdated, and frankly somewhat silly, decision of the CJEU in the 2014 Ryneš case) it approaches the issue on the basis that if a camera captures footage outside the curtilage of one’s home, then the homeowner cannot avail themselves of the carve-out from the UK GDPR (at Article 2(2)) for “processing of personal data by an individual in the course of a purely personal or household activity”. But the law says nothing at all about the location or visual range of cameras – it is all about the processing purposes.

Also odd is that the ICO goes on to say that people operating CCTV that captures footage beyond their home’s curtilage will be required to comply with data subject rights (such as providing a privacy notice, and responding to access/erasure/stop requests). But, says the ICO, “we probably won’t do anything if people ignore us”:

You can complain to us when a user of domestic CCTV doesn’t follow the rules. We can send a letter asking them to resolve things, eg put up the appropriate signage or respond to data protection requests. 

There is a limited amount of action the ICO can take after this point to make the person comply. It is highly unlikely the ICO will consider it fair or balanced to take enforcement action against a domestic CCTV user.

But oddest of all, the ICO says:

“These rules only apply to fixed cameras. They do not cover roaming cameras, such as drones or dashboard cameras (dashcams) as long as the drone or dashcam is used only for your domestic or household purposes”

I simply don’t understand this distinction between fixed cameras and “roaming” cameras, despite the fact that the ICO states that “data protection law” says this. I’m unaware of any law that provides a basis for the assertion (if anyone knows, please let me know). I would, in fact, be prepared to mount an argument that “roaming” cameras are more, or have the potential to be more, intrusive on others’ rights than fixed cameras.

The Article 2(2) “purely personal or household activity” carve-out is a complex provision, and one that has got the ICO into choppy waters in the past (see the trenchant criticism of Tugendhat J in the “Solicitors from Hell” litigation, at paras 93-101, which considered the similar carve-out under the prior law). There are some very interesting questions and arguments to be considered (especially when the gloss provided by recital 18 is taken into account, with its reference to online personal or household activities also being outwith the material scope of the law). However, the ICO’s guidance here will likely serve only to confuse most householders, and – I suspect – has the potential in some cases to escalate private disputes.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under CCTV, GDPR, Information Commissioner, material scope, privacy notice, surveillance, UK GDPR

Labour’s Grubby Data Grab

Nine years ago (I’ve been doing this a long time) I wrote about the Labour Party harvesting details by hosting a page inviting people to find out “what baby number” they were in relation to the NHS. At that time, no privacy notice information was given at all. Fast forward to today, and Labour is once again hosting a similar page. This time, there is a bit more explanatory information, but it’s far from reassuring.

As an aside, I note that, when a person inputs their date of birth, the website simply calculates, by reference to broad census data, approximately how many babies would have been born between the founding of the NHS and that birth date. So the idea that this gives a “baby number” is ridiculous from the outset.
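For illustration, the arithmetic is no more sophisticated than something like the following sketch (in Python, and entirely illustrative: the flat annual birth figure is my own assumption, not the census data the page actually uses):

```python
from datetime import date

# Illustrative sketch only: the flat annual birth figure below is an assumption,
# not the census data the Labour page actually relies on.
NHS_FOUNDED = date(1948, 7, 5)
ASSUMED_UK_BIRTHS_PER_YEAR = 700_000  # hypothetical rough average

def approximate_baby_number(date_of_birth: date) -> int:
    """Estimate how many babies were born between the founding of the NHS
    and the given date of birth, assuming a constant birth rate."""
    years_elapsed = (date_of_birth - NHS_FOUNDED).days / 365.25
    return round(years_elapsed * ASSUMED_UK_BIRTHS_PER_YEAR)

print(approximate_baby_number(date(1980, 1, 1)))  # roughly 22 million, on these assumptions
```

Any date of birth produces a figure on this basis; the “number” is a crude extrapolation, not anything specific to the person entering their details.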

In any event, the person is then required to give their first name, email address and postcode.

(There is also an odd option to “find out the baby number” of a relative, or friend, by giving that person’s date of birth. Here, the person completing the form is only required to give their own email address and postcode (not their own first name).)

The person completing the form then has the option to agree or not agree to be kept “updated via email on the latest campaigns, events and opportunities to get involved”. This initially seems acceptable when it comes to compliance with the emarketing rules in the Privacy and Electronic Communications (EC Directive) Regulations 2003, so perhaps an improvement on how things were nine years ago. However, in smaller print, the person is then told that “We may use the information you provide, such as name and postcode, to match the data provided to your electoral register record held on our electoral database, which could inform future communications you receive from us”. So it appears that, even if one declines to receive future emails, the party may still try to match one’s details with those on the electoral register and may still send “future communications” (although query how accurate – or even feasible – this will be: how many Johns, say, potentially live in postcode SK9 5AF?).

This suggests that some sort of profiling is going on, but it is all rather unclear and opaque – not words that should be associated with the processing of personal data by a political party. But if one clicks the link to “know more about how we use your information”, the first thing one encounters is a cookie banner with no option but to accept cookies (which will, it is said, help the party make its website better). Such a banner is, of course, not lawful, and – if the ICO is to be believed – puts the party at current risk of enforcement action. If, teeth gritted, one clicks through the banner, one is faced with a privacy notice which, dear readers, I think needs to be the subject of a further post (maybe with a comparative analysis of other parties’ notices). Suffice to say that the Labour Party appears to be doing one heck of a lot of profiling, and “estimation” of political opinions, from a range of statutory and/or public information sources.

For now, the TL;DR of this post is that the “NHS Baby Number” schtick from the Labour Party seems to be as much of a grubby data grab (although maybe a different kind of one) as it was nine years ago when I last wrote about it. There’s a lot that the ICO could, and should, do about it, but nothing was done then, and – I fear – nothing will be done now.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under fairness, Information Commissioner, PECR, political parties, privacy notice, profiling

NADPO June Webinar

NADPO’s next webinar is tomorrow, June 27, featuring 

Lucas Amin, Open Democracy – “the NHS Federated Data Platform – risks and opportunities”
Dr Luc Rocher, Oxford Internet Institute – “The hazards of proxy data sets”

Members should already have (just) been sent the link, and it is also available in the members’ zone on our website. As usual, there are one or two free spaces if anyone wants to test the waters. Contact me at chair at nadpo dot co dot uk if you’d like to request one.

https://nadpo.co.uk/event/nadpo-june-webinar/

Filed under Uncategorized

Typo in the GDPR

A small thing, to please small minds.

As I was looking at the excellent version of the UK GDPR on the Mishcon de Reya website (plaudits, and a salary increase, for the person who created it), I noticed odd wording at Article 23(1)(e)

…including monetary, budgetary and taxation a matters, public health…

“taxation a matters”? Oh dear – salary decrease for whoever typed that?

However, I then saw that the version of the UK GDPR on the legislation.gov.uk pages has the same odd wording.

At that point, my national pride was concerned. Did the UK screw up its retention of the EU GDPR? But no – pride restored! Plaudits restored! Salary increase merited! The silly old drafters of the original GDPR had made the original typo, which has carried through. The Official Journal of the European Union bears the original sin.

I surely can’t be the first person to have noticed this. But a cursory Google search didn’t show anyone else mentioning it. So I’m going to claim it. With all the associated plaudits.

Filed under accuracy, Data Protection, GDPR, UK GDPR

ICO to issue fines for non-compliant cookie banners?

That is what they are warning. Let’s see what transpires. New post by me on the Mishcon de Reya website:

https://www.mishcon.com/news/ico-warns-of-fines-for-companies-who-do-not-get-cookie-banners-right

Filed under Uncategorized

NADPO May Webinar

NADPO’s May lunchtime webinar is at 13:30 on 23 May. Speakers and topics are

Prof Mark Elliott, UKAN – “Anonymisation: What is it? And how do I do it?”
Karen Levy, Cornell University – “Robotruckers: The double threat of AI for low-waged work”

As usual, there are a couple of free spaces for anyone who wants to test the NADPO waters. Email me on chair at nadpo dot co dot uk if you’re interested.

Filed under Uncategorized

ICO “does not use AI” – really?

There’s an interesting Freedom of Information (FOI) response by the Information Commissioner’s Office (ICO) on the website WhatDoTheyKnow. In response to the question

have you examined the use of AI to help you in doing your work as an organisation?

their reply includes the statement that

For information, the ICO does not use any artificial intelligence (“AI”) technology.

However, on most standard definitions of AI (such as the one from the government’s National AI Strategy: “machines that perform tasks normally requiring human intelligence, especially when the machines learn from data how to do those tasks”), one might find that hard to believe. What about spam filters on the ICO email network? Or the fact that they recommend Google Maps for anyone needing directions to their offices? Or their corporate use of social media? All of those technologies use, or constitute, AI.
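Take the spam filter example. The sketch below is purely hypothetical – a toy classifier of my own, not anything the ICO actually runs – but even something this basic “learns from data how to do” its task, and so sits comfortably within the National AI Strategy definition quoted above:

```python
# Hypothetical sketch, not the ICO's actual filter: even a toy spam classifier
# "learns from data how to do" its task, which is the definition quoted above.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

training_emails = [
    "win a free prize now",           # spam
    "cheap loans click here",         # spam
    "minutes of the board meeting",   # not spam
    "your FOI response is attached",  # not spam
]
labels = ["spam", "spam", "ham", "ham"]

vectoriser = CountVectorizer()                        # learns a vocabulary from the data
features = vectoriser.fit_transform(training_emails)  # word counts per email
model = MultinomialNB().fit(features, labels)         # learns word/label associations

print(model.predict(vectoriser.transform(["claim your free prize"])))  # ['spam']
```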

There is a wider point here: the task of regulating AI, or even of comprehending how it uses personal data, will fall increasingly on some key regulators in coming years (including the ICO). It is going to be crucial that there is understanding within those organisations of these issues, and if they don’t comprehend now how, within their own walls, the technology operates, they will be starting off on the back foot.

(One should also add that, if the ICO has missed some of its own more obvious uses of AI, then it has probably also failed to respond to the FOI request in accordance with the law.)

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under AI, Freedom of Information, Information Commissioner