The Information Commissioner’s Office (ICO) has published reprimands against seven separate organisations, all of which committed serious infringements of data protection law by inadvertently disclosing highly sensitive information in cases involving victims of domestic abuse.
The ICO trumpets the announcement, but does not appear to consider the point that, until recently, most, if not all, of these infringements would have resulted in a hefty fine, not a regulatory soft tap on the wrist. Nor does it contemplate the argument that precisely this sort of light-touch regulation might lead to more of these sorts of incidents, if organisations believe they can act (or fail to act) with impunity.
I think it is incumbent on the Information Commissioner, John Edwards, to answer this question: are you confident that your approach is not leading to poorer compliance?
The cases include:
Four cases of organisations revealing the safe addresses of the victims to their alleged abuser. In one case a family had to be immediately moved to emergency accommodation.
Revealing identities of women seeking information about their partners to those partners.
Disclosing the home address of two adopted children to their birth father, who was in prison on three counts of raping their mother.
Sending an unredacted assessment report about children at risk of harm to their mother’s ex-partners.
The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
Answer – when Parliament approves legislation to remove it
Rather quietly, the government is introducing secondary legislation which will have the effect of removing the (admittedly odd) situation whereby the UK GDPR describes the right to protection of personal data as a fundamental right.
Currently, Article 1(2) of the UK GDPR says “This Regulation protects fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data”. For the purposes of the EU GDPR this makes sense (and made sense when the UK was part of the EU) because the Charter of Fundamental Rights of the European Union (“the Charter”) identifies the right to protection of personal data as a free-standing right.
There is no direct equivalent to the right to the protection of personal data in the UK law. However, the protection of personal data falls within the right to respect for private and family life under Article 8 of the European Convention of Human Rights, which is enshrined in UK law by the Human Rights Act 1998. Data protection rights are also protected by UK GDPR, the Data Protection Act 2018 and will continue to be protected by the Data Protection and Digital Information Bill in our domestic legislation.
None of this addresses the point that the EU specifically decided, in the Charter, to separate the right to protection of personal data from the right to respect for a private and family life. One reason is that personal data is sometimes not notably, or inherently, private: it might, for instance, be a matter of public record, or in the public domain, yet still merit protection.
The explanatory memorandum also says, quite understandably, that the UK GDPR has to be amended so as to ensure that
references to retained EU rights and freedoms which would become redundant at the end of 2023 are replaced with references to rights under the European Convention on Human Rights (ECHR) which has been enshrined in the UK’s domestic law under the Human Rights Act 1998
Nonetheless, it was interesting for a while that the UK had a fundamental right in its domestic legislation that was uncoupled from its source instrument – but that, it seems, will soon be gone.
There is no update. Nothing from the ICO at all, other than, at four weeks – after chasing – a message saying it’s taking six to eight weeks to allocate cases.
don’t have “reject all” on your top level [cookie banner]…are breaking the law…There is no excuse for that. The ICO is paying attention in this area and will absolutely issue fines if we see organizations are not taking that seriously and taking steps.
Having a ‘reject all’ button on a cookies banner that is just as prominent as an ‘accept all’ button helps people to more easily exercise their information rights. The ICO is closely monitoring how cookie banners are used in the UK and invites industry to review their cookies compliance now. If the ICO finds that cookies banners breach the law, it will seriously consider using the full range of its powers, including fines.
Then, on 9 August, in conjunction with the Competition and Markets Authority, your office stated
One clear example of often harmful design are cookie consent banners. A website’s cookie banner should make it as easy to reject non-essential cookies as it is to accept them. Users should be able to make an informed choice on whether they want to give consent for their personal information to be used, for example, to profile them for targeted advertising. The ICO will be assessing cookie banners of the most frequently used websites in the UK, and taking action where harmful design is affecting consumers.
In view of all of these statements, I wish to complain, under Article 77 UK GDPR, and simultaneously request, under regulation 32 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”), that you exercise your enforcement functions, in relation to the use of cookies and similar technology by Associated Newspapers Limited, or alternatively DMG Media (whichever is applicable) as controller of, and person responsible for confidentiality of communications on, the “MailOnline” website at https://www.dailymail.co.uk/home/index.html (the “Website”).
The Website presents a visitor using the Safari browser on an iPhone 11 Pro with a “cookie banner” (see attached screenshot) which does not offer visitors a “reject all” option.
Furthermore, the whole set-up is opaque. If one clicks “Cookie Settings” one is faced with an initially straightforward set of options (one of them set by default to accept cookies for personalised advertising on the basis of “legitimate interest”, which is clearly not compliant with regulation 6 of PECR). However, if one then clicks on the tab for “Vendors”, one is faced with a frankly farcically long list of such “vendors”, and options, many of them set by default to “legitimate interest”. I consider myself reasonably knowledgeable in this area, but it is far from clear what is actually going on, other than to say it plainly appears to be falling short of compliance with regulation 6, and, to the extent my personal data is being processed, the processing plainly appears to be in contravention of the UK GDPR, for want – at least – of fairness, lawful basis and transparency.
It is worth noting that much of MailOnline’s content is likely to be of interest to and accessed by children (particularly its sports and “celebrity news” content), even if the publisher does not actively target children. You state, in your guidance
if children are likely to access your service you will need to ensure that both the information you provide and the consent mechanism you use are appropriate for children.
But the complexity and opacity of the Website’s cookie use means that it is largely incomprehensible to adults, let alone children.
It is, obviously, not for me to specify how you undertake an investigation of my complaint, but you must, of course, by reference to Article 57(1)(f) UK GDPR, investigate to the “extent appropriate”. Given the clear messages your office has delivered about cookie banners and the like, and given the weight of evidence as to non-compliance, I would suggest an investigation to the extent appropriate must – at the very least – result in a clear finding as to legality, with reasons, and recommendations for the investigated party.
I cannot claim to be distressed by the infringements I allege, but I do claim to be irritated, and to have, cumulatively, been put to excess time and effort repeatedly trying to “opt out” of receiving cookies on the Website and understand what sort of processing is being undertaken, and what sort of confidentiality of communications exists on it.
Of course the Website here is not the only example of apparent non-compliance: poor practice is rife. Arguably, it is rife because of a prolonged unwillingness by your office and your predecessors to take firm action. However, if you would like me to refer to other examples, or require any further information, please don’t hesitate to ask.
Yours sincerely
Jon Baines
The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
I’d like you to imagine two people (Person A and Person B). Both receive an unsolicited direct marketing call to their personal mobile phone, in which the caller says the recipient’s name (e.g. “am I speaking to Jon Baines?”). Both are registered with the Telephone Preference Service. Both are aggrieved at receiving the unlawful call.
Person A knows nothing much about electronic marketing laws, and nothing much about data protection law. But, to them, quite reasonably, the call would seem to offend their data protection rights (the caller has their name, and their number). They do know that the Information Commissioner enforces the data protection laws.
Person B knows a lot about electronic marketing and data protection law. They know that the unsolicited direct marketing call was not just an infringement of the Privacy and Electronic Communications (EC Directive) Regulations 2003, but also involved the processing of their personal data, thus engaging the UK GDPR.
Both decide to complain to the Information Commissioner’s Office (ICO). Both see this page on the ICO website
They see a page for reporting Nuisance calls and messages, and so fill in the form on that page.
And never hear anything more.
Why? Because, as the subsequent page says “We will use the information you provide to help us investigate and take action against those responsible. We don’t respond to complaints individually” (emphasis added).
But isn’t this a problem? If Person A’s and Person B’s complaints are (as they seem to be) “hybrid” PECR and UK GDPR complaints, then Article 57(1)(f) of the latter requires the ICO to
handle complaints lodged by a data subject…and investigate, to the extent appropriate, the subject matter of the complaint and inform the complainant of the progress and the outcome of the investigation within a reasonable period (emphasis added)
What Article 57(1)(f) and the words “investigate, to the extent appropriate” mean has been the subject of quite a bit of litigation in recent years (the basic summary of which is that the ICO has broad discretion as to how to investigate, and even a mere decision to cease handling a complaint will be likely to suffice – see Killock & Veale & others v Information Commissioner (GI/113/2021 & others)).
But nowhere has anyone suggested that the ICO can simply decide not to “inform the complainant of the progress and the outcome of the investigation” in hybrid complaints like Person A’s and Person B’s.
Yet that is what undoubtedly happens in many cases. And – it strikes me – it has happened to me countless times (I have complained about many, many unsolicited calls over the years, but never heard anything of the progress and outcome). Maybe you might say that I (who, after all, have found time to think about and write this post) can’t play the innocent. But I strongly believe that there are lots of Person As (and a fair few Person Bs) who would, if they knew that – to the extent theirs is a UK GDPR complaint – the law obliges the ICO to investigate and inform them of the progress and the outcome of that investigation, rightly feel aggrieved to have heard nothing.
This isn’t just academic: unsolicited direct marketing is the one area that the ICO still sees as worthy of fines (all but two of the twenty-three fines in the last year have been under that regime). So a complaint about such a practice is potentially a serious matter. Sometimes, a single complaint about such marketing has resulted in a large fine for the miscreant, yet – to the extent that the issue is also a UK GDPR one – the complainant themselves often never hears anything further about their complaint.
In addition to the Killock & Veale case, there have been a number of cases looking at the limits to (and discretion regarding) the ICO’s investigation of complaints. As far as I know, no one has yet raised what seems to be a plain failure to investigate and inform in these “hybrid” PECR and UK GDPR cases.
The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
At the NADPO annual conference last year Information Commissioner John Edwards discussed his policy of reserving fines under UK GDPR to public bodies only for the most egregious cases. The policy had been announced a few months earlier in an open letter (interestingly addressed to “public sector colleagues”).
Since then, it seems that fines (other than for Privacy and Electronic Communications Regulations (PECR) matters) are – in general – almost off the Information Commissioner’s agenda. Just this week a reprimand – only – was issued to a video-sharing platform (the contents of which tend towards the conspiratorial, and the users of which might have particular concerns about exposure) which suffered an exfiltration attack involving 345,000 user names, email addresses and passwords.
Earlier this year I made a Freedom of Information request for the evidential basis for Edwards’ policy. The response placed primary focus on a paper entitled “An Introduction to Outcome Based Cooperative Regulation (OBCR)” by Christopher Hodges, from the Centre for Socio-Legal Studies at Oxford. Hodges is also Chair of the government’s Regulatory Horizons Council.
The paper does not present empirical evidence of the effects of fines (or the effects of not-fining) but proposes a staged model (OBCR) of cooperation between businesses (not, one notes, public bodies) and regulators to achieve common purposes and outcomes. OBCR, it says, enables organisations to “opt for basing their activities around demonstrating they can be trusted”. The stages proposed involve agreement amongst all stakeholders of purposes, objectives and desired outcomes, as well as evidence and metrics to identify those outcomes.
But what was notable about Edwards’ policy was that it arrived without fanfare, and – apparently – without consultation or indeed any involvement of stakeholders. If the aim of OBCR is cooperation, one might reasonably question whether such a failure to consult vitiates, or at least hobbles, the policy from the start.
And, to the extent that the judiciary is one of those stakeholders, it would appear from the judgment of Upper Tribunal Judge Mitchell, in the first GDPR/UK GDPR fining case (concerning the very first GDPR fine in the UK) to reach the appellate courts, that there is not a consensus on the lack of utility of fines. At paragraph 178, discussing the fact that fines are (by section 155 of the Data Protection Act 2018) “penalty” notices, the judge says
There is clearly also a dissuasive aspect to [monetary penalty notices]. I do not think it can be sensibly disputed that, in general, the prospect of significant financial penalties for breach of data protection requirements makes a controller or processor more likely to eschew a lackadaisical approach to data protection compliance and less likely to take deliberate action in breach of data protection requirements.
This is a statement which should carry some weight, and, to the extent that it is an expression of regulatory theory (which I think it is), it illustrates why a policy such as the one John Edwards has adopted requires (indeed, required) more of a public debate than it appears to have had.
As the issuing of fines inevitably involves an exercise of discretion, it is essentially impossible to say how many fines that would otherwise have been issued were not, because of the Edwards policy (although it might be possible to look at whether there has – as I suspect – been a corresponding increase in “reprimands”, and draw conclusions from that). Nonetheless, some recipients of fines from before the policy was introduced might well reasonably ask themselves whether, had Edwards’ policy been in place at the time, they would have escaped the penalty, and why, through an accident of timing, they were financially punished when others are not. Similarly, those companies which may still receive fines, including under the PECR regime, yet which can convincingly argue that they wish to, and can, demonstrate they can be trusted, might also reasonably ask why they are not being given the opportunity to do so.
The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
An article in the Mail on the use of connected doorbells has led me again to one of the oddest pages on the ICO’s website, on the use of domestic CCTV. Odd, because (beholden to the outdated, and frankly somewhat silly, decision of the CJEU in the 2014 Ryneš case) it approaches the issue on the basis that if a camera captures footage outside the curtilage of one’s home, then the homeowner cannot avail themselves of the carve-out from the UK GDPR (at Article 2(2)) for “processing of personal data by an individual in the course of a purely personal or household activity”. But the law says nothing at all about the location or visual range of cameras – it is all about the processing purposes.
Also odd is that the ICO goes on to say that people operating CCTV that captures footage beyond their home’s curtilage will be required to comply with data subject rights (such as providing a privacy notice, and responding to access/erasure/stop requests). But, says the ICO, “we probably won’t do anything if people ignore us”:
You can complain to us when a user of domestic CCTV doesn’t follow the rules. We can send a letter asking them to resolve things, eg put up the appropriate signage or respond to data protection requests.
There is a limited amount of action the ICO can take after this point to make the person comply. It is highly unlikely the ICO will consider it fair or balanced to take enforcement action against a domestic CCTV user.
But oddest of all, the ICO says:
“These rules only apply to fixed cameras. They do not cover roaming cameras, such as drones or dashboard cameras (dashcams) as long as the drone or dashcam is used only for your domestic or household purposes”
I simply don’t understand this distinction between fixed cameras and “roaming” cameras, despite the fact that the ICO states that “data protection law” says this. I’m unaware of any law that provides a basis for the assertion (if anyone knows, please let me know). I would, in fact, be prepared to mount an argument that “roaming” cameras are more, or have the potential to be more, intrusive on others’ rights than fixed cameras.
The Article 2(2) “purely personal or household activity” carve-out is a complex provision, and one that has got the ICO into choppy waters in the past (see the trenchant criticism of Tugendhat J in the “Solicitors from Hell” litigation, at paras 93-101, which considered the similar carve-out under the prior law). There are some very interesting questions and arguments to be considered (especially when the gloss provided by recital 18 is taken into account, with its reference to online personal or household activities also being outwith the material scope of the law). However, the ICO’s guidance here will likely serve only to confuse most householders, and – I suspect – has the potential in some cases to escalate private disputes.
The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
As I was looking at the excellent version of the UK GDPR on the Mishcon de Reya website (plaudits, and a salary increase, for the person who created it), I noticed odd wording at Article 23(1)(e)
…including monetary, budgetary and taxation a matters, public health…
“taxation a matters”? Oh dear – salary decrease for whoever typed that?
At that point, my national pride was engaged. Did the UK screw up its retention of the EU GDPR? But no – pride restored! Plaudits restored! Salary increase merited! The silly old drafters of the original GDPR had made the original typo, which has simply carried through. The Official Journal of the European Union bears the original sin
I surely can’t be the first person to have noticed this. But a cursory Google search didn’t show anyone else mentioning it. So I’m going to claim it. With all the associated plaudits.
In August 2022 the Solicitors Regulation Authority (SRA) announced plans to change its rules and reinstate the annual “keeping of the roll” exercise. Until 2014, all solicitors without practising certificates were required to complete an application each year and pay an administration fee if they wished to remain on the roll. This requirement was dispensed with in 2014 in part because the annual process was seen as burdensome for solicitors.
One of the justifications now given by the SRA for reintroducing the keeping of the roll is
There are also requirements under the General Data Protection Regulation (GDPR) 2016 [sic] and the seven principles that govern the holding and retention of data. Under GDPR we have responsibility as a data controller to ensure we maintain accurate data relating to individuals and we are processing it fairly and lawfully.
What is slightly odd is that when, in 2014, the SRA proposed to scrap the keeping of the roll, it was not troubled by the observations of the then Information Commissioner about the importance of accuracy and privacy of information. In its reply to the then Commissioner’s consultation response it said that it had “fully considered the issues” and
We consider that the availability of the SRA’s online system, mySRA, to non-practising solicitors as a means of keeping their details up to date, serves to mitigate the possibility of data become inaccurate…To further mitigate the risk of deterioration of the information held on the roll, the SRA can include reminders to keep contact details up to date in standard communications sent to solicitors.
If that was the position in 2014, it is difficult to understand why it is any different today. The data protection principles – including the “accuracy principle” – in the UK GDPR (not in fact the “GDPR 2016” that the SRA refers to) are effectively identical to those in the prior Data Protection Act 1998.
If the SRA was not concerned by data protection considerations in 2014 but is so now, one might argue that it should explain why. The Information Commissioner does not appear to have responded to the consultation this time around, so there is no indication that his views swayed the SRA.
If the SRA was concerned about the risk of administrative fines (potentially larger under the UK GDPR than under the Data Protection Act 1998), it should have been reassured by the requirement that any such fines be proportionate (Article 83(1) UK GDPR) and by the fact that the Commissioner has repeatedly stressed that he is not in the business of handing out fines for minor infringements to otherwise responsible data controllers.
I should emphasise that data protection considerations were not the only ones taken into account by the SRA, and I don’t wish to discuss whether, in the round, the decision to reintroduce the keeping of the roll was correct or not (Joshua Rozenberg has written on this, and the effect on him). But I do feel that the arguments around data protection show a confused approach to that particular issue.
The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
The Conservative Party, no doubt scrabbling to gather perceived support for its contentious immigration policies and measures, is running a web and social media campaign. The web page encourages those visiting it to “back our plan and send a message” to other parties:
Further down the page visitors are invited to “send Labour a message”
Clicking on either of the red buttons in those screenshots results in a pop-up form, on which one can say whether or not one supports the Tory plans (in the screenshot below, I’ve selected “no”)
One is then required to give one’s name, email address and postcode, and there is a tick box against text saying “I agree to the Conservative Party, and the wider Conservative Party, using the information I provide to keep me updated via email about the Party’s campaigns and opportunities to get involved”
There are two things to note.
First, the form appears to submit whether one ticks the “I agree” box or not.
Second, and in any case, none of the links to “how we use your data”, or the “privacy policy”, or the “terms and conditions” works.
So anyone submitting their special category data (information about one’s views on a political party’s policies on immigration is personal data revealing political opinions, and so Article 9 UK GDPR applies) has no idea whatsoever how it will subsequently be processed by the Tories.
I suppose there is an argument that anyone who happens upon this page, and chooses to submit the form, has a good idea what is going on (although that is by no means certain, and people could quite plausibly think that it provides an opportunity to provide views contrary to the Tories’). In any event, it would seem potentially to meet the definition of “plugging” (political lobbying under the guise of research) which the ICO deals with in its direct marketing guidance.
Also in any event, the absence of any workable links to privacy notice information means, unavoidably, that the lawfulness of any subsequent processing is vitiated.
It’s the sort of thing I would hope the ICO is alive to (I’ve seen people on social media saying they have complained to the ICO). But I won’t hold my breath: many years ago I wrote about how such data abuse was rife across the political spectrum, and little if anything has changed.
And finally, the most remarkable thing of all is that I’ve written a whole post on what is a pressing and high-profile issue without once mentioning Gary Lineker.
The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
There is something that distinguishes those who have practised data protection law for more than five years and those who have come to it more recently. The former are in possession of a secret. It is this: GDPR did not change the fundamentals of data protection.
Look at the keystones of the law – the data protection principles in Schedule One to the Data Protection Act 1998 (the prior law) and in Article 5 UK GDPR (the current). They are effectively identical. And in fact, they have barely changed from the principles in the 1984 Data Protection Act, and those in the Council of Europe Data Protection Convention 108 of 1981.
Yet even in the courts one still sees from time to time the misconception that the GDPR rights and obligations were something fundamentally new.
An example is a recent case in the Employment Appeal Tribunal. The details of the case are not important for this post, but what is relevant is that the claimant employee argued that information about his previous employment history at the respondent employer (from 2008-2011) should not have been allowed in evidence. One argument in support of this was that the lengthy retention of this information was in breach of the employer’s data protection obligations (and the claimant had received correspondence from the Information Commissioner’s Office broadly agreeing with this).
But in response to this argument the respondent employer asserted that
Prior to [GDPR coming into effect on 25 May 2018] there was no “right to erase“. Accordingly, the period during which the respondent should arguably have taken steps to delete data was around nine months from this point until 28 February 2019.
This fails to recognise that, even if there was no express right to erasure prior to GDPR (n.b. there was certainly an implied right, as the European Court of Justice found in Google Spain), there was nonetheless an obligation on a data controller employer not to retain personal data for longer than was necessary (see paragraph 5 of Schedule One to the 1998 Act).
The judge, however, accepted the respondent’s argument (although in all fairness to her she does point out that neither party took her to the legislation or the case law):
I accept that the ICO’s reference to retention being likely to breach data protection requirements, was (at its highest) concerned with the nine month period between the GDPR coming into effect and the claimant indicating an intention to commence litigation
That is not what the quoted correspondence (at paragraph 17) from the ICO said, and it is not a correct statement of the law. If the period of retention of the data was excessive, there is no reason to say it was not in contravention of the prior law, as well as GDPR.
Ultimately, it is doubtful that this would have made much difference. As often in such proceedings, the relevance of the information to the matter was key:
in so far as the Respondent was in breach of data protection law for the nine month period I have referred to, it does not follow from this that the documentation was inadmissible in the [Employment Tribunal] proceedings
But one wonders if the judge might have taken a slightly different view if, instead, she had found that the Respondent was in fact in breach of data protection law for several years (rather than just nine months).
The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.