Tag Archives: ICO

ICO failing to inform complainants of investigation outcomes

I’d like you to imagine two people (Person A and Person B). Both receive an unsolicited direct marketing call to their personal mobile phone, in which the caller says the recipient’s name (e.g. “am I speaking to Jon Baines?”). Both are registered with the Telephone Preference Service. Both are aggrieved at receiving the unlawful call.

Person A knows nothing much about electronic marketing laws, and nothing much about data protection law. But, to them, quite reasonably, the call would seem to offend their data protection rights (the caller has their name, and their number). They do know that the Information Commissioner enforces the data protection laws.

Person B knows a lot about electronic marketing and data protection law. They know that the unsolicited direct marketing call was not just an infringement of the Privacy and Electronic Communications (EC Directive) Regulations 2003, but also involved the processing of their personal data, thus engaging the UK GDPR.

Both decide to complain to the Information Commissioner’s Office (ICO). Both end up on the ICO website’s page for reporting nuisance calls and messages, and so both fill in the form on that page.

And never hear anything more.

Why? Because, as the subsequent page says, “We will use the information you provide to help us investigate and take action against those responsible. We don’t respond to complaints individually” (emphasis added).

But isn’t this a problem? If Person A’s and Person B’s complaints are (as they seem to be) “hybrid” PECR and UK GDPR complaints, then Article 57(1)(f) of the latter requires the ICO to

handle complaints lodged by a data subject…and investigate, to the extent appropriate, the subject matter of the complaint and inform the complainant of the progress and the outcome of the investigation within a reasonable period (emphasis added)

What Article 57(1)(f) and the words “investigate, to the extent appropriate” mean has been the subject of quite a bit of litigation in recent years. The basic summary is that the ICO has broad discretion as to how it investigates, and that even a mere decision to cease handling a complaint is likely to suffice (see Killock & Veale & others v Information Commissioner (GI/113/2021 & others)).

But nowhere has anyone suggested that the ICO can simply decide not to “inform the complainant of the progress and the outcome of the investigation” in hybrid complaints like Person A’s and Person B’s.

Yet that is what undoubtedly happens in many cases. And – it strikes me – it has happened to me countless times (I have complained about many, many unsolicited calls over the years, but never heard anything of the progress or outcome). You might say that I (who, after all, have found time to think about and write this post) can’t play the innocent. But I strongly believe there are lots of Person As (and a fair few Person Bs) who – if they knew that, to the extent theirs is a UK GDPR complaint, the law obliges the ICO to investigate and inform them of the progress and the outcome of that investigation – would rightly feel aggrieved to have heard nothing.

This isn’t just academic: unsolicited direct marketing is the one area that the ICO still sees as worthy of fines (all but two of the twenty-three fines in the last year have been under that regime). So a complaint about such a practice is potentially a serious matter. Sometimes a single complaint about such marketing has resulted in a large fine for the miscreant, yet – to the extent that the issue is also a UK GDPR one – the complainant often never hears anything further about their complaint.

In addition to the Killock & Veale case, there have been a number of cases looking at the limits of (and the discretion involved in) the ICO’s investigation of complaints. But as far as I know, no one has yet raised what seems to be a plain failure to investigate and inform in these “hybrid” PECR and UK GDPR cases.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Information Commissioner, PECR, UK GDPR

Has the Information Commissioner’s Office lost its FOI purposes?

When Parliament passed the Data Protection Act 1984 it created the role of regulator for that new data protection law. Section 3(1)(a) said that

For the purposes of this Act there shall be…an officer known as the Data Protection Registrar

The office remained in this form until the passing of the Data Protection Act 1998, section 6(1) of which provided that

The office originally established by section 3(1)(a) of the Data Protection Act 1984 as the office of Data Protection Registrar shall continue to exist for the purposes of this Act but shall be known as the office of Data Protection Commissioner

The advent of the Freedom of Information Act 2000 necessitated a change, so as to create a role of regulator for that Act. Paragraph 13(2) of Schedule 2 to the Freedom of Information Act 2000 amended section 6(1) of the Data Protection Act 1998 so it read

For the purposes of this Act and of the Freedom of Information Act 2000 there shall be an officer known as the Information Commissioner

So, at this point, and indeed, until 25 May 2018, there was an Information Commissioner “for the purposes of” the Data Protection Act 1998, and “for the purposes of” the Freedom of Information Act 2000.

25 May 2018 marked, of course, the date from which (by effect of its Article 99) Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, or “GDPR”, applied.

Also on 25 May 2018, by effect of the Data Protection Act 2018 (Commencement No. 1 and Transitional and Saving Provisions) Regulations 2018, section 114 of the Data Protection Act 2018 commenced. This provided (and provides)

There is to continue to be an Information Commissioner.

However, paragraph 44 of schedule 19 to the Data Protection Act 2018 (also commenced by the Data Protection Act 2018 (Commencement No. 1 and Transitional and Saving Provisions) Regulations 2018) repealed the “FOIA purpose” provisions of section 6(1) of the Data Protection Act 1998 (which, to recall, said that “for the purposes of…the Freedom of Information Act 2000 there shall be an officer known as the Information Commissioner”). At the same time, paragraph 59 of schedule 19 to the Data Protection Act 2018 repealed section 18(1) of the Freedom of Information Act 2000 (which had provided that “The Data Protection Commissioner shall be known instead as the Information Commissioner”).

So the Information Commissioner is no longer described, in statute, as an officer existing “for the purposes of” the Freedom of Information Act 2000.

Probably nothing turns on this. Elsewhere in the Freedom of Information Act 2000 it is clear that the Information Commissioner has various functions, powers and duties, which are not removed by the repeal (and subsequent absence) of the “FOIA purpose” provisions. However, the repeal (and absence) do raise some interesting questions. If Parliament previously thought it right to say that, for the purposes of the Freedom of Information Act 2000, there should be an Information Commissioner, why does it now think it right not to? No such questions arise when it comes to the data protection laws, because section 114 and schedule 12 of the Data Protection Act 2018, and Articles 57 and 58 of the UK GDPR, clearly define the purposes (for those laws) of the Information Commissioner.

Maybe all of this rather painful crashing through the thickets of the information rights laws is just an excuse for me to build up to a punchline of “what’s the purpose of the Information Commissioner?” But I don’t think that is solely what I’m getting at: the implied uncoupling of the office from its purposes seems odd, and something that could easily have been avoided (or could easily be remedied). If I’m wrong, or am missing something – and I very much invite comment and correction – then I’ll happily withdraw/update this post.

Please note that links here to statutes on the legislation.gov.uk website are generally to the versions as originally enacted.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Data Protection Act 2018, Freedom of Information, GDPR, Information Commissioner

Has ICO “no fines” policy been introduced without proper debate?

At the NADPO annual conference last year, Information Commissioner John Edwards discussed his policy of reserving fines on public bodies under the UK GDPR for only the most egregious cases. The policy had been announced a few months earlier in an open letter (interestingly, addressed to “public sector colleagues”).

Since then, it seems that fines (other than for Privacy and Electronic Communications Regulations (PECR) matters) are – in general – almost off the Information Commissioner’s agenda. Just this week a reprimand – and only a reprimand – was issued to a video-sharing platform (the contents of which tend towards the conspiratorial, and the users of which might therefore have particular concerns about exposure) which suffered an exfiltration attack involving 345,000 user names, email addresses and passwords.

Earlier this year I made a Freedom of Information request for the evidential basis for Edwards’ policy. The response placed primary focus on a paper entitled “An Introduction to Outcome Based Cooperative Regulation (OBCR)” by Christopher Hodges, from the Centre for Socio-Legal Studies at Oxford. Hodges is also Chair of the government’s Regulatory Horizons Council.

The paper does not present empirical evidence of the effects of fines (or of not fining) but proposes a staged model (OBCR) of cooperation between businesses (not, one notes, public bodies) and regulators to achieve common purposes and outcomes. OBCR, it says, enables organisations to “opt for basing their activities around demonstrating they can be trusted”. The stages proposed involve agreement amongst all stakeholders on purposes, objectives and desired outcomes, as well as on the evidence and metrics used to identify those outcomes.

But what was notable about Edwards’ policy was that it arrived without fanfare, and – apparently – without consultation or indeed any involvement of stakeholders. If the aim of OBCR is cooperation, one might reasonably question whether such a failure to consult vitiates, or at least hobbles, the policy from the start.

And, to the extent that the judiciary is one of those stakeholders, it would appear from the judgment of Upper Tribunal Judge Mitchell, in the first GDPR/UK GDPR fining case to reach the appellate courts (concerning the very first GDPR fine in the UK), that there is no consensus on the lack of utility of fines. At paragraph 178, discussing fines (which are, by section 155 of the Data Protection Act 2018, “penalty” notices), the judge says

There is clearly also a dissuasive aspect to [monetary penalty notices]. I do not think it can be sensibly disputed that, in general, the prospect of significant financial penalties for breach of data protection requirements makes a controller or processor more likely to eschew a lackadaisical approach to data protection compliance and less likely to take deliberate action in breach of data protection requirements.

This is a statement which should carry some weight, and, to the extent that it is an expression of regulatory theory (which I think it is), it illustrates why a policy such as the one John Edwards has adopted requires (indeed, required) more public debate than it appears to have had.

As the issuing of fines inevitably involves an exercise of discretion, it is essentially impossible to say how many fines would have been issued but for the Edwards policy (although it might be possible to look at whether there has – as I suspect – been a corresponding increase in “reprimands”, and draw conclusions from that). Nonetheless, some recipients of fines issued before the policy was introduced might well reasonably ask whether, had Edwards’ policy been in place at the time, they would have escaped the penalty, and why, through an accident of timing, they were financially punished when others are not. Similarly, those companies which may still receive fines, including under the PECR regime, yet which can convincingly argue that they wish to, and can, demonstrate they can be trusted, might also reasonably ask why they are not being given the opportunity to do so.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Data Protection Act 2018, fines, GDPR, Information Commissioner, monetary penalty notice, PECR, rule of law, UK GDPR

ICO guidance on domestic CCTV – more hindrance than help

An article in the Mail on the use of connected doorbells has led me again to one of the oddest pages on the ICO’s website, on the use of domestic CCTV. Odd, because (beholden to the outdated, and frankly somewhat silly, decision of the CJEU in the 2014 Ryneš case) it approaches the issue on the basis that if a camera captures footage outside the curtilage of one’s home, then the homeowner cannot avail themselves of the carve-out from the UK GDPR (at Article 2(2)) for “processing of personal data by an individual in the course of a purely personal or household activity”. But the law says nothing at all about the location or visual range of cameras – it is all about the processing purposes.

Also odd is that the ICO goes on to say that people operating CCTV that captures footage beyond their home’s curtilage will be required to comply with data subject rights (such as providing a privacy notice, and responding to access/erasure/stop requests). But, in effect, the ICO then says “we probably won’t do anything if people ignore us”:

You can complain to us when a user of domestic CCTV doesn’t follow the rules. We can send a letter asking them to resolve things, eg put up the appropriate signage or respond to data protection requests. 

There is a limited amount of action the ICO can take after this point to make the person comply. It is highly unlikely the ICO will consider it fair or balanced to take enforcement action against a domestic CCTV user.

But oddest of all, the ICO says:

“These rules only apply to fixed cameras. They do not cover roaming cameras, such as drones or dashboard cameras (dashcams) as long as the drone or dashcam is used only for your domestic or household purposes”

I simply don’t understand this distinction between fixed cameras and “roaming” cameras, despite the fact that the ICO states that “data protection law” says this. I’m unaware of any law that provides a basis for the assertion (if anyone knows, please let me know). I would, in fact, be prepared to mount an argument that “roaming” cameras are more, or have the potential to be more, intrusive on others’ rights than fixed cameras.

The Article 2(2) “purely personal or household activity” carve-out is a complex provision, and one that has got the ICO into choppy waters in the past (see the trenchant criticism of Tugendhat J in the “Solicitors from Hell” litigation, at paras 93-101, which considered the similar carve-out under the prior law). There are some very interesting questions and arguments to be considered (especially when the gloss provided by recital 18 is taken into account, with its reference to online personal or household activities also being outwith the material scope of the law). However, the ICO’s guidance here will likely serve only to confuse most householders, and – I suspect – has the potential in some cases to escalate private disputes.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under CCTV, GDPR, Information Commissioner, material scope, privacy notice, surveillance, UK GDPR

Labour’s Grubby Data Grab

Nine years ago (I’ve been doing this a long time) I wrote about the Labour Party harvesting details by hosting a page inviting people to find out “what baby number” they were in relation to the NHS. At that time, no privacy notice information was given at all. Fast forward to today, and Labour is once again hosting a similar page. This time, there is a bit more explanatory information, but it’s far from reassuring.

As an aside, I note that, when a person inputs their date of birth, all the website does is calculate, by reference to broad census data, approximately how many babies would have been born between the founding of the NHS and that birth date. So the idea that this gives a personal “baby number” is ridiculous from the outset.
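To illustrate the point (and this is purely my own sketch of how such an estimate might work, not Labour’s actual code – the function name and the births-per-day figure are assumptions for illustration only), the calculation described above amounts to something like this:

```python
from datetime import date

# Hypothetical reconstruction of the "baby number" estimate described above:
# count the days between the founding of the NHS (5 July 1948) and the given
# date of birth, then multiply by a broad average births-per-day figure.
# The figure below is an illustrative round number, not real census data.
NHS_FOUNDED = date(1948, 7, 5)
AVG_UK_BIRTHS_PER_DAY = 2_000  # assumed placeholder, not a census figure

def estimated_baby_number(date_of_birth: date) -> int:
    """Crude estimate of births between the NHS's founding and a birth date."""
    days_elapsed = (date_of_birth - NHS_FOUNDED).days
    if days_elapsed < 0:
        raise ValueError("date of birth predates the NHS")
    return days_elapsed * AVG_UK_BIRTHS_PER_DAY

# Everyone born on the same day necessarily gets the same "number".
print(estimated_baby_number(date(1980, 1, 1)))
```

The point of the sketch is simply that nothing person-specific goes into the number: the name, email address and postcode the page collects play no part in the calculation.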

In any event, the person is then required to give their first name, email address and postcode.

(There is also an odd option to “find out the baby number” of a relative, or friend, by giving that person’s date of birth. Here, the person completing the form is only required to give their own email address and postcode (not their own first name).)

The person completing the form then has the option to agree or not agree to be kept “updated via email on the latest campaigns, events and opportunities to get involved”. This initially seems acceptable when it comes to compliance with the e-marketing rules in the Privacy and Electronic Communications (EC Directive) Regulations 2003, and so perhaps an improvement on how things were nine years ago. However, in smaller print, the person is then told that “We may use the information you provide, such as name and postcode, to match the data provided to your electoral register record held on our electoral database, which could inform future communications you receive from us”. So it appears that, even if one declines to receive future emails, the party may still try to match one’s details with those on the electoral register and may still send “future communications” (although query how accurate – or even feasible – this will be: how many Johns, say, potentially live in postcode SK9 5AF?).

This suggests that some sort of profiling is going on, but it is all a bit unclear and opaque – not words that should really be associated with the processing of personal data by a political party. But if one clicks the link to “know more about how we use your information”, the first thing one encounters is a cookie banner with no option but to accept cookies (which will, it is said, help the party make its website better). Such a banner is, of course, not lawful, and – if the ICO is to be believed – puts the party at current risk of enforcement action. If, teeth gritted, one clicks through the banner, one is faced with a privacy notice which, dear readers, I think needs to be the subject of a further post (maybe with a comparative analysis of other parties’ notices). Suffice it to say that the Labour Party appears to be doing one heck of a lot of profiling, and “estimation” of political opinions, from a range of statutory and/or public information sources.

For now, the TL;DR of this post is that the “NHS Baby Number” schtick from the Labour Party seems to be as much of a grubby data grab (although maybe a different kind) as it was nine years ago when I last wrote about it. There’s a lot that the ICO could, and should, do about it, but nothing was done then, and – I fear – nothing will be done now.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under fairness, Information Commissioner, PECR, political parties, privacy notice, profiling

ICO “does not use AI” – really?

There’s an interesting Freedom of Information (FOI) response by the Information Commissioner’s Office (ICO) on the website WhatDoTheyKnow. In response to the question

have you examined the use of AI to help you in doing your work as an organisation?

their reply includes the statement that

For information, the ICO does not use any artificial intelligence (“AI”) technology.

However, on most of the standard definitions of AI (such as the one in the government’s National AI Strategy: “machines that perform tasks normally requiring human intelligence, especially when the machines learn from data how to do those tasks”), one might find that hard to believe. What about spam filters on the ICO email network? Or the fact that they recommend Google Maps for anyone needing directions to their offices? Or their corporate use of social media? All of those technologies use, or constitute, AI.

There is a wider point here: the task of regulating AI, or even of comprehending how it uses personal data, will fall increasingly on some key regulators in coming years (including the ICO). It is going to be crucial that there is understanding within those organisations of these issues, and if they don’t comprehend now how, within their own walls, the technology operates, they will be starting off on the back foot.

(One should also add that, if the ICO has missed some of its own more obvious uses of AI, then it has probably also failed to respond to the FOI request in accordance with the law.)

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under AI, Freedom of Information, Information Commissioner

ICO: powers to enforce over dead people’s information?

The Information Commissioner’s Office (ICO) has announced that it will not be taking action against Lancashire Police in relation to their disclosure of private information during their investigation into the tragic case of Nicola Bulley.

This is unsurprising and, objectively, reassuring, because if the ICO had brought enforcement proceedings it would almost certainly have been unlawful to do so. In blunt terms, the ICO’s relevant powers are under laws which deal with “personal data” (data relating to a living individual), and when the police disclosed information about Nicola, she was not living.

There is no discretion in these matters, and no grey areas – a dead person (in the UK, at least) does not have data protection rights because information relating to a dead person is, simply, not personal data. Even if the police thought, at the time of the disclosure, that Nicola was alive, it appears that, as a matter of fact, she was not. (I note that the ICO says it will be able to provide further details about its decision following the inquest into Nicola’s death, so it is just possible that there is further information which might elucidate the position.)

Unless the ICO was going to try to take enforcement action in relation to a general policy, or the operation of a general policy, about disclosure of information about missing people (for instance under Article 24 of the UK GDPR), there was simply no legal power to take action in respect of this specific incident.

That is not to say that the ICO was not entitled to comment on the general issues, or publish the guidance it has published, but it seems to be either an empty statement to say “we don’t consider this case requires enforcement action”, or a statement that reveals a failure to apply core legal principles to the situation.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, enforcement, Information Commissioner, personal data, police

HMRC sending spam

Have HMRC jumped the gun, and assumed that they can now (in advance of the Data Protection and Digital Information (No.2) Bill being passed) rely on the soft opt-in for email marketing?

In common with many other poor souls, I have in recent years had to submit a self-assessment tax return to HMRC. Let’s just say that, unless they’re going to announce a rebate, I don’t relish hearing from them. So I was rather surprised to receive an email from “HMRC Help and Support” recently, telling me “what’s coming up in May” and inviting me to attend webinars.

This certainly wasn’t solicited. And, at least if you follow the approach of the Information Commissioner’s Office (ICO), it was direct marketing by electronic means (“Direct marketing covers the promotion of aims and ideals as well as the sale of products and services. This means that the rules will cover not only commercial organisations but also not-for-profit organisations”).

The only lawful way that a person can send unsolicited direct electronic marketing to an individual subscriber like me is if the recipient has consented to receive it (I hadn’t), or if the person obtained the recipient’s contact details in the course of the sale, or negotiations for the sale, of a product or service to that recipient (see regulation 22 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”)). But HMRC cannot avail themselves of the latter (commonly known as the “soft opt-in”), because they have not sold me, or negotiated with me for the sale of, a product or service. The ICO also deals with this in its guidance: “Not-for-profit organisations should take particular care when communicating by text or email. This is because the ‘soft opt-in’ exception only applies to commercial marketing of products or services”.
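For illustration only – a minimal sketch of my reading of regulation 22, not the ICO’s own test and certainly not legal advice (the class and field names are my own inventions) – the decision logic boils down to something like this:

```python
from dataclasses import dataclass

@dataclass
class Recipient:
    """Assumed, simplified model of an individual subscriber under PECR."""
    consented: bool                   # reg 22(2): prior consent to this sender's marketing
    details_obtained_via_sale: bool   # reg 22(3)(a): details collected during a sale/negotiations for a sale
    marketing_similar_products: bool  # reg 22(3)(b): marketing the sender's similar products/services only
    easy_opt_out_offered: bool        # reg 22(3)(c): simple means of refusal at collection and in each message

def may_send_unsolicited_email(r: Recipient) -> bool:
    """Rough sketch: either consent, or all three soft opt-in conditions, is needed."""
    soft_opt_in = (
        r.details_obtained_via_sale
        and r.marketing_similar_products
        and r.easy_opt_out_offered
    )
    return r.consented or soft_opt_in

# On the facts described in this post: no consent, and no prior sale or
# negotiation for a sale, so the soft opt-in cannot apply.
print(may_send_unsolicited_email(
    Recipient(consented=False, details_obtained_via_sale=False,
              marketing_similar_products=True, easy_opt_out_offered=True)
))  # False
```

On those facts, neither route is available to HMRC.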

I raised a complaint (twice) directly with HMRC’s Data Protection Officer who (in responses that seemed oddly, let’s say, robotic) told me how to unsubscribe, and pointed me to HMRC’s privacy notice.

It seems to me that HMRC might be taking a calculated risk, though: the Data Protection and Digital Information (No.2) Bill, currently making its way through Parliament, proposes (at clause 82) to extend the soft opt-in to “non-commercial objectives”. If it passes, then we must expect much more of This Type Of Thing from government.

If I’m correct in this, though, I wonder if, when calculating that calculated risk, HMRC calculated the risk of some calculated individual (me, perhaps) complaining to the ICO?

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection Bill, HMRC, Information Commissioner, marketing, PECR, spam

SRA, data protection and the solicitors roll

In August 2022 the Solicitors Regulation Authority (SRA) announced plans to change its rules and reinstate the annual “keeping of the roll” exercise. Until 2014, all solicitors without practising certificates were required to complete an application each year and pay an administration fee if they wished to remain on the roll. This requirement was dispensed with in 2014 in part because the annual process was seen as burdensome for solicitors.

One of the justifications now for reintroducing the keeping of the roll is given by the SRA as

There are also requirements under the General Data Protection Regulation (GDPR) 2016 [sic] and the seven principles that govern the holding and retention of data. Under GDPR we have responsibility as a data controller to ensure we maintain accurate data relating to individuals and we are processing it fairly and lawfully.

What is slightly odd is that when, in 2014, the SRA proposed to scrap the keeping of the roll, it was not troubled by the observations of the then Information Commissioner about the importance of accuracy and privacy of information. In its reply to the then Commissioner’s consultation response it said that it had “fully considered the issues” and

We consider that the availability of the SRA’s online system, mySRA, to non-practising solicitors as a means of keeping their details up to date, serves to mitigate the possibility of data become inaccurate…To further mitigate the risk of deterioration of the information held on the roll, the SRA can include reminders to keep contact details up to date in standard communications sent to solicitors.

If that was the position in 2014, it is difficult to understand why it is any different today. The data protection principles – including the “accuracy principle” – in the UK GDPR (not in fact the “GDPR 2016” that the SRA refers to) are effectively identical to those in the prior Data Protection Act 1998.

If the SRA was not concerned by data protection considerations in 2014 but is so now, one might argue that it should explain why. The Information Commissioner does not appear to have responded to the consultation this time around, so there is no indication that his views swayed the SRA.

If the SRA was concerned about the risk of administrative fines (potentially larger under the UK GDPR than under the Data Protection Act 1998), it should have been reassured by the requirement that any such fines be proportionate (Article 83(1) UK GDPR) and by the fact that the Commissioner has repeatedly stressed that he is not in the business of handing out fines for minor infringements to otherwise responsible data controllers.

I should emphasise that data protection considerations were not the only ones taken into account by the SRA, and I don’t wish to discuss whether, in the round, the decision to reintroduce the keeping of the roll was correct or not (Joshua Rozenberg has written on this, and on its effect on him). But I do feel that the arguments around data protection show a confused approach to that particular issue.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under accuracy, Data Protection, Information Commissioner, Let's Blame Data Protection, UK GDPR

Where’s the Tories’ privacy notice? (just don’t mention the footballer)

The Conservative Party, no doubt scrabbling to gather perceived support for its contentious immigration policies and measures, is running a web and social media campaign. The web page encourages those visiting it to “back our plan and send a message” to other parties.

Further down the page, visitors are invited to “send Labour a message”.

Clicking on either of those red buttons results in a pop-up form, on which one can say whether or not one supports the Tory plans (I selected “no”).

One is then required to give one’s name, email address and postcode, and there is a tick box against text saying “I agree to the Conservative Party, and the wider Conservative Party, using the information I provide to keep me updated via email about the Party’s campaigns and opportunities to get involved”.

There are two things to note.

First, the form appears to submit whether one ticks the “I agree” box or not.

Second, and in any case, none of the links to “how we use your data”, or the “privacy policy”, or the “terms and conditions” works.

So anyone submitting their special category data (information about one’s views on a political party’s policies on immigration is personal data revealing political opinions, and so Article 9 UK GDPR applies) has no idea whatsoever how it will subsequently be processed by the Tories.

I suppose there is an argument that anyone who happens upon this page, and chooses to submit the form, has a good idea of what is going on (although that is by no means certain, and people could quite plausibly think that it provides an opportunity to express views contrary to the Tories’). In any event, it would seem potentially to meet the definition of “plugging” (political lobbying under the guise of research), which the ICO deals with in its direct marketing guidance.

In any case, the absence of any working links to privacy notice information means, unavoidably, that the lawfulness of any subsequent processing is vitiated.

It’s the sort of thing I would hope the ICO is alive to (I’ve seen people on social media saying they have complained to it). But I won’t hold my breath: many years ago I wrote about how such data abuse was rife across the political spectrum, and little if anything has changed.

And finally, the most remarkable thing of all is that I’ve written a whole post on what is a pressing and high-profile issue without once mentioning Gary Lineker.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Information Commissioner, marketing, PECR, privacy notice, social media, spam, UK GDPR