FOIA’s not the only route

News emerges of a potential judicial review attempt to force disclosure of government Brexit papers, not under FOIA but under the common law and the human right to information

More than three years ago the Supreme Court handed down judgment in a long-running piece of litigation under the Freedom of Information Act 2000 (FOIA). Journalist Dominic Kennedy had attempted to get disclosure from the Charity Commission of information relating to inquiries into George Galloway’s “Mariam Appeal”. The Commission said, in effect, that the absolute exemption to disclosure at section 32(2) of FOIA was the end of the story, while Kennedy argued that Article 10 of the European Convention on Human Rights imposed a positive obligation of disclosure on public authorities, particularly when the requester was a “public watchdog” like the press, and that s32(2) should be read down accordingly to require disclosure in the circumstances (I paraphrase). In his leading opinion Lord Mance gave this stirring introduction:

Information is the key to sound decision-making, to accountability and development; it underpins democracy and assists in combatting poverty, oppression, corruption, prejudice and inefficiency. Administrators, judges, arbitrators, and persons conducting inquiries and investigations depend upon it; likewise the press, NGOs and individuals concerned to report on issues of public interest. Unwillingness to disclose information may arise through habits of secrecy or reasons of self-protection. But information can be genuinely private, confidential or sensitive, and these interests merit respect in their own right and, in the case of those who depend on information to fulfil their functions, because this may not otherwise be forthcoming. These competing considerations, and the balance between them, lie behind the issues on this appeal.

What was most interesting about the judgment in Kennedy, and, again, I heavily and disrespectfully paraphrase, was that the Supreme Court basically said (as it has been wont to do in recent years) – “why harp on about your rights at European law, don’t you realise that our dear old domestic friend the common law gives you similar rights?”

the route by which [Mr Kennedy] may, after an appropriate balancing exercise, be entitled to disclosure, is not under or by virtue of some process of remodelling of section 32, but is under the Charities Act construed in the light of common law principles and/or in the light of article 10 of the Human Rights Convention, if and so far as that article may be engaged

This greatly excited those in the information rights field at the time, but since then there has been little of prominence to advance the proposition that FOIA rights are not the only route [Ed. there’s a great/awful pun in there somewhere], although the point did get a positive airing in R (Privacy International) v HMRC [2014] EWHC 1475 (Admin) (on which see the Panopticon post here).

Yesterday (12 October) barrister Jolyon Maugham announced that his Good Law Project was seeking donations towards a judicial review application, to be brought if the government refuses to publish information and reports comparing the predicted economic harm of Brexit with the predicted economic benefits of alternative free trade agreements. Keen followers of information rights litigation will note that Tim Pitt-Payne and Robin Hopkins are instructed: the potential respondents should quake in their boots.

Well worth watching this, and well worth – in my opinion – donating towards the cause.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Brexit, Freedom of Information, human rights, Open Justice

Serious DCMS error about consent and data protection 

I blogged on Monday about the government Statement of Intent regarding the forthcoming Data Protection Bill. What I missed at the time was an accompanying release on the Department for Digital, Culture, Media and Sport (DCMS) website. Having now seen it, I realise why so many media outlets have been making a profoundly misleading statement about consent under the new data protection law: they have lifted it directly from DCMS. The statement is:

The Data Protection Bill will require ‘explicit’ consent to be necessary for processing sensitive personal data

It should only take a second to realise how wrong this is: sensitive personal data will include information about, among other things, health, and criminal convictions. Is the government proposing, say, that, before passing on information about a critically injured patient to an A&E department, a paramedic will have to get the unconscious patient’s explicit consent? Is it proposing that before passing on information about a convicted sex offender to a local authority social care department the Disclosure and Barring Service will have to get the offender’s explicit consent? 

Of course not – it’s absolute nonsense to think so, and the parliamentary drafters of the forthcoming Bill would not dream of writing the law in such a way, not least because it would contravene our obligations under the General Data Protection Regulation (GDPR), on which much of the Bill will be based. GDPR effectively mirrors the existing European Data Protection Directive (given effect in our existing Data Protection Act 1998). Under these laws, there are multiple circumstances under which personal data, and higher-category sensitive personal data, can be processed. Consent is one of those. But there are, in Article 9(2) of GDPR, nine other conditions which permit the processing of special category data (the GDPR term used to replicate what is called “sensitive personal data” under existing domestic data protection law), and GDPR affords member states the power to legislate for further conditions.

What the DCMS release should say is that when consent is legitimately relied upon to process sensitive personal data, the consent must be explicit. I know that sentence has more words in it than the DCMS original, but that’s because sometimes a statement needs more words in order to be correct, and to make sense, rather than mislead on a very important point regarding people’s fundamental rights.

I tweeted Matt Hancock, the minister, about the error, but have had no answer as yet. I’ve also invited DCMS to correct it. The horse has already bolted, though, as a Google news search for the offending phrase will show. The Information Commissioner’s Office has begun a series of pieces addressing GDPR myths, and I hope this is one they’ll talk about, but DCMS themselves should still issue a corrective, and soon.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under consent, Data Protection, DCMS, GDPR, Information Commissioner, Uncategorized

DCMS Statement of Intent on the Data Protection Bill

Not so much a Statement of Intent, as a Statement of the Bleeding Obvious

The wait is not quite over. We don’t yet have a Data Protection Bill, but we do have a Statement of Intent from DCMS, explaining what the proposed legislation will contain. I thought it would be helpful to do a short briefing note based on my very quick assessment of the Statement. So here it is.

IT’S JUST AN ANNOUNCEMENT OF ALL THE THINGS THE UK WOULD HAVE TO IMPLEMENT ANYWAY UNDER EUROPEAN LAW

By which I mean, it proposes law changes which will be happening in May next year, when the General Data Protection Regulation becomes directly applicable, or changes made under our obligation to implement the Law Enforcement Directive. In a little more detail, here are some things of passing interest, none of which is hugely unexpected.

As predicted by many, at page 8 it is announced that the UK will legislate to require parents to give consent to children’s access to information society services (i.e. online services) where the child is under 13 (rather than GDPR’s default 16). As the UK lobbied to give member states discretion on this, it is no surprise.

Exemptions from compliance with the majority of data protection law when the processing is for the purposes of journalism will remain (page 19). The Statement says that the government

believe the existing exemptions set out in section 32 strike the right balance between privacy and freedom of expression

But of potential note is the suggestion that

The main difference will be to amend provisions relating to the ICO’s enforcement powers to strengthen the ICO’s ability to enforce the re-enacted section 32 exemptions effectively

Without further details it is impossible to know what will be proposed here, but any changes to the existing regime which might have the effect of decreasing the size of the media’s huge carve-out will no doubt be vigorously lobbied against.

There is confirmation (at pp17 and 18) that third parties (i.e. not just criminal justice bodies) will be able to access criminal conviction information. Again, this is not unexpected – the regime for criminal records checks for employers etc was unlikely to be removed.

The Statement proposes a new criminal offence of intentionally or recklessly re-identifying individuals from anonymised or pseudonymised data, something the Commons Science and Technology Committee has called for. Those who subsequently process such data will also be guilty of an offence. The details here will be interesting to see – as with most privacy-enhancing technology, anonymisation needs to be stress-tested in order to be robust, and such testing will not be effective if those undertaking it do so at risk of committing an offence, so presumably the forthcoming Bill will provide for this.
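To illustrate why that stress-testing matters, here is a minimal, purely hypothetical sketch (mine, not the Statement’s, and with made-up values): “pseudonymisation” that amounts to unkeyed hashing of a small identifier space can be reversed by anyone prepared to hash every candidate value and compare, which is precisely the sort of weakness a tester would want to be able to probe without fear of prosecution.

```python
# Hypothetical sketch only: why "pseudonymised" data can still be re-identifiable.
# An unkeyed hash over a small identifier space (a handful of postcodes here)
# can be reversed simply by hashing every candidate and comparing.
import hashlib

def pseudonymise(postcode: str) -> str:
    """Naive pseudonymisation: unsalted SHA-256 of the identifier."""
    return hashlib.sha256(postcode.encode()).hexdigest()

# A "pseudonymised" record, as it might appear in a shared dataset
record = {"postcode_hash": pseudonymise("S63 7QW"), "condition": "asthma"}

# A re-identification "stress test": brute-force the candidate space
for candidate in ["S63 7QW", "S63 6PY", "S64 0AB"]:  # illustrative values
    if pseudonymise(candidate) == record["postcode_hash"]:
        print(f"Re-identified postcode: {candidate}")
# A keyed approach (e.g. an HMAC with a secret held separately) would defeat
# this simple attack - which is exactly the kind of thing testing reveals.
```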

The Bill will also introduce an offence of altering records with intent to prevent disclosure following a subject access request. This will use the current mechanism at section 77 of the Freedom of Information Act 2000. Whether that section itself will be amended (time limits for prosecutions militate against its effectiveness) remains unknown.

I also note that the existing offence of unlawfully obtaining personal data will be widened to cover those who retain personal data against the wishes of the data controller, even where it was initially obtained lawfully. This will probably cover those situations where people gather, or are sent, personal data in error, and then refuse to return it.

There is one particular howler at page 21, which suggests the government doesn’t understand what privacy by design and privacy by default mean:

The Bill will also set out to reassure citizens by promoting the concept of “privacy by default and design”. This is achieved by giving citizens the right to know when their personal data has been released in contravention of the data protection safeguards, and also by offering them a clearer right of redress

Privacy by design/default is about embedding privacy protection throughout the lifecycle of a project or process, and has nothing at all to do with notifying data subjects of breaches. Whether this is a drafting error in the Statement or a fundamental misunderstanding, it is rather concerning that the government, which makes much of “innovation” (an area in which privacy by design should be emphasised), fails to get this right.

So that’s a whistle-stop tour of the Statement, ignoring all the fluff about implementing things which are required under GDPR and the Directive. I’ll update this piece in due course if anything else emerges from a closer reading.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, GDPR, Information Commissioner, journalism

GDPR could cost Rotherham man more than the world’s entire money

In rather shocking news I can reveal that Roy Flynn, 58, of Windsor Road, Wath-upon-Dearne, is potentially facing fines of more than £60 trillion, under the EU General Data Protection Regulation. 

The regulations, which will become law next May, and will require consent for everything anyone does ever, leave data controllers liable for fines of €20 million every time they are breached. 

Mr Flynn is known to be an active social media user, and a member of several local clubs, including the Wombwell Top Gear Appreciation Society, the Mexborough Real Ale Club and the Brampton Bierlow Fat Men on Expensive Bicycles Group. He regularly makes personal comments about people on web articles, posts Facebook updates about fellow members of these societies and repeatedly fails to use “blind copy” when sending group emails. It has also been reported that he uses an unencrypted Dell Inspiron laptop with anti-virus software that was last updated in August 2007.

Cyber security experts are now warning Mr Flynn that unless he downloads their GDPR White Paper and purchases their unique data discovery tool he will be liable for fines in excess of the total amount of money in the entire world. It is being suggested that this could cause significant disruption to his community activities.

However, when contacted by the author Mr Flynn would only comment “Bugger off you soft Southern weirdo”. 

The Information Commissioner’s Office has said “we are aware of this incident and are making enquiries”. We expect to hear the outcome of these enquiries within the next decade.

For similar news see here, here, here, here etc 

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under GDPR, satire

On some sandy beach

[EDITED 25.07.17 to include references to “sandpits” in the report of the Deepmind Health Independent Review Panel]

What lies behind the Information Commissioner’s recent reference to “sandbox regulation”?

The government minister with responsibility for data protection, Matt Hancock, recently spoke to the Leverhulme Centre. He touched on data protection:

a new Data Protection Bill in this Parliamentary Session…will bring the laws up to date for the modern age, introduce new safeguards for citizens, stronger penalties for infringement, and important new features like the right to be forgotten. It will bring the EU’s GDPR and Law Enforcement Directive into UK law, ensuring we are prepared for Brexit.

All pretty standard stuff (let’s ignore the point that the “right to be forgotten”, such as it is, exists under existing law – a big clue to this being that the landmark case was heard by the CJEU in 2014). But Hancock went on to cite with approval some recent words of the Information Commissioner, Elizabeth Denham:

I think the ICO’s proposal of a data regulatory “sandbox” approach is very impressive and forward looking. It works in financial regulation and I look forward to seeing it in action here.

This refers to Denham’s recent speech on “Promoting privacy with innovation within the law”, in which she said

We are…looking at how we might be able to engage more deeply with companies as they seek to implement privacy by design…How we can contribute to a “safe space” by building a sandbox where companies can test their ideas, services and business models. How we can better recognise the circular rather than linear nature of the design process.

I thought this was interesting – “sandbox regulation” in the financial services sector involves an application to the Financial Conduct Authority (FCA) for the testing of “innovative” products that don’t necessarily fit into existing regulatory frameworks – the FCA will even, where necessary, waive rules and undertake not to take enforcement action.

That this model works for financial services does not, though, necessarily mean it would work when it comes to regulation of laws, such as data protection laws, which give effect to fundamental rights. When I made enquiries to the Information Commissioner’s Office (ICO) for further guidance on what Denham intends, I was told that they “don’t have anything to add to what [she’s] already said about engaging with companies to help implement privacy by design”.

The recent lack of enforcement action by the ICO against the Royal Free NHS Trust regarding its deal with Google Deepmind raised eyebrows in some circles: if the unlawful processing of 1.6 million health records (by their nature sensitive personal data) doesn’t merit formal enforcement, then does anything?

Was that a form of “sandbox regulation”? Presumably not, as it doesn’t appear that the ICO was aware of the arrangement before it took place. But if, as it seems to me, such regulation is to involve a light-touch approach where innovation is concerned, I really hope that the views and wishes of data subjects are not ignored. If organisations are going to play in the sand with our personal data, we should at the very least know about it.

**EDIT: I have had my attention drawn to references to “sandpits” in the Annual Report of the Deepmind Health Independent Review Panel:

We think it would be helpful if there was a space, similar to the ‘sandpits’ established by the Research Councils, which would allow regulators, the Department of Health and tech providers to discuss these issues at an early stage of product development. The protection of data during testing is an issue that should be discussed in a similar collaborative forum. We believe that there must be a mechanism that allows effective testing without compromising confidential patient information.

It would seem a bit of a coincidence that this report should be published around the same time Denham and Hancock were making their speeches – and I would argue that this only bolsters the case for more transparency from the ICO about how this type of collaborative regulation will take place.

And I notice that the Review Panel say nothing about involving data subjects in “product development”. Until “innovators” understand that data subjects are the key stakeholder in this, I don’t hold out much hope for the proper protection of rights.**

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, enforcement, human rights, Information Commissioner

An enforcement gap?

ICO wants 200 more staff for GDPR, but its Board think there’s a risk it will instead be losing them

The General Data Protection Regulation (GDPR) is, without doubt, a major reconfiguring of European data protection law. And quite rightly, in the lead-up to its becoming fully applicable on 25 May next year, most organisations are considering how best they can comply with its obligations, and, where necessary, effecting changes to achieve that compliance. As altruistic as some organisations are, a major driver for most is the fear that, under GDPR, regulatory sanctions can be severe. Regulators (in the UK this is the Information Commissioner’s Office (ICO)) will retain powers to force organisations to do, or to stop, something (equivalent to an enforcement notice under our current Data Protection Act 1998 (DPA)), but they will also have the power to levy civil administrative fines of up to €20 million, or 4% of annual global turnover, whichever is the higher. Much media coverage has, understandably, if misleadingly, focused on these increased “fining” powers (the maximum monetary penalty under the DPA is £500,000). I use the word “misleadingly” because it is by no means clear that regulators will use the full fining powers available to them: GDPR provides regulators with many other options (see Article 58), and recital 129 in particular states that measures taken should be

appropriate, necessary and proportionate in view of ensuring compliance with this Regulation [emphasis added]

Commentators stressing the existence of these potentially huge administrative fines should be referred to these provisions of GDPR. 
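For a rough sense of how the two limbs of the cap interact, here is a worked example of my own (illustrative turnover figures, and remember these are ceilings, not tariffs):

```python
# Illustrative only: the GDPR cap on administrative fines for the most serious
# contraventions is €20m or 4% of total worldwide annual turnover, whichever
# is the higher. These are maxima - Article 58 and recital 129 point towards
# proportionate measures, not automatic top-level fines.
def max_admin_fine(annual_turnover_eur: float) -> float:
    return max(20_000_000, 0.04 * annual_turnover_eur)

print(max_admin_fine(100_000_000))    # €20m limb applies: 20,000,000
print(max_admin_fine(1_000_000_000))  # 4% limb applies:   40,000,000
```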

But in the UK, at least, another factor has to be borne in mind, and that is the regulator’s capacity to effectively enforce the law. In March this year, the Information Commissioner herself, Elizabeth Denham, told the House of Lords EU Home Affairs Sub-Committee that with the advent of GDPR she was going to need more resource

With the coming of the General Data Protection Regulation we will have more responsibilities, we will have new enforcement powers. So we are putting in new measures to be able to address our new regulatory powers…We have given the government an estimate that we will need a further 200 people in order to be able to do the job.

Those who rather breathlessly reported this with headlines such as “watchdog to hire hundreds more staff” seem to have forgotten the old parental adage of “I want doesn’t always get”. For instance, I want a case of ’47 Cheval Blanc delivered to my door by January Jones, but I’m not planning a domestic change programme around the possibility.

In fact, the statement by Denham might fall into a category best described as “aspirational”, or even “pie in the sky”, when one notes that the ICO Management Board recently received an item on corporate risk, the minutes from which state that

Concern was expressed about the risk of losing staff as GDPR implementation came closer. There remained a risk that the ICO might lose staff in large numbers, but to-date the greater risk was felt to be that the ICO could lose people in particular roles who, because of their experience, were especially hard to replace.

The ICO has long been based in the rather upmarket North West town of Wilmslow (the detailed and parochial walking directions from the railway station to the office have always rather amused me). There is going to be a limited pool of quality candidates there, and ICO pays poorly: current vacancies show case officers being recruited at a starting salary of £19,527, and I strongly suspect case officers are the sort of extra staff Denham is looking at.

If ICO is worried about GDPR being a risk to staff retention (no doubt on the basis that better staff will get poached by higher paying employers, keen to have people on board with relevant regulatory experience), and apparently can’t pay a competitive wage, how on earth is it going to retain (or replace) them, and then recruit 200 more, from those sleepy Wilmslow recruitment fairs?

I write this blogpost, I should stress, not in order to mock or criticise Denham’s aspirations – she is absolutely right to want more staff, and to highlight the fact to Westminster. Rather, I write it because I agree with her, and because, unless someone stumps up some significant funding, I fear that the major privacy benefits that GDPR should bring for individuals (and the major sanctions against organisations for serious non-compliance) will not be realised.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, enforcement, GDPR, Information Commissioner, Uncategorized

Making even more criminals

Norfolk Police want your dashcam footage. Do you feel lucky, punk?

I wrote recently about the change to the Information Commissioner’s Office (ICO) registration process, which enables domestic users of CCTV to notify the ICO of that fact, and pay the requisite fee of £35. I noted that this meant that

it is the ICO’s apparent view that if you use CCTV in your household and capture footage outside the boundaries of your property, you are required to register this fact publicly with them, and pay a £35 fee. The clear implication, in fact the clear corollary, is that failure to do so is a criminal offence.

I didn’t take issue with the correctness of the legal position, but I went on to say that

The logical conclusion…here is that anyone who takes video footage anywhere outside their home must register

I even asked the ICO, via Twitter, whether users of dashcams should also register, to which I got the reply

If using dashcam to process personal data for purposes not covered by domestic exemption then would need to comply with [the Data Protection Act 1998]

This subject was moved from the theoretical to the real today, with news that Norfolk Constabulary are encouraging drivers using dashcams to send them footage of “driving offences witnessed by members of the public”.

Following the analyses of the courts, and the ICO, as laid out here and in my previous post, such usage cannot avail itself of the exemption from notification for processing of personal data “only” for domestic purposes, so one must conclude that drivers targeted by Norfolk Constabulary should notify, and pay a £35 fee.

At this rate, the whole of the nation would eventually notify. Fortunately (or not) the General Data Protection Regulation becomes directly applicable from May next year. It will remove the requirement to give notification of processing. Those wishing, then, to avoid the opprobrium of being a common criminal have ten months to send their fee to the ICO. Others might question how likely it is that the full force of the law will discover their criminality, and prosecute, in that short time period.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, GDPR, Information Commissioner, police, Uncategorized

Data Protection (and other) compensation awarded against Ombudsman

I’ve been helpfully referred to a rather remarkable judgment of the Leeds County Court, in a claim against the Local Government Ombudsman seeking, variously, declaratory relief and damages arising from discrimination under the Equality Act 2010 and breach of the Data Protection Act 1998 (DPA). The claim was resoundingly successful, and led to a total award of £12,500, £2,500 of which was aggravated damages because of the respondent’s conduct of the trial.

The judgment has been uploaded to Dropbox here.

I will leave readers to draw their own conclusions about the actions of the Ombudsman, but it’s worth noting, when one reads the trenchant criticism by District Judge Geddes, that one of the office’s strategic objectives is to

deliver effective redress through impartial, rigorous and proportionate investigations

One can only conclude that, in this case at least, this objective was very far from met.

Of particular relevance for this blog, though, was the award of £2,500 for distress arising from failure to prepare and keep an accurate case file recording the disability of the claimant and her daughter. This, held the District Judge, was a contravention of the Ombudsman’s obligations under the DPA. As is now relatively well known, the DPA’s original drafting precluded compensation for distress alone (in the absence of tangible – e.g. financial – damage), but the Court of Appeal, in Vidal Hall & ors v Google ([2015] EWCA Civ 311), held that this was contrary to the provisions of the Charter of Fundamental Rights of the European Union and that, accordingly, there was a right under the DPA to claim compensation for “pure” distress. The award in question here was of “Vidal Hall” compensation, with the judge saying there was

no doubt in my mind that the data breaches have caused distress to the claimant in their own rights as well as as a result of the consequences that flowed.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under 7th principle, accuracy, Data Protection, human rights, local government

Making criminals of us all

The Information Commissioner thinks that countless households operating CCTV systems need to register this, and pay a £35 fee for doing so. If they don’t, they might be committing a crime. The Commissioner is probably mostly correct, but it’s a bit more complex than that, for reasons I’ll explain in this post.

Back in 2014, to the surprise of no one who had thought about the issues, the Court of Justice of the European Union (CJEU) held that use of domestic CCTV to capture footage of identifiable individuals in public areas could not attract the exemption at Article 3(2) of the European data protection directive for processing of personal data

by a natural person in the course of a purely personal or household activity

Any use of CCTV, said the CJEU, for the protection of a house or its occupiers but which also captures people in a public space is thus subject to the remaining provisions of the directive:

the operation of a camera system, as a result of which a video recording of people is stored on a continuous recording device such as a hard disk drive, installed by an individual on his family home for the purposes of protecting the property, health and life of the home owners, but which also monitors a public space, does not amount to the processing of data in the course of a purely personal or household activity, for the purposes of that provision

As some commentators pointed out at the time, the effect of this ruling was potentially to place not just users of domestic CCTV systems under the ambit of data protection law, but also, say, car drivers using dashcams, cyclists using helmetcams, and many other people using image recording devices in public for anything but their own domestic purposes.

Under the directive, and the UK Data Protection Act 1998, any data controller processing personal data without an exemption (such as the one for purely personal or household activity) must register the fact with the relevant supervisory authority, which in the UK is the Information Commissioner’s Office (ICO). Failure to register in circumstances under which a data controller should register is a criminal offence punishable by a fine. There is a two-tier fee for making an entry in the ICO’s register, set at £35 for most data controllers, and £500 for larger ones.

For some time the ICO has advised corporate data controllers that if they use CCTV on their premises they will need to register:


But I recently noticed that the registration page itself had changed, and that there is now a separate button to register “household CCTV”


If one clicks that button one is taken to a page which informs that, indeed, a £35 fee is payable, and that the information provided will be published online 


There is a link to the ICO’s overarching privacy notice [ed. you’re going to have to tighten that up for GDPR, guys] but the only part of that notice which talks about the registration process relates only to “businesses”


Continuing the household CCTV registration process, one then gets to the main screen, which requires that the responsible person in the household identify themselves as data controller, and give either their household or email address for publication


What this all means is that it is the ICO’s apparent view that if you use CCTV in your household and capture footage outside the boundaries of your property, you are required to register this fact publicly with them, and pay a £35 fee. The clear implication, in fact the clear corollary, is that failure to do so is a criminal offence.

(In passing, there is a problem here: the pages and the process miss the point that for the registration to be required, the footage needs to be capturing images of identifiable individuals, otherwise no personal data is being processed, and data protection law is simply not engaged. What if someone has installed a “nest cam” in a nearby wooded area? Is ICO saying they are committing a criminal offence if they fail to register this? Also, what if the footage does capture identifiable individuals outside the boundaries of a household, but the footage is only taken for household, rather than crime reduction purposes? The logical conclusion of the ICO pages here is that anyone who takes video footage anywhere outside their home must register, which contradicts their guidance elsewhere.)

What I find particularly surprising about all this is that, although fundamentally it is correct as a matter of law (following the Ryneš decision by the CJEU), I have seen no publicity from the ICO about this pretty enormous policy change. Imagine how many households potentially *should* register, and how many won’t? And, therefore, how many the ICO is implying are committing a criminal offence?

And one thing that is really puzzling me is why this change, now? The CJEU ruling was thirty months ago, and in another eleven months, European data protection law will change, removing – in the UK also – the requirement to register in these circumstances. If it was so important for the ICO to effect these changes before then, why keep it quiet?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Directive 95/46/EC, GDPR, Information Commissioner, Uncategorized

Public houses, private comms

Wetherspoons delete their entire customer email database. Deliberately.

In a very interesting development, the pub chain JD Wetherspoon have announced that they are to stop sending monthly newsletters by email, and are deleting their database of customer email addresses.

Although the only initial evidence of this was the screenshot of the email communication (above), the company have confirmed to me on their Twitter account that the email is genuine.

Wetherspoons say the reason for the deletion is that they feel that email marketing of this kind is “too intrusive”, and that, instead of communicating marketing by email, they will “continue to release news stories on [their] website” and customers will be able to keep up to date by following them on Facebook and Twitter.

This is interesting for a couple of reasons. Firstly, companies such as Flybe and Honda have recently discovered that an email marketing database can be a liability if it is not clear whether the customers in question have consented to receive marketing emails (which is a requirement under the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR)). In March Flybe received a monetary penalty of £70,000 from the Information Commissioner’s Office (ICO) after sending more than 3.3 million emails with the title ‘Are your details correct?’ to people who had previously told them they didn’t want to receive marketing emails. These, said the ICO, were themselves marketing emails, and the sending of them was a serious contravention of PECR. Honda, less egregiously, sent 289,790 emails when they did not know whether or not the recipients had consented to receive marketing emails. This also, said ICO, was unlawful marketing, as the burden of proof was on Honda to show that they had recipients’ consent to send the emails, and they could not. The result was a £13,000 monetary penalty.

There is no reason to think Wetherspoons were concerned about the data quality (in terms of whether people had consented to marketing) of their own email marketing database, but it is clear from the Flybe and Honda cases that a bloated database with email details of people who have not consented to marketing (or where it is unclear whether they have) is potentially a liability under PECR (and related data protection law). It is a liability both because any marketing emails sent are likely to be unlawful (and potentially to attract a monetary penalty) and because, if it cannot be used for marketing, what purpose does it serve? If none, then it constitutes a huge amount of personal data, held for no ostensible purpose, which would be in contravention of the fifth principle in Schedule 1 to the Data Protection Act 1998.
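To put the same point very crudely (a sketch of my own, with invented addresses, rather than anything from the Flybe or Honda notices): if a database does not record positive evidence of consent against each address, then for PECR purposes there is nothing left that can lawfully be mailed.

```python
# Sketch only: a marketing list is only usable, for PECR purposes, for the
# addresses against which positive consent can actually be evidenced.
contacts = [
    {"email": "a@example.com", "consent_recorded": True},   # evidenced consent
    {"email": "b@example.com", "consent_recorded": False},  # opted out
    {"email": "c@example.com", "consent_recorded": None},   # unknown
]

# "Unknown" is treated the same as "no", because the burden of proving
# consent sits with the sender (as Honda found to its cost).
mailable = [c["email"] for c in contacts if c["consent_recorded"] is True]
print(mailable)  # ['a@example.com']
```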

For this reason, I can understand why some companies might take a commercial and risk-based decision not to retain email databases – if something brings no value, and significant risk, then why keep it?

But there is another reason Wetherspoons’ rationale is interesting: they are clearly aiming now to use social media channels to market their products. Normally, one thinks of advertising on social media as not aimed at or delivered to individuals, but as technology has advanced, so has the ability for social media marketing to become increasingly targeted. In May this year it was announced that the ICO were undertaking “a wide assessment of the data-protection risks arising from the use of data analytics”. This was on the back of reports that adverts on Facebook were being targeted by political groups towards people on the basis of data scraped from Facebook and other social media. Although we don’t know what the outcome of this investigation by the ICO will be (and I understand some of the allegations are strongly denied by entities alleged to be involved), what it does show is that stopping your e-marketing on one channel won’t necessarily stop you having privacy and data protection challenges on another.

And that’s before we even get on to the small fact that European ePrivacy law is in the process of being rewritten. Watch that space.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under consent, Data Protection, marketing, monetary penalty notice, PECR, social media, spam