Category Archives: Information Commissioner

Serious DCMS error about consent and data protection 

I blogged on Monday about the government's Statement of Intent regarding the forthcoming Data Protection Bill. What I missed at the time was an accompanying release on the Department for Digital, Culture, Media and Sport (DCMS) website. Having now seen it, I realise why so many media outlets have been making a profoundly misleading statement about consent under the new data protection law: they have lifted it directly from DCMS. The statement is

The Data Protection Bill will require ‘explicit’ consent to be necessary for processing sensitive personal data

It should only take a second to realise how wrong this is: sensitive personal data will include information about, among other things, health and criminal convictions. Is the government proposing, say, that, before passing on information about a critically injured patient to an A&E department, a paramedic will have to get the unconscious patient’s explicit consent? Is it proposing that, before passing on information about a convicted sex offender to a local authority social care department, the Disclosure and Barring Service will have to get the offender’s explicit consent?

Of course not – it’s absolute nonsense to think so, and the parliamentary drafters of the forthcoming Bill would not dream of writing the law in such a way, not least because it would contravene our obligations under the General Data Protection Regulation (GDPR), on which much of the Bill will be based. GDPR effectively mirrors the existing European Data Protection Directive (given effect in our existing Data Protection Act 1998). Under these laws, there are multiple circumstances under which personal data, and higher-category sensitive personal data, can be processed. Consent is one of those. But there are, in Article 9(2) of GDPR, nine other conditions which permit the processing of special category data (the GDPR term used to replicate what is called “sensitive personal data” under existing domestic data protection law), and GDPR affords member states the power to legislate for further conditions.

What the DCMS release should say is that, when consent is legitimately relied upon to process sensitive personal data, the consent must be explicit. I know that sentence has more words in it than the DCMS original, but that’s because sometimes a statement needs more words in order to be correct and make sense, rather than mislead on a very important point regarding people’s fundamental rights.

I tweeted Matt Hancock, the minister, about the error, but have had no answer as yet. I’ve also invited DCMS to correct it. The horse has already bolted, though, as a Google news search for the offending phrase will show. The Information Commissioner’s Office has begun a series of pieces addressing GDPR myths, and I hope this is one they’ll talk about, but DCMS themselves should still issue a corrective, and soon.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under consent, Data Protection, DCMS, GDPR, Information Commissioner, Uncategorized

DCMS Statement of Intent on the Data Protection Bill

Not so much a Statement of Intent, as a Statement of the Bleeding Obvious

The wait is not quite over. We don’t yet have a Data Protection Bill, but we do have a Statement of Intent from DCMS, explaining what the proposed legislation will contain. I thought it would be helpful to do a short briefing note based on my very quick assessment of the Statement. So here it is.

IT’S JUST AN ANNOUNCEMENT OF ALL THE THINGS THE UK WOULD HAVE TO IMPLEMENT ANYWAY UNDER EUROPEAN LAW

By which I mean, it proposes changes which will be happening anyway in May next year, when the General Data Protection Regulation becomes directly applicable, or which will be made under our obligation to implement the Law Enforcement Directive. In a little more detail, here are some things of passing interest, none of which is hugely unexpected.

As predicted by many, at page 8 it is announced that the UK will legislate to require parents to give consent to children’s access to information society services (i.e. online services) where the child is under 13 (rather than GDPR’s default 16). As the UK lobbied to give member states discretion on this, it is no surprise.

Exemptions from compliance with the majority of data protection law when the processing is for the purposes of journalism will remain (page 19). The Statement says that the government

believe the existing exemptions set out in section 32 strike the right balance between privacy and freedom of expression

But of potential note is the suggestion that

The main difference will be to amend provisions relating to the ICO’s enforcement powers to strengthen the ICO’s ability to enforce the re-enacted section 32 exemptions effectively

Without further details it is impossible to know what will be proposed here, but any changes to the existing regime which might have the effect of decreasing the size of the media’s huge carve-out will no doubt be vigorously lobbied against.

There is confirmation (at pp. 17 and 18) that third parties (i.e. not just criminal justice bodies) will be able to access criminal conviction information. Again, this is not unexpected – the regime for criminal records checks for employers etc. was unlikely to be removed.

The Statement proposes a new criminal offence of intentionally or recklessly re-identifying individuals from anonymised or pseudonymised data, something the Commons Science and Technology Committee has called for. Those who subsequently process such data will also be guilty of an offence. The details here will be interesting to see – as with most privacy-enhancing technology, in order for anonymisation to be robust it needs to be stress-tested, and such testing will not be effective if those undertaking it do so at risk of committing an offence, so presumably the forthcoming Bill will provide for this.
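By way of illustration only (the identifiers and field names below are invented), here is a minimal Python sketch of why such stress-testing matters: naive pseudonymisation of identifiers drawn from a small, guessable set can be reversed by simple brute force, and it is exactly this kind of probing that an over-broad offence could catch.

```python
# A minimal, purely illustrative sketch (invented identifiers throughout) of why
# anonymisation needs stress-testing: naive pseudonymisation of an identifier
# drawn from a small or guessable set can be reversed by brute force.
import hashlib

def pseudonymise(identifier):
    """Replace an identifier with its SHA-256 hash."""
    return hashlib.sha256(identifier.encode()).hexdigest()

# A "pseudonymised" record as it might appear in a shared dataset.
record = {"patient": pseudonymise("943 476 5919"), "diagnosis": "asthma"}

def re_identify(target_hash, candidate_identifiers):
    """Recover the original identifier by hashing every candidate value."""
    for candidate in candidate_identifiers:
        if pseudonymise(candidate) == target_hash:
            return candidate
    return None

# A stress-tester (or an attacker) who knows the identifier format just tries
# all plausible values: re-identification by exhaustion.
print(re_identify(record["patient"], ["123 456 7881", "943 476 5919"]))
```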

The Bill will also introduce an offence of altering records with intent to prevent disclosure following a subject access request. This will use the current mechanism at section 77 of the Freedom of Information Act 2000. Whether that section itself will be amended (time limits for prosecutions militate against its effectiveness) remains unknown.

I also note that the existing offence of unlawfully obtaining personal data will be widened to those who retain personal data against the wishes of the data controller, even where it was initially obtained lawfully. This will probably cover those situations where people gather or are sent personal data in error, and then refuse to return it.

There is one particular howler at page 21, which suggests the government doesn’t understand what privacy by design and privacy by default mean:

The Bill will also set out to reassure citizens by promoting the concept of “privacy by default and design”. This is achieved by giving citizens the right to know when their personal data has been released in contravention of the data protection safeguards, and also by offering them a clearer right of redress

Privacy by design/default is about embedding privacy protection throughout the lifecycle of a project or process; it has nothing at all to do with notifying data subjects of breaches. Whether this is a drafting error in the Statement or a fundamental misunderstanding, it is rather concerning that the government, which makes much of “innovation” (in the context of which privacy by design should be emphasised), fails to get this right.

So that’s a whistle-stop tour of the Statement, ignoring all the fluff about implementing things which are required under GDPR and the Directive. I’ll update this piece in due course, if anything else emerges from a closer reading.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, GDPR, Information Commissioner, journalism

On some sandy beach

[EDITED 25.07.17 to include references to “sandpits” in the report of the Deepmind Health Independent Review Panel]

What lies behind the Information Commissioner’s recent reference to “sandbox regulation”?

The government minister with responsibility for data protection, Matt Hancock, recently spoke to the Leverhulme Centre. He touched on data protection:

a new Data Protection Bill in this Parliamentary Session…will bring the laws up to date for the modern age, introduce new safeguards for citizens, stronger penalties for infringement, and important new features like the right to be forgotten. It will bring the EU’s GDPR and Law Enforcement Directive into UK law, ensuring we are prepared for Brexit.

All pretty standard stuff (let’s ignore the point that the “right to be forgotten”, such as it is, exists under existing law – a big clue to this being that the landmark case was heard by the CJEU in 2014). But Hancock went on to cite with approval some recent words of the Information Commissioner, Elizabeth Denham:

I think the ICO’s proposal of a data regulatory “sandbox” approach is very impressive and forward looking. It works in financial regulation and I look forward to seeing it in action here.

This refers to Denham’s recent speech on “Promoting privacy with innovation within the law”, in which she said

We are…looking at how we might be able to engage more deeply with companies as they seek to implement privacy by design…How we can contribute to a “safe space” by building a sandbox where companies can test their ideas, services and business models. How we can better recognise the circular rather than linear nature of the design process.

I thought this was interesting – “sandbox regulation” in the financial services sector involves an application to the Financial Conduct Authority (FCA) for the testing of “innovative” products that don’t necessarily fit into existing regulatory frameworks – the FCA will, where necessary, even waive rules and undertake not to take enforcement action.

That this model works for financial services does not, though, necessarily mean it would work when it comes to regulation of laws, such as data protection laws, which give effect to fundamental rights. When I made enquiries to the Information Commissioner’s Office (ICO) for further guidance on what Denham intends, I was told that they “don’t have anything to add to what [she’s] already said about engaging with companies to help implement privacy by design”.

The recent lack of enforcement action by the ICO against the Royal Free NHS Trust regarding its deal with Google Deepmind raised eyebrows in some circles: if the unlawful processing of 1.6 million health records (by their nature sensitive personal data) doesn’t merit formal enforcement, then does anything?

Was that a form of “sandbox regulation”? Presumably not, as it doesn’t appear that the ICO was aware of the arrangement before it took place. But if, as it seems to me, such regulation will involve a light-touch approach where innovation is involved, I really hope that the views and wishes of data subjects are not ignored. If organisations are going to play in the sand with our personal data, we should at the very least know about it.

**EDIT: I have had my attention drawn to references to “sandpits” in the Annual Report of the Deepmind Health Independent Review Panel:

We think it would be helpful if there was a space, similar to the ‘sandpits’ established by the Research Councils, which would allow regulators, the Department of Health and tech providers to discuss these issues at an early stage of product development. The protection of data during testing is an issue that should be discussed in a similar collaborative forum. We believe that there must be a mechanism that allows effective testing without compromising confidential patient information.

It would seem a bit of a coincidence that this report should be published around the same time Denham and Hancock were making their speeches – and I would argue that this only bolsters the case for more transparency from the ICO about how this type of collaborative regulation will take place.

And I notice that the Review Panel say nothing about involving data subjects in “product development”. Until “innovators” understand that data subjects are the key stakeholders in this, I don’t hold out much hope for the proper protection of rights.**

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, enforcement, human rights, Information Commissioner

An enforcement gap?

ICO wants 200 more staff for GDPR, but its Board think there’s a risk it will instead be losing them

The General Data Protection Regulation (GDPR) is, without doubt, a major reconfiguring of European data protection law. And quite rightly, in the lead-up to its becoming fully applicable on 25 May next year, most organisations are considering how best they can comply with its obligations, and, where necessary, effecting changes to achieve that compliance. As altruistic as some organisations are, a major driver for most is the fear that, under GDPR, regulatory sanctions can be severe. Regulators (in the UK this is the Information Commissioner’s Office (ICO)) will retain powers to force organisations to do, or to stop, something (equivalent to an enforcement notice under our current Data Protection Act 1998 (DPA)), but they will also have the power to levy civil administrative fines of up to €20 million, or 4% of annual global turnover, whichever is higher. Much media coverage has, understandably, if misleadingly, focused on these increased “fining” powers (the maximum monetary penalty under the DPA is £500,000). I use the word “misleadingly” because it is by no means clear that regulators will use the full fining powers available to them: GDPR provides regulators with many other options (see Article 58), and recital 129 in particular states that measures taken should be

appropriate, necessary and proportionate in view of ensuring compliance with this Regulation [emphasis added]

Commentators stressing the existence of these potentially huge administrative fines should be referred to these provisions of GDPR. 
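For what it is worth, the interaction of the two caps under Article 83(5) is simple arithmetic: for an undertaking, the applicable maximum is the greater of €20 million and 4% of worldwide annual turnover. A purely illustrative sketch, with invented turnover figures:

```python
# Purely illustrative: the GDPR Article 83(5) maximum for an undertaking is the
# greater of EUR 20 million and 4% of total worldwide annual turnover. The
# turnover figures below are invented.
def maximum_administrative_fine(annual_turnover_eur):
    return max(20_000_000, 0.04 * annual_turnover_eur)

print(maximum_administrative_fine(1_000_000_000))  # EUR 1bn turnover: cap of EUR 40m
print(maximum_administrative_fine(100_000_000))    # EUR 100m turnover: cap of EUR 20m
```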

But in the UK, at least, another factor has to be borne in mind, and that is the regulator’s capacity to enforce the law effectively. In March this year, the Information Commissioner herself, Elizabeth Denham, told the House of Lords EU Home Affairs Sub-Committee that, with the advent of GDPR, she was going to need more resources:

With the coming of the General Data Protection Regulation we will have more responsibilities, we will have new enforcement powers. So we are putting in new measures to be able to address our new regulatory powers…We have given the government an estimate that we will need a further 200 people in order to be able to do the job.

Those who rather breathlessly reported this with headlines such as “watchdog to hire hundreds more staff” seem to have forgotten the old parental adage of “I want doesn’t always get”. For instance, I want a case of ’47 Cheval Blanc delivered to my door by January Jones, but I’m not planning a domestic change programme around the possibility.

In fact, the statement by Denham might fall into a category best described as “aspirational”, or even “pie in the sky”, when one notes that the ICO Management Board recently received an item on corporate risk, the minutes from which state that

Concern was expressed about the risk of losing staff as GDPR implementation came closer. There remained a risk that the ICO might lose staff in large numbers, but to-date the greater risk was felt to be that the ICO could lose people in particular roles who, because of their experience, were especially hard to replace.

The ICO has long been based in the rather upmarket North West town of Wilmslow (the detailed and parochial walking directions from the railway station to the office have always rather amused me). There is going to be a limited pool of quality candidates there, and the ICO pays poorly: current vacancies show case officers being recruited at a starting salary of £19,527, and I strongly suspect case officers are the sort of extra staff Denham is looking at.

If ICO is worried about GDPR being a risk to staff retention (no doubt on the basis that better staff will get poached by higher paying employers, keen to have people on board with relevant regulatory experience), and apparently can’t pay a competitive wage, how on earth is it going to retain (or replace) them, and then recruit 200 more, from those sleepy Wilmslow recruitment fairs?

I write this blogpost, I should stress, not in order to mock or criticise Denham’s aspirations – she is absolutely right to want more staff, and to highlight the fact to Westminster. Rather, I write it because I agree with her, and because, unless someone stumps up some significant funding, I fear that the major privacy benefits that GDPR should bring for individuals (and the major sanctions against organisations for serious non-compliance) will not be realised.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, enforcement, GDPR, Information Commissioner, Uncategorized

Making even more criminals

Norfolk Police want your dashcam footage. Do you feel lucky, punk?

I wrote recently about the change to the Information Commissioner’s Office (ICO) registration process, which enables domestic users of CCTV to notify the ICO of that fact, and pay the requisite fee of £35. I noted that this meant that

it is the ICO’s apparent view that if you use CCTV in your household and capture footage outside the boundaries of your property, you are required to register this fact publicly with them, and pay a £35 fee. The clear implication, in fact the clear corollary, is that failure to do so is a criminal offence.

I didn’t take issue with the correctness of the legal position, but I went on to say that

The logical conclusion…here is that anyone who takes video footage anywhere outside their home must register

I even asked the ICO, via Twitter, whether users of dashcams should also register, to which I got the reply

If using dashcam to process personal data for purposes not covered by domestic exemption then would need to comply with [the Data Protection Act 1998]

This subject was moved from the theoretical to the real today, with news that Norfolk Constabulary are encouraging drivers using dashcams to send them footage of “driving offences witnessed by members of the public”.

Following the analyses of the courts, and the ICO, as laid out here and in my previous post, such usage cannot avail itself of the exemption from notification for processing of personal data “only” for domestic purposes, so one must conclude that drivers targeted by Norfolk Constabulary should notify, and pay a £35 fee.

At this rate, the whole of the nation would eventually notify. Fortunately (or not) the General Data Protection Regulation becomes directly applicable from May next year. It will remove the requirement to give notification of processing. Those wishing, then, to avoid the opprobrium of being a common criminal have ten months to send their fee to the ICO. Others might question how likely it is that the full force of the law will discover their criminality, and prosecute, in that short time period.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, GDPR, Information Commissioner, police, Uncategorized

Making criminals of us all

The Information Commissioner thinks that countless households operating CCTV systems need to register this, and pay a £35 fee for doing so. If they don’t, they might be committing a crime. The Commissioner is probably mostly correct, but it’s a bit more complex than that, for reasons I’ll explain in this post.

Back in 2014, to the surprise of no one who had thought about the issues, the Court of Justice of the European Union (CJEU) held that use of domestic CCTV to capture footage of identifiable individuals in public areas could not attract the exemption at Article 3(2) of the European data protection directive for processing of personal data

by a natural person in the course of a purely personal or household activity

Any use of CCTV, said the CJEU, for the protection of a house or its occupiers but which also captures people in a public space is thus subject to the remaining provisions of the directive:

the operation of a camera system, as a result of which a video recording of people is stored on a continuous recording device such as a hard disk drive, installed by an individual on his family home for the purposes of protecting the property, health and life of the home owners, but which also monitors a public space, does not amount to the processing of data in the course of a purely personal or household activity, for the purposes of that provision

As some commentators pointed out at the time, the effect of this ruling was potentially to place not just users of domestic CCTV systems under the ambit of data protection law, but also, say, car drivers using dashcams, cyclists using helmetcams, and many other people using image recording devices in public for anything but their own domestic purposes.

Under the directive, and the UK Data Protection Act 1998, any data controller processing personal data without an exemption (such as the one for purely personal or household activity) must register the fact with the relevant supervisory authority, which in the UK is the Information Commissioner’s Office (ICO). Failure to register in circumstances under which a data controller should register is a criminal offence punishable by a fine. There is a two-tier fee for making an entry in the ICO’s register, set at £35 for most data controllers, and £500 for larger ones.

For some time the ICO has advised corporate data controllers that if they use CCTV on their premises they will need to register:


But I recently noticed that the registration page itself had changed, and that there is now a separate button to register “household CCTV”:


If one clicks that button, one is taken to a page which explains that, indeed, a £35 fee is payable, and that the information provided will be published online:


There is a link to the ICO’s overarching privacy notice [ed. you’re going to have to tighten that up for GDPR, guys], but the only part of that notice which talks about the registration process relates only to “businesses”:


Continuing the household CCTV registration process, one then gets to the main screen, which requires that the responsible person in the household identify themselves as data controller, and give either their home or email address for publication:


What this all means is that it is the ICO’s apparent view that if you use CCTV in your household and capture footage outside the boundaries of your property, you are required to register this fact publicly with them, and pay a £35 fee. The clear implication, in fact the clear corollary, is that failure to do so is a criminal offence.

(In passing, there is a problem here: the pages and the process miss the point that, for registration to be required, the footage needs to capture images of identifiable individuals; otherwise no personal data is being processed, and data protection law is simply not engaged. What if someone has installed a “nest cam” in a nearby wooded area? Is the ICO saying they are committing a criminal offence if they fail to register this? Also, what if the footage does capture identifiable individuals outside the boundaries of a household, but is only taken for household, rather than crime reduction, purposes? The logical conclusion of the ICO pages here is that anyone who takes video footage anywhere outside their home must register, which contradicts their guidance elsewhere.)

What I find particularly surprising about all this is that, although fundamentally it is correct as a matter of law (following the Ryneš decision by the CJEU), I have seen no publicity from the ICO about this pretty enormous policy change. Imagine how many households potentially *should* register, and how many won’t? And, therefore, how many the ICO is implying are committing a criminal offence?

And one thing that is really puzzling me is why this change, now? The CJEU ruling was thirty months ago, and in another eleven months, European data protection law will change, removing – in the UK also – the requirement to register in these circumstances. If it was so important for the ICO to effect these changes before then, why keep it quiet?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Directive 95/46/EC, GDPR, Information Commissioner, Uncategorized

FOI enforcement – if not now, when?

Recent ICO decision notices show the Home Office and MoJ repeatedly simply failing to respond to FOI requests. Surely the time has come for ICO action?

The Information Commissioner’s Office (ICO) recently stated to me that they were not monitoring the Home Office’s and Ministry of Justice’s (MoJ) compliance with the statutory timescales required by section 10 of the Freedom of Information Act 2000 (FOIA).

This was despite the fact that they’d published decision notices about delays by those two government bodies which reported that “The delay in responding to this request will be logged as part of ongoing monitoring of the MoJ’s compliance with the FOIA”. This was not formal monitoring, I was told; rather, it was informal monitoring. Ah. Gotcha.

So what does trigger formal monitoring? Interestingly, the ICO’s own position on this has recently changed, and got a bit stricter. It’s generally meant to be initiated in the following circumstances:

our analysis of complaints received by the ICO suggests that we have received in the region of 4 to 8 or more complaints citing delays within a specific authority within a six month period

(for those authorities which publish data on timeliness) – it appears that less than 90% of requests are receiving a response within the appropriate timescales. [this used to be 85%]

Evidence of a possible problem in the media, other external sources or internal business intelligence.

Despite the apparent increase in robustness of approach, the ICO do not appear to be monitoring any public authorities at the moment. The last monitoring took place between May and July 2016 when Trafford Council were in their sights. Although they are not mentioned in the relevant report, an ICO news item from July last year says that the Metropolitan Police, who have been monitored off and on for a period of years without any real outward signs of improvement, were also still being monitored.

But if they aren’t monitoring the compliance of any authorities at the moment, and in particular not the Home Office and the MoJ, one is led to wonder why, when one notes the pattern in recent ICO decision notices involving those two authorities. In 16 of the last 25 decision notices involving the Home Office, and 6 of the last 25 involving the MoJ, the ICO has formally found that the authority had still failed to respond to the FOI request in question by the time the decision notice was issued.

At this point, it might be helpful to explain the kind of chronology and process that would lead up to the issuing of such decision notices. First, a request must be made, and there will have been a failure by the authority to reply within twenty working days. Then, the requester will normally (before the ICO will consider the case) have had to ask for an internal review by the authority of its handling of the request. Then, the requester will have complained to the ICO. Then, the ICO will have normally made informal enquiries of the authority, effectively “geeing” them up to provide a response. Then, as still no response will have been sent, the ICO will have moved to issuing a formal decision notice. At any point in this process the authority could (and should) still respond to the original request, but no – in all of these cases (again – 16 of the last 25 Home Office decisions, 6 of the last 25 MoJ ones) the authorities have still not responded many months after the original request. Not only does this show apparent contempt for the law, but also for the regulator.
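(As an aside, the twenty-working-day clock in section 10 is easy enough to calculate; the rough Python sketch below counts only Mondays to Fridays and, for simplicity, ignores bank holidays, which in practice are also excluded from the count.)

```python
# Illustrative only: the section 10 FOIA deadline of twenty working days
# following receipt of a request. This naive sketch counts Monday to Friday
# and ignores public holidays, which in practice must also be excluded.
from datetime import date, timedelta

def foia_deadline(date_received, working_days=20):
    day = date_received
    counted = 0
    while counted < working_days:
        day += timedelta(days=1)
        if day.weekday() < 5:  # Monday is 0, Friday is 4
            counted += 1
    return day

print(foia_deadline(date(2017, 6, 1)))  # e.g. a request received on 1 June 2017
```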

So why does the ICO not do more? I know many FOI officers (and their public authority employers) who work their socks off to make sure they respond to requests in a timely manner. In the absence of formal monitoring of (let alone enforcement action against) those authorities who seem to ignore their legal duties much of the time, those FOI officers would be forgiven for asking why they bother: it is to their credit that bother they still do.

Elizabeth Denham became Information Commissioner in July last year, bringing with her an impressive track record and making strong statements about enforcing better FOI compliance. Her first few months, with GDPR and Brexit to deal with, will not have been easy, and she could be forgiven for not having had the time to focus on FOI, but the pressing question now surely is “if not now, when?”

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Freedom of Information, Home Office, Information Commissioner

Why what Which did wears my patience thin

Pre-ticked consent boxes and unsolicited emails from the Consumers’ Association

Which?, the brand name of the Consumers’ Association, publishes a monthly magazine. In an era of social media and online reviews, its mix of consumer news and product ratings might seem rather old-fashioned, but it is still (according to its own figures[1]) Britain’s best-selling monthly magazine. Its rigidly paywalled website means that one must generally subscribe to get at the magazine’s contents. That’s fair enough (although after my grandmother died several years ago, we found piles of unread, unopened even, copies of Which? She had apparently signed up to a regular Direct Debit payment, probably to receive a “free gift”, and had never cancelled it: so one might draw one’s own conclusion about how many of Which?’s readers are regular subscribers for similar reasons).

In line with its general “locked-down” approach, Which?’s recent report into the sale of personal data was, except for snippets, not easy to access, but it got a fair bit of media coverage. Intrigued, I bit: I subscribed to the magazine. This post is not about the report, however, although the contents of the report drive the irony of what happened next.

As I went through the online sign-up process, I arrived at that familiar type of page where the subject of future marketing is broached. Which? had headlined their report “How your data could end up in the hands of scammers”, so it struck me as amusing, but also irritating, that the marketing options section of the sign-up process came with a pre-ticked box:

[screenshot: the pre-ticked marketing options box in the Which? sign-up process]

As guidance from the Information Commissioner’s Office makes clear, pre-ticked boxes are not a good way to get consent from someone to future marketing:

Some organisations provide pre-ticked opt-in boxes, and rely on the user to untick it if they don’t want to consent. In effect, this is more like an opt-out box, as it assumes consent unless the user clicks the box. A pre-ticked box will not automatically be enough to demonstrate consent, as it will be harder to show that the presence of the tick represents a positive, informed choice by the user.
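For illustration only (the field and function names below are invented, and this is emphatically not Which?’s actual sign-up code), a flow that wanted to be able to demonstrate consent would render the box unticked by default and treat consent as given only where the user has positively ticked it:

```python
# Illustrative sketch only: recording marketing consent as a positive opt-in.
# Field names are invented; this is not Which?'s actual sign-up code.

def marketing_consent_given(form_data):
    """Treat consent as given only if the user actively ticked the box.

    A browser typically omits an unticked checkbox from the submitted form
    entirely, so absence means "no consent". Crucially, the rendered box must
    default to unticked: pre-ticking it and relying on the user to untick it
    would assume consent rather than demonstrate it.
    """
    return form_data.get("marketing_opt_in") == "on"

# Example submissions:
print(marketing_consent_given({"email": "a@example.com"}))                            # False
print(marketing_consent_given({"email": "a@example.com", "marketing_opt_in": "on"}))  # True
```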

The Article 29 Working Party goes further, saying in its opinion on unsolicited communications for marketing purposes that inferring consent to marketing from the use of pre-ticked boxes is not compatible with the data protection directive. By extension, therefore, any marketing subsequently sent on the basis of a pre-ticked box will be a contravention of the data protection directive (and, in the UK, the Data Protection Act 1998) and the ePrivacy directive (in the UK, the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR)).

Notwithstanding this, I certainly did not want to consent to receive subsequent marketing, so, as well as making a smart-arse tweet, I unticked the box. However, to my consternation, if not my huge surprise, I have subsequently received several marketing emails from Which? They do not have my consent to send these, so they are manifestly in contravention of regulation 22 of PECR.

It’s not clear how this has happened. Could it be a deliberate tactic by Which? to ignore subscribers’ wishes? One presumes not: Which? says it “exists to make individuals as powerful as the organisations they deal with in their daily lives” – deliberately ignoring clear expressions regarding consent would hardly sit well with that mission statement. So is it a general website glitch, which means that those expressions are lost in the sign-up process? If so, how many individuals are affected? Or is it just a one-off glitch, affecting only me?

Let’s hope it’s the last. Because the ignoring or overriding of expressions of consent, and the use of pre-ticked boxes for gathering consent, are some of the key things which fuel the trade in, and disrespect for, personal data. The fact that I’ve experienced this issue with a charity which exists to represent consumers, as a result of my wish to read their report into the misuse of personal data, is shoddy, to say the least.

I approached Which? for a comment, and a spokesman said:

We have noted all of your comments relating to new Which? members signing up, including correspondence received after sign-up, and we are considering these in relation to our process.

I appreciate the response, although I’m not sure it really addresses my concerns.

[1] Which? Annual Report 2015/2016

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under consent, Data Protection, Directive 95/46/EC, Information Commissioner, marketing, PECR, spam, subject access

Data Protection distress compensation for CCTV intrusion

The Information Commissioner’s Office (ICO) recently (2 February) successfully prosecuted a business owner for operating CCTV without an appropriate notification under section 18 of the Data Protection Act 1998 (DPA), announcing:

Businesses could face fines for ignoring CCTV data protection law

But a recent case in the Scottish Sheriff Court shows that CCTV and data protection can also have relevance in private law civil proceedings. In Woolley against Akbar [2017] ScotsSC 7 the husband and wife pursuers (equivalent to claimants in England and Wales) successfully brought a claim for compensation for distress caused by the defender’s (defendant in England and Wales) use of CCTV cameras which were continuously recording video and audio, and which were deliberately set to cover the pursuers’ private property (their garden area and the front of their home). Compensation was assessed at £8634 for each of the pursuers (so £17268 in total) with costs to be assessed at a later date.

Two things are of particular interest to data protection fans: firstly, the willingness of the court to rule unequivocally that CCTV operated in non-compliance with the DPA Schedule One principles was unlawful; and secondly, the award of compensation despite the absence of pecuniary damage.

The facts were that Mr and Mrs Woolley own and occupy the upper storey of a dwelling, while Mrs Akbar owns and operates the lower storey as a guest house, managed by her husband, Mr Akram. In 2013 the relationship between the parties broke down. Both parties have installed CCTV systems, but while the pursuers’ system monitors only their own property, the same could not be said of the defender’s:

any precautions to ensure that coverage of the pursuers’ property was minimised or avoided. The cameras to the front of the house record every person approaching the pursuers’ home. The cameras to the rear were set deliberately to record footage of the pursuers’ private garden area. There was no legitimate reason for the nature and extent of such video coverage. The nature and extent of the camera coverage were obvious to the pursuers, as they could see where the cameras were pointed. The coverage was highly intrusive…the defender also made audio recordings of the area around the pursuers’ property…they demonstrated an ability to pick up conversations well beyond the pursuers’ premises. There are four audio boxes. The rear audio boxes are capable of picking up private conversations in the pursuers’ rear garden. Mr Akram, on one occasion, taunted the pursuers about his ability to listen to them as the pursuers conversed in their garden. The defender and Mr Akram were aware of this at all times, and made no effort to minimise or avoid the said audio recording. The nature of the coverage was obvious to the pursuers. Two audio boxes were installed immediately below front bedroom windows. The pursuers feared that conversations inside their home could also be monitored. The said coverage was highly intrusive.

Although, after the intervention of the ICO, the defender realigned the camera at the rear of the property, Sheriff Ross held that the coverage “remains intrusive”. Fundamentally, the sheriff held that the CCTV use was unfair (in breach of the first data protection principle), that the amount of data captured was excessive (in breach of the third data protection principle), and that the footage was retained for too long (in breach of the fifth data protection principle).

The sheriff noted that, by section 13(2) of the DPA, compensation for distress can only be awarded if the pursuer has suffered “damage”, which was not the case here. However, the sheriff further correctly noted, and was no doubt taken to, the decision of the Court of Appeal in Vidal-Hall & Ors v Google [2015] EWCA Civ 311 in which the court struck down section 13(2) as being incompatible with the UK’s obligations under the European data protection directive and the Charter of Fundamental Rights (my take on Vidal Hall is here). Accordingly, “pure” distress compensation was available.

Although the facts here show a pretty egregious breach of the DPA, it is good to see a court understanding and assessing the issues so well, no doubt assisted in doing so by Paul Motion, of BTO Solicitors, who appeared for the pursuers.

One niggle I do have is about the role of the ICO in all this: they were clearly apprised of the situation, and could surely have taken enforcement action to require the stopping of the CCTV (although admittedly the ICO cannot make an award of compensation). It’s not clear to me why they didn’t.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under damages, Data Protection, Information Commissioner

ICO FOI Decision Notices – insufficient attention to detail?

Anyone used to reading Freedom of Information Act 2000 (FOIA) decision notices from the Information Commissioner’s Office (ICO) will be familiar with this sort of wording:

The Commissioner has concluded that the public interest favours maintaining the exemption contained at section x(y) of FOIA. In light of this decision, the Commissioner has not gone on to consider the public authority’s reliance on section z(a) of FOIA.

In fact, a search on the ICO website for the words “has not gone on” throws up countless examples.

What lies behind this approach is this: a public authority, in refusing to disclose recorded information, is entitled to rely on more than one of the FOIA exemptions, because information might be exempt under more than one. An obvious example would be where information exempted from disclosure for the purposes of safeguarding national security (section 24 FOIA) would also be likely to be exempt under section 31 (law enforcement).

One assumes that the ICO does this for pragmatic reasons – if information is exempt it’s exempt, and application of a further exemption in some ways adds nothing. Indeed, the ICO guidance for public authorities advises

you [do not] have to identify all the exemptions that may apply to the same information, if you are content that one applies

Now, this is correct as a matter of law (section 78 of FOIA makes clear that, as a general principle, reliance by public authorities upon the Act’s exemptions is discretionary), and the ICO’s approach when making decisions is understandable, but it is also problematic, and a recent case heard by the Information Tribunal illustrates why.

In Morland v IC & Cabinet Office (EA/2016/0078) the Tribunal was asked to determine an appeal from Morland, after the Cabinet Office had refused to disclose to him minutes of the Honours and Decorations Committee, and after the ICO had upheld the refusal. As the Tribunal noted

The Cabinet Office refused the Appellant’s information request in reliance upon s. 37 (1) (b) and s. 35 (1) (a) of the Freedom of Information Act 2000 (“FOIA”) [and the ICO] Decision Notice found (at paragraph 13) that the exemption under s. 37 (1) (b) was engaged by the request and (at paragraph 25) that the public interest favoured maintaining the exemption “by a narrow margin”. The Decision Notice expressly did not consider the Cabinet Office’s reliance on s. 35 (1) (b). [emphasis added]

The problem arose because the Tribunal found that, pace the ICO’s decision, the exemption at section 37(1)(b) was not engaged (because that section creates an exemption from disclosure if the information relates to the conferring by the Crown of an honour or dignity, and the information request related to whether an entirely new honour should be created). But what of the exemption at s35(1)(a)? Well, although it would not always be the case in similar circumstances, here the Tribunal and the parties were in a bind, because, as the Tribunal said

We are left with a situation where, as the Decision Notice did not reach a conclusion on that issue, none of the parties appear to have regarded s. 35 (1) (a) as being seriously in play in this appeal, with the effect that we have received limited argument on that issue

There is no power to remit a decision to the ICO (see IC v Bell [2014] UKUT 0106 (AAC), considered in a Panopticon blog post here), so the Tribunal had to make findings in relation to s35, despite a “concern whether it is right to do so”. On the expressly limited evidence before it, it found that the exemption was not engaged at the time of the request and, accordingly, upheld Morland’s appeal, saying that it

[regarded] the failure of the Decision Notice to determine a key issue between the parties as rather unsatisfactory

Whether this will lead the ICO to revisit its apparent policy of, at least at times, focusing on only one of multiple claimed exemptions remains to be seen. It’s not often that I have sympathy with the Cabinet Office when it comes to matters of FOIA, but there is a modicum here.

Nonetheless, I think what this case does suggest is that a public authority should, when faced with an appeal of an ICO Decision Notice upholding a FOIA refusal, give strong consideration to whether it needs to be joined to the appeal (as, admittedly, the Cabinet Office was here) and to make sure that its response to the appeal (under part 27 of the Tribunal Rules) fully deals with all applicable exemptions, notwithstanding the contents of the Decision Notice. In this way, the Tribunal can, where necessary, take as fully-apprised a decision as possible on all of those exemptions.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Freedom of Information, Information Commissioner, Information Tribunal