Tag Archives: data protection

ICO statutory duty to promote economic growth

From time to time I can be a bit critical of the Information Commissioner’s Office (ICO). Indeed, in the past I may have criticised them for appearing to promote things, or exercise their functions, in a way that went beyond their core role. For instance, I may have queried why they frequently appear to be cheer-leading for innovation and digital economic expansion (not that I think those things are inherently to be avoided).

But it’s important to note that their functions are not limited to regulation of specific laws. Rather, under section 108 of the Deregulation Act 2015, and the Economic Growth (Regulatory Functions) Order 2017 made under that Act, the Information Commissioner, as well as a host of other regulators, has a statutory duty to exercise her regulatory functions (other than those under FOIA, interestingly) with regard to the desirability of promoting economic growth. In particular, she has to consider the importance, for the promotion of economic growth, of exercising the regulatory function in a way which ensures that regulatory action is taken only when it is needed, and that any action taken is proportionate.

Additionally, under section 110 of the Deregulation Act 2015, the ICO (and other regulators) must also have regard to this guidance: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/603743/growth-duty-statutory-guidance.pdf

When people (again, I should include myself) question, for instance, the paucity in the UK of low-level GDPR fines for low-level infringements, they should take into account these provisions.

Whether this aspect of the Deregulation Act 2015 is actually reconcilable with the provisions of the GDPR (and, now, the UK GDPR) is a separate question. In principle, there need not be a clash between the promotion of economic growth and the regulation of compliance with the duty to observe the fundamental right to protection of personal data, but in practice, such clashes tend to occur.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Information Commissioner, Uncategorized

Dashcams and domestic purposes

What do people use dashcams and cameras on cycle helmets for? I’m sure that some (especially in the latter group) use them to capture footage of interesting journeys they have made. But a considerable proportion of users – surely – use them in case the user is involved in a road traffic incident. Indeed the “National Dash Cam Safety Portal”, although provided by a commercial organisation selling cameras, is operated in partnership with, and enables upload of footage to, police forces in England and Wales, and its FAQs clearly inform people of the evidential nature and implications of such footage. And a recent piece on the “Honest John” website suggests that one in four dashcam submissions results in a prosecution. Whatever the intentions were of the people who used those dashcams to record that footage, it is undeniable that the outcome of the processing of personal data involved had a significant effect on the rights of those whose data was processed.

Article 2 of the UK GDPR says that the law’s scope does not extend to processing of personal data “by a natural person in the course of a purely personal or household activity”, and the case law of the Court of Justice of the European Union (at least insofar as such case law decided before 1 January 2021 is retained domestic law – unless departed from by the Court of Appeal or the Supreme Court) makes clear that use of recording cameras which capture footage containing personal data outwith the orbit of one’s property cannot claim this “purely personal or household activity” exemption (see, in particular the Ryneš case).

Yet the position taken by the authorities in the UK (primarily by the Information Commissioner’s Office (ICO)) largely fails to address the difficult issues arising. Because if the use of dashcams and helmet cams, when it results in the processing of personal data which is not exempt under the “purely personal or household activity” exemption, is subject to data protection law, then those operating them are, in principle at least, obliged to comply with all the relevant provisions of the UK GDPR, including: compliance with the Article 5 principles; providing Article 13 notices to data subjects; and complying with data subject requests for access, erasure, etc. (under Articles 15 and 17).

But the ICO, whose CCTV guidance deals well with the issues insofar as domestic CCTV is concerned, implies that use of dashcams etc., except in a work context, is not subject to the UK GDPR. For instance, its FAQs on registering as a data protection fee payer say “the use of the dashcam in or on your vehicle for work purposes will not be considered as ‘domestic’ and therefore not exempt from data protection laws”. It is very difficult to reconcile the ICO’s position here with the case law as exemplified in Ryneš.

And what raises interesting questions for me is the evidential status of this dashcam and helmet cam footage, when used in prosecutions. Although English law has traditionally tended to take the approach that evidence should be admitted where it is relevant, rather than excluding it on the grounds that it has been improperly obtained (the latter being a species of the US “fruit of the poisoned tree” doctrine), it is surely better for a court not to be faced with a situation where evidence may have been obtained in circumstances involving illegality.

If this was a passing issue, perhaps there would not need to be too much concern. However, it is clear that use of mobile video recording devices (and use of footage in criminal, and indeed civil, proceedings) is increasing and will continue to do so, at the same time as access to such devices, and the possibility for their covert or surreptitious use, also increases. It is, no doubt, a tremendously tricky area to regulate, or even to contemplate regulating, but that is no reason for the ICO to duck the issue.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under CCTV, crime, Data Protection, Information Commissioner, police

Start the DSAR countdown (but how?)

A while ago I wrote a piece on the Mishcon de Reya website pointing out that the Information Commissioner’s Office (ICO) had silently changed its guidance on how to calculate the “one month” timescale for responding to a subject access request under the General Data Protection Regulation (or “GDPR” – which is now domestic law in the form of the amended retained version of the GDPR, aka “UK GDPR”).

The nub of that piece was that the ICO (following the legal precedents) was now saying that “You should calculate the time limit from the day you receive the request”. Which was a change from the previous position that “You should calculate the time limit from the day after you receive the request”.

I have noticed, however, that, although the ICO website, in its UK GDPR guidance, maintains that the clock starts from the date of receipt, the guidance on “Law Enforcement Processing” (which relates to processing of personal data by competent authorities for law enforcement purposes under part 3 of the Data Protection Act 2018 (DPA), which implemented the Law Enforcement Directive) states that the time should be calculated

from the first day after the request was received

It’s not inconceivable (in fact I am given to understand it is relatively common) that some controllers might receive a subject access request (or other data subject request) which must be dealt with under both the UK GDPR and the Law Enforcement Processing provisions (police forces are a good example of this). The ICO’s position means that such a controller must calculate the response time as starting, on the one hand, on the date of receipt, and, on the other hand, on the day after the date of receipt.
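To make the difference concrete, here is a rough sketch – in Python, purely by way of my own illustration, since neither the ICO guidance nor the legislation expresses the calculation in code – of how a controller handling both types of request might work out the two deadlines. It assumes the “corresponding date” approach reflected in the ICO guidance: the period ends on the corresponding calendar date in the following month or, where no such date exists, on the last day of that month.

```python
from datetime import date, timedelta
import calendar


def one_month_on(start: date) -> date:
    """Corresponding calendar date one month after `start`, falling back to the
    last day of the following month if no such date exists (e.g. 31 January)."""
    year = start.year + start.month // 12
    month = start.month % 12 + 1
    last_day = calendar.monthrange(year, month)[1]
    return date(year, month, min(start.day, last_day))


received = date(2020, 8, 31)  # hypothetical date of receipt of the request

# UK GDPR request (per current ICO guidance): the clock starts on the day of receipt
uk_gdpr_deadline = one_month_on(received)

# Part 3 DPA 2018 (law enforcement processing) request (per ICO guidance):
# the clock starts on the day after receipt
le_deadline = one_month_on(received + timedelta(days=1))

print(uk_gdpr_deadline)  # 2020-09-30 (there is no 31 September)
print(le_deadline)       # 2020-10-01
```

The detail of the code matters less than the point it illustrates: on the ICO’s own reading, the very same request attracts two different statutory deadlines depending on which regime governs it.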

And if all of this sounds a bit silly, and inconsequential, I would argue that it is certainly the former, but not necessarily the latter: failure to comply within a statutory timescale is a breach of a statutory duty, and therefore actionable, at least in principle. If the ICO really does believe that the timescale works differently under different legal schemes, then how, for instance, can it properly determine (as it must, when required to) under Articles 57(1)(f) and 77(1) of the UK GDPR, or section 51(2) of the DPA, whether there has been a statutory infringement?

Statutory infringements are, after all, potentially actionable (in this instance either with regulatory action or private action by data subjects) – the ICO maintains a database of complaint cases and publishes some of this (albeit almost two years in arrears), and also uses (or may use) it to identify trends. If the ICO finds that a controller has committed a statutory infringement, that is a finding of potential significance: if that same finding is based on an unclear, and internally contradictory, interpretation of a key aspect of the law, then it is unlikely to be fair, and unlikely to be lawful.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Data Protection Act 2018, GDPR, Information Commissioner, subject access, UK GDPR, Uncategorized

Tony Abbott hacking and data protection offences

The story about the hacking of Tony Abbott’s travel and other personal details, after he foolishly posted a picture of a flight boarding pass on social media, is both amusing and salutary (salutary for Abbott, and, I would suggest, for Qantas and any other airline which prints boarding passes with similar details). What is also interesting to consider is whether, if this hacking had occurred in the UK, it might have constituted an offence under data protection law.

Under section 170(1)(a) and 170(1)(c) of the Data Protection Act 2018 it is an offence for a person knowingly or recklessly…to obtain or disclose personal data without the consent of the controller, and also an offence for a person knowingly or recklessly…after obtaining personal data, to retain it without the consent of the person who was the controller in relation to the personal data when it was obtained.

There is at least an argument that this would have been a knowing obtaining of personal data without the consent of the controller (whether that controller was Qantas, or Abbott himself).

There are defences to both of these where the person can prove that the obtaining, disclosure, retaining etc. was in the particular circumstances, justified as being in the public interest.

Also, and this may be engaged here, it is a defence if the person acted for journalistic purposes, with a view to the publication by a person of any journalistic, academic, artistic or literary material, and in the reasonable belief that in the particular circumstances the obtaining, disclosing, retaining etc. was justified as being in the public interest. One does not have to be a paid journalist, or journalist by trade, to rely on this defence.

Prosecution in both cases may only be brought by the Information Commissioner, or with the consent of the Director of Public Prosecutions. The offences are triable either way, and punishable by an unlimited fine.

I write all this not to condemn the “hacker”, nor to condone Abbott. However, it is worth remembering that similar hacking, in the UK at least, is not without its risks.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, offences

GDPR’s scope – does it extend to China?

The answer to the question in the title is, of course, “yes”, if the processing in question is of personal data of data subjects in the EU, by a controller outside the EU, and related to the monitoring of data subjects’ behaviour as far as their behaviour takes place within the Union.

So, the activities of Zhenhua Data, in compiling its Overseas Key Individual Database, as described in The Mail, will be squarely within the scope of Article 3(2) of the General Data Protection Regulation (GDPR):

Boris Johnson and the Queen are among 40,000 Britons listed on a database compiled by a Chinese tech firm with reported links to Beijing’s military and intelligence networks, it can be disclosed.

Files on senior British politicians including the Prime Minister, members of the Royal Family, UK military officers and their families, and religious leaders are currently being stored by Zhenhua Data, a technology company based in Shenzhen, China as part of a ‘global mass surveillance system on an unprecedented scale’.

It seems difficult to imagine that the processing can possibly comply with GDPR. Where is the Article 14 notice? What is the Article 6 legal basis? Or the Article 9 exception to the general prohibition on processing special categories of data? Or the Article 30 record of processing activities? Or…or…or…?

But here’s the problem with any legislative attempt to extend the scope of laws beyond geographical and jurisdictional borders, to the activities of those who are not consulted, nor assigned rights, nor (in all likelihood) bothered: how does one enforce those laws? In 2018 (oh those heady early GDPR days!) the Information Commissioner’s Office (ICO) was reported to have told the Washington Post that its practice of only allowing those who paid for its premium subscription to refuse tracking cookies was unlawful. How many figs the WaPo gave is evidenced by a glance at its current subscription model:

(i.e. it appears to have changed nothing.)

Indeed, as the ICO said at the time

We hope that the Washington Post will heed our advice, but if they choose not to, there is nothing more we can do in relation to this matter

If there was nothing ICO could do against a newspaper outside the jurisdiction, consider how unrealistic is the idea that it might enforce against a Chinese company rumoured to work for the Chinese military, and which is said to view its mission as ‘using big data for the “great rejuvenation of the Chinese nation”‘.

The logical question which arises, though, is this – in the absence of an effective regulatory scheme to enforce them, what exactly is the point of GDPR’s (or, even more trenchantly, the UK GDPR’s) extra-territorial scope provisions?

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, enforcement, Europe, GDPR, Information Commissioner

If ICO won’t regulate the law, it must reboot itself

The exercise of the right of (subject) access under Article 15 of the General Data Protection Regulation (GDPR) is the exercise of a fundamental right to be aware of and verify the lawfulness of the processing of personal data about oneself.

That this is a fundamental right is emphasised by the range of enforcement powers available to the Information Commissioner’s Office (ICO) against those controllers who fail to comply with their obligations in response to an access request. These include the power to impose administrative fines of up to €20m, but, more prosaically, the power to order the controller to comply with the data subject’s requests to exercise his or her rights. This, surely, is a basic function of the ICO – the sort of regulatory action which underlines its existence. This, much more than operating regulatory sandboxes, or publishing normative policy papers, is surely what the ICO is fundamentally there to do.

Yet read this, a letter shown to me recently which was sent by ICO to someone complaining about the handling of an access request:


Dear [data subject],

Further to my recent correspondence, I write regarding the way in which [a London Borough] (The Council) has handled your subject access request.

I have contacted the Council and from the evidence they have provided to me, as stated before, it appears that they have infringed your right to access under the GDPR by failing to comply with your SAR request. However, it does not appear as though they are willing to provide you with any further information and we have informed them of our dissatisfaction with this situation.

It is a requirement under the Data protection Act 2018 that we investigate cases to the ‘extent appropriate’ and after lengthy correspondence with the Council, it appears they are no longer willing co-operate with us to provide this information. Therefore, you may have better results if you seek independent legal advice regarding the matters raised in this particular case.

Here we have the ICO telling a data subject that it will not take action against a public authority data controller which has infringed her rights by failing to comply with an access request. Instead, the requester must seek her own legal advice (almost inevitably at her own significant cost).

Other controllers might look at this and wonder whether they should bother complying with the law, if no sanction arises for failing to do so. And other data subjects might look at it and wonder what is the point in exercising their rights, if the regulator will not enforce them.

This is the starkest single example in a growing body of evidence that the ICO is failing to perform its basic tasks of regulation and enforcement.

It is just one data subject, exercising her right. But it is a right which underpins data protection law: if you don’t know and can’t find out what information an organisation has about you, then your ability to exercise other rights is stopped short.

The ICO should reboot itself. It should, before and above all else, perform its first statutory duty – to monitor and enforce the application of the GDPR.

I don’t understand why it does not want to do so.

[P.S. I think the situation described here is different, although of the same species, to situations where ICO finds likely non-compliance but declines to take punitive action – such as a monetary penalty. Here, there is a simple corrective regulatory power available – an enforcement notice (essentially a “steps order”) under section 149 of the Data Protection Act 2018.]

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under access to information, Data Protection, GDPR, human rights, Information Commissioner

ICO – fines, what fines?

No surprise…but ICO has only issued four notices of intent to serve a fine since GDPR came into application (and one fine)

I made a quick Freedom of Information Act (FOIA) request a few weeks ago to the Information Commissioner’s Office (ICO), asking

since 25 May 2018
1) how many notices of intent have been given under paragraph 2(1) of schedule 16 to the Data Protection Act 2018?
2) How many notices of intent given under 1) have not resulted in a monetary penalty notice being given (after the period of 6 months specified in paragraph 2(2) of the same schedule to same Act)?

I have now (4 September) received a response, which says that only four notices of intent have been issued in that time. Three of those are well known: one was in respect of Doorstep Dispensaree (which has since received an actual fine – the only one issued under GDPR – of £275,000); two are in respect of British Airways and of Marriott Inc., which have become long-running, uncompleted sagas; the identity of the recipient of the final one is not known at the time of writing.

The contrast with some other European data protection authorities is stark: in Spain, around 120 fines have been issued in the same time; in Italy, 26; in Germany (which has separate authorities for its individual regions), 26 also.

Once again, questions must be asked about whether the aim of the legislator, in passing GDPR, to homogenise data protection law across the EU, has been anywhere near achieved.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Data Protection Act 2018, GDPR, Information Commissioner, monetary penalty notice

One-stop shop starts to get interesting

The disagreement between the EU supervisory authorities over an Irish DPC draft decision could mark the start of a line of cases which the EDPB will need to resolve –  and maybe resolve to the consternation of the DPC, and Big Tech

As the UK hurtles backwards, blindfolded and with both arms tied behind its back, towards the precipice that is the end of the Brexit implementation period (31 December), and with no sign that the government is particularly pushing for an adequacy decision for the UK, it hardly seems worth it (the ICO is, for instance, already no longer a member) to analyse the implications of the news that the European Data Protection Board (EDPB) is being required to take its first binding decision pursuant to Article 65 of GDPR.

But I’m going to.

The Article 65 process has been triggered because an unspecified number of other supervisory authorities have raised objections (as they are entitled to) to the draft decision of the Irish Data Protection Commissioner (DPC) – the lead supervisory authority – in its investigation of whether Twitter (more correctly “Twitter International Company”) complied with its personal data breach obligations under Article 33 of GDPR, in relation to a notification it made to the DPC in November 2018. In line with Articles 56 and 60, the DPC submitted its draft decision in May of this year. As this was a case involving cross-border processing, the DPC was required to cooperate with the other supervisory authorities concerned. One assumes, given the controller involved, that this meant the supervisory authorities of all member states. One also assumes that most complaints involving Big Tech (many of whom tend to base their European operations in Ireland, thus making the DPC the default lead supervisory authority) will similarly engage the supervisory authorities of all member states. The DPC already has many such complaint investigations, and, courtesy of civil society groups like “NOYB“, it is likely to continue to get many more.

Article 65 provides that where another supervisory authority “has raised a relevant and reasoned objection” to a draft decision of the lead supervisory authority, and the latter then doesn’t agree, then the EDPB must step in to consider the objection. The EDPB then has one month (two if the subject matter is complex) to reach a two-thirds majority decision, or, failing that, within a further two weeks, to reach a simple majority decision. The decision is binding on all the supervisory authorities.
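Purely by way of illustration (the periods below simply restate what Article 65(2) and (3) say, and the function – including its crude approximation of a “month” as 30 days – is my own sketch, not anything found in the Regulation), the timetable looks something like this:

```python
from datetime import date, timedelta


def edpb_decision_windows(referral: date, complex_matter: bool = False) -> dict:
    """Rough sketch of the Article 65 timescales.

    Article 65(2): a binding decision by two-thirds majority within one month of
    referral, extendable by a further month where the matter is complex.
    Article 65(3): failing that, a simple majority decision within a further two weeks.
    """
    months = 2 if complex_matter else 1
    # "One month" is approximated as 30 days purely for illustration; the
    # Regulation itself speaks in months and weeks, not days.
    two_thirds_deadline = referral + timedelta(days=30 * months)
    simple_majority_deadline = two_thirds_deadline + timedelta(days=14)
    return {
        "two_thirds_majority_by": two_thirds_deadline,
        "simple_majority_by": simple_majority_deadline,
    }


# Hypothetical referral date for the DPC's Twitter draft decision
print(edpb_decision_windows(date(2020, 8, 1), complex_matter=True))
```

Whatever the precise dates, the structure is clear: the clock is short, and the decision, once taken, binds every supervisory authority concerned – including the DPC.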

And here’s where it gets interesting.

Because it must mean that, in circumstances where the EDPB agrees with an objection, the lead supervisory authority will be bound to accept a decision it probably still does not agree with, and determine the substantive matter accordingly. In the context of the DPC, and its jurisdiction over the European processing of the world’s largest technology companies, this sounds like it might be a lot of fun. There are many supervisory authorities on the EDPB who take a substantially harder line than the DPC – if they end up being part of a simple majority which results in a “robust” binding decision, fur might well fly.

The controller being investigated appears to be able to challenge the EDPB’s decision by way of judicial review under Article 263 of the Treaty on the Functioning of the European Union. There is no direct route of appeal under the GDPR. But presumably an aggrieved controller may also potentially challenge the lead supervisory authority’s decision (which, remember, the latter might essentially disagree with) through the domestic courts, perhaps to the point where a referral to the CJEU could then also be made.

No doubt some of this may become clearer over the next few months. And, though it pains me to say it, and though it would be a development fraught with complexity and political shenanigans, maybe the UK will start to look like a more attractive place for Big Tech to base its European operations.

[This piece was updated on 24.08.20 to correct/clarify a point about the availability of judicial review of the EDPB].

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under adequacy, Data Protection, EDPB, Europe, Ireland

Complaining

When A-Levels results were announced last week, the Information Commissioner’s Office (ICO) advised those unhappy with the processing of their personal data to

raise those concerns with the exam boards first, then report to us if they are not satisfied

And in its “service standards” the ICO even says

we expect you to give the organisation the opportunity to consider it first. In order for us to look at their information rights practices we need you to provide us with their reply [emphasis added]

and

Our role is not to investigate or adjudicate on every individual complaint. We are not an ombudsman.

(This last bit is, I would submit, correct – the ICO is not an ombudsman according to my understanding of such a role (under which an ombudsman has powers to investigate complaints, but only to make recommendations as a result, rather than legally enforceable orders). How this squares with Elizabeth Denham’s confident pronouncement in the foreword to the ICO’s Regulatory Action Policy that she is “both an educator and an ombudsman”, I’ve never quite grasped, but, in her support, the ICO is a member of the Ombudsman Association. What a muddle.)

As I mentioned a few days ago, the ICO does not have the power simply to refuse to investigate a complaint by a data subject – it must, under Articles 57(1)(f) and 77 of GDPR, handle complaints and investigate them “to the extent appropriate”. I can see that in normal cases it might be beneficial, and provide a complete picture, for there to have been correspondence between the data subject and the controller, but in some other cases it hardly seems helpful, let alone a legal requirement, to raise a complaint with a controller first. So data subjects do not have to complain to exam boards first. (Please note – I’m not encouraging, or wishing for, a flood of complaints to be made to ICO, but, equally, if data subjects have specific complaint rights under GDPR, we (and I include the ICO in “we”) can’t just pretend they don’t exist.)

So, if data subjects were to complain to (and hold their ground with) ICO, what would happen next? How long does an investigation take?

As to the last question, oddly, it is difficult to know. In recent months, I have asked the ICO on a few occasions through their chat service how long data protection complaints are taking merely to be allocated to a caseworker. I have regularly been told that cases are taking around three months to be allocated (a Freedom of Information request by someone else from June last year got the same figure). However, the ICO’s annual report, published only a few weeks ago, says at page 50 that “we unfortunately have not been able to meet our target of 80% of [data protection] cases being resolved within 12 weeks”, but that they have achieved 74% being resolved within 12 weeks. I may be missing something, but how can 74% of data protection cases have been resolved within 12 weeks, when 100% of them are not allocated to a caseworker until 12 weeks have passed? The only way I can square these figures is if caseworkers “resolve” 74% of cases effectively on the day they get them. If that is the case, it might raise questions about the amount of rigour in the investigation process.

In any case, it seems clear that if an aggrieved student wished to complain about the processing of her personal data during the awarding of A-Levels this year, she would a) (probably wrongly) be expected by the ICO first to complain to the exam board, and wait to receive a response, before b) complaining to the ICO, and waiting three months for her complaint to be allocated to a caseworker. At that point, she might have her complaint investigated in line with Article 77 of GDPR. If the best a student this year could expect is that her complaint might be allocated to a caseworker by December, more than three months after the distressing debacle which was the awards process, could the ICO realistically be said to be complying with its Article 57(1)(f) task to investigate complaints “within a reasonable period”?

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, GDPR, Information Commissioner

GDPR compensation claims – not all infringements are alike

A very interesting piece by my Mishcon de Reya colleague Adam Rose, distinguishing between different types of GDPR infringement, and looking at which types the courts might consider justify compensation/damages awards (hint: by no means all).


Filed under damages, Data Protection, GDPR