Tag Archives: DPA

The most boring blogpost on this blog?

Although GDPR, and the Data Protection Act 2018 (DPA18), took effect from 25 May 2018, it has been notable that the Information Commissioner’s Office (ICO) has continued to exercise its enforcement powers under the prior law. There is no problem with this, and it is only to be expected, given that regulatory investigations can take some time. The DPA18 contains transitional provisions which mean that certain sections of the Data Protection Act 1998 continue to have effect, despite its general repeal. This is the reason, for instance, why the ICO could serve its recent enforcement notice on Hudson Bay Finance Ltd using the powers in section 40 of the 1998 Act – paragraph 33 of Schedule 20 to the DPA18 provides that section 40 of the 1998 Act continues to apply if the ICO is satisfied that the controller contravened the old data protection principles before the rest of the 1998 Act was repealed.

However, what is noticeable in the Hudson Bay Finance Ltd enforcement notice is that it says it was prompted by a request for assessment from the complainant, apparently made on 21 September 2018, purportedly under section 42 of the 1998 Act. I say “purportedly” because the transitional provisions in Schedule 20 to the DPA18 require the ICO to consider a request for assessment made before 25 May 2018, but in all other respects section 42 is repealed. Accordingly, as a matter of law, a data subject can (after 25 May 2018) no longer exercise their right to request an assessment under section 42 of the 1998 Act.

This is all rather academic, because it appears to me that the ICO has discretion – even if it does not have an obligation – to consider a complaint by a data subject relating to compliance with the 1998 Act. And the ICO clearly (as described above) still has the power to take enforcement action for contraventions of the 1998 Act. But no one ever told me I can’t use my blog to make arid academic points.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, Data Protection Act 2018, enforcement, Information Commissioner

Blagging as academic research

A white paper on GDPR subject access rights, presented at the Blackhat USA 2019 conference, got a lot of UK media coverage recently. Less discussion was had, however, about whether the research raised questions about the ethics and legality of “blagging”.

The paper, by Oxford University DPhil researcher James Pavur and Casey Knerr, talked of “Using Privacy Laws to Steal Identities” and describes Pavur’s attempts to acquire another person’s (Knerr’s) data, by purporting to be that person and pretending to exercise their access rights under Article 15 of the General Data Protection Regulation (GDPR). It should be emphasised that Knerr was fully acquiescent in the exercise.

Pavur and Knerr’s paper has a section entitled “Ethical and legal concerns” but what it notably fails to address is the fact that deliberately obtaining personal data without the consent of the controller is potentially a criminal offence under UK law.

Since 1998 it has been an offence to deliberately obtain personal data by deception, with defences available where the obtaining was, for instance, justified as being in the public interest. The Data Protection Act 2018 introduces, at section 170, a new defence where the obtaining is for academic purposes, with a view to publication and where the person doing the obtaining reasonably believes that it was justified in the public interest. Previously, this defence was only available where the obtaining was for the “special purposes” of journalism, literature or art.

It would certainly appear that Pavur obtained some of the data without the consent of the controller (the controller cannot properly be said to have consented to its disclosure if it was effected by deception – indeed, such is the very nature of “blagging”), but it also appears that the obtaining was done for academic purposes and with a view to publication and (it is likely) in the reasonable belief that the obtaining was justified in the public interest.

However, one would expect that, prior to conducting the research, some analysis of the legal framework would have revealed the risk of an offence being committed, and that, if this analysis had been undertaken, it would have made its way into the paper. Its absence makes the publicity given to the paper by Simon McDougall, of the Information Commissioner’s Office (ICO), rather surprising (McDougall initially mistakenly thought the paper was by the BBC’s Leo Kelion). Because although Pavur (and Knerr) could almost certainly fall back on the “academic purposes” defence to the section 170 offence, a fear I have is that others might follow their example, and not have the same defence. Another fear is that an exercise like this (which highlights risks and issues with which controllers have wrestled for years, as Tim Turner points out in his excellent blogpost on the subject) might have the effect of controllers becoming even more keen to demand excessive identification credentials from requesters, without considering – as they must – the proportionality of doing so.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, Information Commissioner

Boris Johnson and GDPR

Might there have been a breach of data protection law in the recording, apparently by neighbours, of incidents at Boris Johnson’s home, and the passing of the recording to the media and the police? Almost certainly not.

(In this post I would like to avoid, as far as possible, broader ethical questions, and I will restrict any political observations to this: if Johnson becomes leader of the Conservative Party, and therefore prime minister, the two main UK political parties will both be led by people less fit to hold the role than at any time in my lifetime.)

In general, processing of personal data done for one’s own domestic purposes avoids the need for compliance with data protection law: Article 2(2)(c) of the General Data Protection Regulation (GDPR) – which of course provides the overarching statutory framework for most processing of personal data – says that the GDPR itself “does not apply to the processing of personal data…by a natural person in the course of a purely personal or household activity”. This is understandable: were there not such a carve-out, one’s children might, say, try to sue one for unlawful processing of their pocket-money data.

However, that word “purely” is key in Article 2. Processing which is not in the course of a “purely” domestic activity, such as, say, passing a recording of an altercation involving one’s neighbours to the media and the police, will be within GDPR’s scope.

So if GDPR is likely to apply, what are the considerations?

Firstly, passing information to the police about an altercation involving one’s neighbours is straightforward: GDPR permits processing which is necessary for the performance of a task carried out in the public interest (Article 6(1)(e)) and where the processing is necessary for the purposes of someone’s legitimate interests (provided that such interests are not overridden by the rights of the data subject) (Article 6(1)(f)).

But what of passing such information to the media? Well, here, the very broad exemption for the purposes of journalism will apply (even though the neighbours who are reported to have passed the information to the media are not, one assumes, journalists as such). GDPR requires member states to reconcile the right to the protection of personal data with the right to freedom of expression and information, including processing for journalistic purposes, and this obligation is given effect in UK law by paragraph 26 of Schedule 2 to the Data Protection Act 2018. This provides that the GDPR provisions (for the most part) do not apply to processing of personal data where it

is being carried out with a view to the publication by a person of journalistic, academic, artistic or literary material, and…the controller reasonably believes that the publication of the material would be in the public interest [and] the controller reasonably believes that the application of [the GDPR provisions] would be incompatible with the… purposes [of journalism].

Here, the controller is not just going to be the journalist or media outlet to whom the information was passed, but is also likely to be the non-journalist person who actually passes the information (provided that the latter passes it with a view to its publication and does so under a reasonable belief that such publication would be in the public interest).

The equivalent exemption in the prior law (the Data Protection Act 1998) was similar, but, notably, applied to processing which was carried out only for the purposes of journalism (or its statutory bedfellows – literature and art). The absence of the word “only” in the 2018 Act arguably greatly extends the exemption, or at least removes ambiguity (there was never any notable example of action being taken under the prior law against the media for processing which was alleged to be unlawful and which was for more than one purpose (i.e. not solely for the purposes of journalism)).

It seems almost certain, then, that Johnson’s non-journalist neighbours could avail themselves of the “journalism” exemption in data protection law. As could anyone who processes personal data with a view to its publication and who reasonably believes such publication is in the public interest: we should prepare to see this defence aired frequently over the coming years. Whether the exemption is too broad is another question.

Because of the breadth of the journalism exemption in data protection law, actions are sometimes more likely to be brought in the tort of misuse of private information (see, for example, Cliff Richard v BBC, and Ali v Channel 5). Whether such a claim might be available in this case is also another question, and not one for this blog.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, GDPR, journalism, police

Information Tribunal rejects data subject appeals under new Data Protection Act

The Information Tribunal has recently heard the first applications under the Data Protection Act 2018 for orders regarding the Information Commissioner’s handling of data protection complaints. As I write on the Mishcon de Reya website, the Tribunal has peremptorily dismissed them.

Leave a comment

Filed under Data Protection, enforcement, GDPR, Information Commissioner, Information Tribunal

Farrow & Ball lose appeal for non-payment of data protection fee

I have a new post on the Mishcon de Reya website, drawing attention to the first (and unsuccessful) attempt to appeal an ICO monetary penalty for failing to pay the statutory data protection fee.

Leave a comment

Filed under Data Protection, Information Commissioner, Information Tribunal, monetary penalty notice

MPs, Lords, councillors exempt from data protection fee

As I have previously discussed on the Mishcon de Reya website, the General Data Protection Regulation (“GDPR”) removed the requirement at European law for data controllers to “register” with their supervisory authority. However, in the UK, the need to provide a funding stream for the data protection work of the Information Commissioner’s Office (ICO) led parliament to pass laws (The Data Protection (Charges and Information) Regulations 2018 (“the Fee Regulations”), made under sections 137 and 138 of the Data Protection Act 2018 (“DPA”)) requiring controllers to pay a fee to the ICO, unless an exemption applied.

New amendment regulations (The Data Protection (Charges and Information) (Amendment) Regulations 2019) have now been passed, following a consultation run by DCMS last year. These introduce new categories of exempt processing: in short, processing of personal data by members of the House of Lords, elected representatives and prospective representatives is now “exempt processing” for the purposes of the Fee Regulations. “Elected representative” means (adopting the definition at paragraph 23(3)(a) to (d) and (f) to (m) of Schedule 1 to the DPA)

a member of the House of Commons;
a member of the National Assembly for Wales;
a member of the Scottish Parliament;
a member of the Northern Ireland Assembly;
an elected member of a local authority within the meaning of section 270(1) of the Local Government Act 1972;
an elected mayor of a local authority within the meaning of Part 1A or 2 of the Local Government Act 2000;
a mayor for the area of a combined authority established under section 103 of the Local Democracy, Economic Development and Construction Act 2009;
the Mayor of London or an elected member of the London Assembly;
an elected member of the Common Council of the City of London, or the Council of the Isles of Scilly;
an elected member of a council constituted under section 2 of the Local Government etc (Scotland) Act 1994;
an elected member of a district council within the meaning of the Local Government Act (Northern Ireland) 1972;
a police and crime commissioner.

But, it should be noted, MEPs’ processing is not exempt, and, for the time being at least, they must still pay a fee.

6 Comments

Filed under Data Protection, DCMS, GDPR

The wheels of the Ministry of Justice

do they turn so slowly that they’ll lead to the Lord Chancellor committing a criminal offence?

On 21 December last year, as we were all sweeping up the mince pie crumbs, removing our party hats and switching off the office lights for another year, the Information Commissioner’s Office (ICO) published, with no accompanying publicity whatsoever, an enforcement notice served on the Secretary of State for Justice. The notice drew attention to the fact that in July 2017 the Ministry of Justice (MoJ) had had a backlog of 919 subject access requests from individuals, some of which dated back to 2012. And by November 2017 that had barely improved – to 793 cases dating back to 2014.

I intended to blog about this at the time, but it’s taken me around nine months to retrieve my chin from the floor, such was the force with which it dropped.

Because we should remember that the exercise of the right of subject access is a fundamental aspect of the fundamental right to protection of personal data. Requesting access to one’s data enables one to be aware of, and verify the lawfulness of, the processing. Don’t take my word for it – look at recital 41 of the then-applicable European data protection directive, and recital 63 of the now-applicable General Data Protection Regulation (GDPR).

And bear in mind that the nature of the MoJ’s work means it often receives subject access requests from prisoners, or others who are going through or have been through the criminal justice system. I imagine that a good many of these horrendously delayed requests were from people with a genuinely-held concern, or grievance, and not just from irritants like me who are interested in data controllers’ compliance.

The notice required MoJ to comply with all the outstanding requests by 31 October 2018. Now, you might raise an eyebrow at the fact that this gave the MoJ an extra eight months to respond to requests which were already incredibly late and which should have been responded to within forty days, but what’s an extra 284 days when things have slipped a little? (*Pseuds’ corner alert* It reminds me of Larkin’s line in The Whitsun Weddings about being so late that he feels: “all sense of being in a hurry gone”).

Maybe one reason the ICO gave MoJ so long to sort things out is that enforcement notices are serious things – a failure to comply is, after all, a criminal offence punishable on indictment by an unlimited fine. So one notes with interest a recent response to a freedom of information request for the regular updates which the notice also required MoJ to provide.

This reveals that by July this year MoJ had whittled down those 793 delayed cases to 285, with none dating back further than 2016. But I’m not going to start hanging out the bunting just yet, because a) more recent cases might well be more complex (the issues behind them are likely to be more current, and therefore potentially more complex), and b) they don’t flaming well deserve any bunting, because this was, and remains, one of the most egregious and serious compliance failures it’s been my displeasure to have seen.

And what if they don’t clear them all by 31 October? The notice gives no leeway, no get-out – if any of those requests extant at November last year remains unanswered by November this year, the Right Honourable David Gauke MP (the current incumbent of the position of Secretary of State for Justice) will, it appears, have committed a criminal offence.

Will he be prosecuted?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

1 Comment

Filed under access to information, Data Protection, Directive 95/46/EC, GDPR, human rights, Information Commissioner, Ministry of Justice, Uncategorized

GDPR – an unqualified right to rectification?

Can the FCA – or any data controller – any longer argue that it’s too expensive to have to rectify inaccurate personal data?

Amidst all the hoo-ha about the General Data Protection Regulation (GDPR) in terms of increased sanctions, accountability requirements and nonsense about email marketing, it’s easy to overlook some changes that it has also (or actually) wrought.

One small, but potentially profound difference, lies in the provisions around accuracy, and data subjects’ rights to rectification.

GDPR – as did its predecessor, the 1995 Data Protection Directive – requires data controllers to take “every reasonable step” to ensure that, having regard to the purposes of the processing, personal data which are inaccurate are erased or rectified without delay. Under the Directive the concomitant data subject right was to obtain from the controller, as appropriate, the rectification, erasure or blocking of data. Under Article 16 of GDPR, however, there is no qualification or restriction of the right:

The data subject shall have the right to obtain from the controller without undue delay the rectification of inaccurate personal data concerning him or her.

I take this to mean that, yes, a controller must in general only take every reasonable step to ensure that inaccurate data is rectified (the “proactive obligation”, let us call it), but, when put on notice by a data subject exercising his or her right to rectification, the controller MUST rectify – and there is no express proportionality get-out (let us call this the “reactive obligation”).

This distinction, this significant strengthening of the data subject’s right, seems to me to matter in the recently reported case of Alistair Hinton and the Financial Conduct Authority (FCA).

It appears that Mr Hinton has, for a number of years, been pursuing complaints against the FCA over alleged inaccuracies in its register of regulated firms, and in particular over an allegation that

a register entry which gave the impression both him [sic] and his wife were directors of a firm which the regulator had publicly censured

This puts into rather simple terms what appears to be a lengthy and complex complaint, stretching over several years, and which has resulted in three separate determinations by the Financial Regulators Complaints Commissioner (FRCC) (two of which appear to be publicly available). I no doubt continue to over-simplify when I say that the issue largely turns on whether the information on the register is accurate or not. In his February 2017 determination the FRCC reached the following conclusions (among others)

You and your wife have been the unfortunate victims of an unintended consequence of the design of the FSA’s (and now FCA’s) register, coupled with a particular set of personal circumstances;

…Since 2009 the FSA/FCA have accepted that your register entries are misleading, and have committed to reviewing the register design at an appropriate moment;

Although these findings don’t appear to have been directly challenged by the FCA, it is fair to note that the FCA are reported, in the determinations, as having maintained that the register entries are “technically and legally correct”, whilst conceding that they are indeed potentially misleading.

The most recent FRCC determination reports, as does media coverage, that the Information Commissioner’s Office (ICO) is also currently involved. Whilst the FRCC‘s role is not to decide whether the FCA has acted lawfully or not, the ICO can assess whether or not the FCA’s processing of personal data is in accordance with the law.

And it occurs to me that the difference here between the Directive’s and the GDPR’s “reactive obligations” to rectify inaccurate data (with the latter not having any express proportionality test) might be significant, because, until now, the FCA has apparently relied on the fact that correcting the misleading information on its register would require system changes costing an estimated £50,000 to £100,000, and the FRCC has not had the power to challenge the FCA’s argument that the cost of “a proper fix” was disproportionate. But if the Article 16 right is in general terms unqualified (subject to the Article 12(5) ability for a controller to charge for, or refuse to comply with, a request that is manifestly unfounded or excessive), can the FCA resist a GDPR application for rectification? And could the ICO decide any differently?

Of course, one must acknowledge that there is a general principle of proportionality at European law (enshrined in Article 5 of the Treaty on European Union), so a regulator, or a court, cannot simply dispense with the concept. But there was clearly an intention by the European legislature not to put an express qualification on the right to rectification (and by extension the reactive obligation it places on controllers), and that will need to be the starting point for any assessment by said regulator, or court.


The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

1 Comment

Filed under accuracy, Data Protection, GDPR, Information Commissioner

This old world will never change

Complacency about data protection in the NHS won’t change unless ICO takes firm action

Back in September 2016 I spoke to Vice’s Motherboard, about reports that various NHS bodies were still running Windows XP, and I said

If hospitals are knowingly using insecure XP machines and devices to hold and otherwise process patient data they may well be in serious contravention of their [data protection] obligations

Subsequently, in May this year, the WannaCry exploit showed that those bodies were indeed vulnerable, with multiple NHS Trusts and GP practices subject to ransomware demands and major system disruption.

That this had enormous impact on patients is evidenced by a new report on the incident from the National Audit Office (NAO), which shows that

6,912 appointments had been cancelled, and [it is] estimated [that] over 19,000 appointments would have been cancelled in total. Neither the Department nor NHS England know how many GP appointments were cancelled, or how many ambulances and patients were diverted from the five accident and emergency departments that were unable to treat some patients

The NAO investigation found that the Department of Health and the Cabinet Office had written to Trusts

saying it was essential they had “robust plans” to migrate away from old software, such as Windows XP, by April 2015. [And in] March and April 2017, NHS Digital had issued critical alerts warning organisations to patch their systems to prevent WannaCry

Although the NAO report is critical of the government departments themselves for failing to do more, it does correctly note that individual healthcare organisations are themselves responsible for the protection of patient information. This is, of course, correct: under the Data Protection Act 1998 (DPA) each organisation is a data controller, and responsible, among other things, for ensuring that appropriate technical and organisational measures are taken against unauthorised or unlawful processing of personal data.

Yet, despite these failings, and despite the clear evidence of huge disruption for patients and the unavoidable implication that delays in treatment across all NHS services occurred, the report was greeted by the following statement by Keith McNeil, Chief Clinical Information Officer for NHS England

As the NAO report makes clear, no harm was caused to patients and there were no incidents of patient data being compromised or stolen

In fairness to McNeil, he is citing the report itself, which says that “NHS organisations did not report any cases of harm to patients or of data being compromised or stolen” (although that is not quite the same thing). But the report continues

If the WannaCry ransomware attack had led to any patient harm or loss of data then NHS England told us that it would expect trusts to report cases through existing reporting channels, such as reporting data loss direct to the Information Commissioner’s Office (ICO) in line with existing policy and guidance on information governance

So it appears that the evidence for no harm arising is because there were no reports of “data loss” to the ICO. This emphasis on “data loss” is frustrating, firstly because personal data does not have to be lost for harm to arise, and it is difficult to understand how delays and emergency diversions would not have led to some harm, but secondly because it is legally mistaken: the DPA makes clear that data security should prevent all sorts of unauthorised processing, and removal/restriction of access is clearly covered by the definition of “processing”.

It is also illustrative of a level of complacency which is deleterious to patient health and safety, and a possible indicator of how the WannaCry incidents happened in the first place. Just because data could not be accessed as a result of the malware does not mean that this was not a very serious situation.

It’s not clear whether the ICO will be investigating further, or taking action, as a result of the NAO report (their response to my tweeted question – “We will be considering the contents of the report in more detail. We continue to liaise with the health sector on this issue” – was particularly unenlightening). I know countless dedicated, highly skilled professionals working in the fields of data protection and information governance in the NHS; they’ve often told me of their frustrations with senior staff complacency. Unless the ICO does take action (and this doesn’t necessarily have to be by way of fines), these professionals, but also – more importantly – patients, will continue to be let down, and in the case of the latter, put at risk of harm.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under 7th principle, Data Protection, data security, enforcement, Information Commissioner, NHS

Public houses, private comms

Wetherspoons delete their entire customer email database. Deliberately.

In a very interesting development, the pub chain JD Wetherspoon have announced that they are ceasing sending monthly newsletters by email, and are deleting their database of customer email addresses.

Although the only initial evidence of this was the screenshot of the email communication (above), the company have confirmed to me on their Twitter account that the email is genuine.

Wetherspoons say the reason for the deletion is that they feel that email marketing of this kind is “too intrusive”, and that, instead of communicating marketing by email, they will “continue to release news stories on [their] website” and customers will be able to keep up to date by following them on Facebook and Twitter.

This is interesting for a couple of reasons. Firstly, companies such as Flybe and Honda have recently discovered that an email marketing database can be a liability if it is not clear whether the customers in question have consented to receive marketing emails (which is a requirement under the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR)). In March Flybe received a monetary penalty of £70,000 from the Information Commissioner’s Office (ICO) after sending more than 3.3 million emails with the title ‘Are your details correct?’ to people who had previously told them they didn’t want to receive marketing emails. These, said the ICO, were themselves marketing emails, and the sending of them was a serious contravention of PECR. Honda, less egregiously, sent 289,790 emails when they did not know whether or not the recipients had consented to receive marketing emails. This also, said the ICO, was unlawful marketing, as the burden of proof was on Honda to show that they had recipients’ consent to send the emails, and they could not. The result was a £13,000 monetary penalty.

There is no reason to think Wetherspoons were concerned about the data quality (in terms of whether people had consented to marketing) of their own email marketing database, but it is clear from the Flybe and Honda cases that a bloated database with email details of people who have not consented to marketing (or where it is unclear whether they have) is potentially a liability under PECR (and related data protection law). It is a liability both because any marketing emails sent are likely to be unlawful (and potentially attract a monetary penalty) but also because, if it cannot be used for marketing, what purpose does it serve? If none, then it constitutes a huge amount of personal data, held for no ostensible purpose, which would be in contravention of the fifth principle in Schedule 1 to the Data Protection Act 1998.

For this reason, I can understand why some companies might take a commercial and risk-based decision not to retain email databases – if something brings no value, and significant risk, then why keep it?

But there is another reason Wetherspoons’ rationale is interesting: they are clearly aiming now to use social media channels to market their products. Normally, one thinks of advertising on social media as not aimed at or delivered to individuals, but as technology has advanced, so has the ability for social media marketing to become increasingly targeted. In May this year it was announced that the ICO were undertaking “a wide assessment of the data-protection risks arising from the use of data analytics”. This was on the back of reports that adverts on Facebook were being targeted by political groups towards people on the basis of data scraped from Facebook and other social media. Although we don’t know what the outcome of this investigation by the ICO will be (and I understand some of the allegations are strongly denied by entities alleged to be involved) what it does show is that stopping your e-marketing on one channel won’t necessarily stop you having privacy and data protection challenges on another.

And that’s before we even get on to the small fact that European ePrivacy law is in the process of being rewritten. Watch that space.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under consent, Data Protection, marketing, monetary penalty notice, PECR, social media, spam