Category Archives: Data Protection Act 2018

Monitoring of lawyers by the state

In the Commons on Monday, Robert Jenrick, the minister for immigration, spoke in a debate on the implications of the violent disorder on 10 February outside a hotel in Knowsley providing refuge for asylum seekers. In answer to a question about why no “small boats bill” had been introduced into Parliament, he said

this is one of the most litigious areas of public life. It is an area where, I am afraid, human rights lawyers abuse and exploit our laws at times, and where the courts have taken an expansive approach in the past. That is why we must get this right, but we will be bringing forward that legislation very soon

When pressed on his reference to abuse of the law by lawyers, and asked “how many solicitors, advocates and barristers have been reported by the Home Office in the last 12 months to the regulatory authorities”, Mr Jenrick replied

We are monitoring the activities, as it so happens, of a small number of legal practitioners, but it is not appropriate for me to discuss that here.

This is a remarkable statement, both in its lack of detail and in its potential effect. The prospect of the monitoring of lawyers by the state carries chilling implications. It may well be that Mr Jenrick had no intention of making what could be interpreted as an oppressive statement, but words are important, and words said in Parliament carry particular weight.

It may also be that the “monitoring” in question consists of legitimate investigation into potential criminality by that “small number” of lawyers, but if that was the case, why not say so?

But “monitoring”, in itself, must be done in accordance with the law. If it is in the context of a criminal investigation, or surveillance, there are specific laws which may apply.

And to the extent that it involves the processing of the personal data of the lawyers in question (which, inevitably, it surely must, when one considers that “processing” means, among other things, “collection, recording, organisation, structuring or storage” performed on personal data), the monitoring must comply with applicable data protection laws.

As a fundamental general principle, processing of personal data must be transparent (see Articles 5(1)(a), 13 and 14 UK GDPR, or, for law enforcement processing, section 44 of the Data Protection Act 2018 (DPA), or, for intelligence services processing, section 93 of the DPA).

There are qualifications to and exemptions from this general principle, but, in the absence of circumstances providing such an exemption, a data subject (here, the lawyers who are apparently being monitored) should be made aware of the processing. The information they should receive includes, among other things: the identity and the contact details of the person directing the processing; the legal basis and the purposes of the processing; and the recipients or categories of recipients of the personal data.

We tend to call the notices we receive under these provisions “privacy notices”. Those of us who have practised data protection law for a long time will remember the term “fair processing notice” which is arguably a better term. Whatever one calls them, though, such notices are a bedrock of the law – without being aware of the processing, and the risks, rules, safeguards and rights in relation to it, data subjects cannot properly exercise their rights.

With all that in mind, has the Home Office – or whoever it is who is directing the monitoring of the “small number of lawyers” – informed them that they are being monitored? If not, why not?

Returning to my earlier point about how oppressive it is to state – or to create the perception – that the coercive powers of the state are being deployed against lawyers by monitoring them, one wonders whether the Information Commissioner should take steps to investigate the background to Mr Jenrick’s comments.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, Information Commissioner, transparency, surveillance, human rights, Home Office, privacy notice, Data Protection Act 2018, law enforcement, monitoring

ICO threatened Matt Hancock with £17.5m fine (sort of)

It’s well known that, under the UK GDPR, and the Data Protection Act 2018 (DPA), the Information Commissioner can fine a controller or a processor a maximum of £17.5m (or 4% of global annual turnover). Less well known (to me at least) is that he can fine any person, including you, or me, or Matt Hancock, the same, even if they are not a controller or processor.

Section 142 of the DPA empowers the Commissioner to serve “Information Notices”. These fall broadly into two types: those served on a controller or processor requiring them to provide information which the Commissioner reasonably requires for the purposes of carrying out his functions under the data protection legislation; and those requiring

any person to provide the Commissioner with information that the Commissioner reasonably requires for the purposes of—

(i) investigating a suspected failure of a type described in section 149(2) or a suspected offence under this Act, or

(ii) determining whether the processing of personal data is carried out by an individual in the course of a purely personal or household activity.

And by section 155(1) of the DPA, the Commissioner may serve a monetary penalty notice (aka “fine”) on any “person” who fails to comply with an Information Notice. That includes you, or me, or Matt Hancock. (Section 157(4) provides that the maximum amount is £17.5m, or 4% of global annual turnover – although I doubt that you, I, or Matt Hancock has an annual global turnover.)
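As an aside on the arithmetic: the statute expresses the cap as the higher of the two figures, so the 4% limb only bites once an undertaking’s turnover exceeds £437.5m. A minimal sketch (illustrative only, not legal advice; the function name is mine):

```python
from typing import Optional

# Illustrative sketch of the DPA 2018 "higher maximum amount":
# £17.5m or, for an undertaking, 4% of total annual worldwide
# turnover, whichever is higher.
def max_penalty(turnover_gbp: Optional[float] = None) -> float:
    FIXED_CAP = 17_500_000
    if turnover_gbp is None:
        # An individual (you, me, or Matt Hancock) with no turnover
        return FIXED_CAP
    return max(FIXED_CAP, 0.04 * turnover_gbp)

print(max_penalty())                # 17500000 - an individual
print(max_penalty(1_000_000_000))   # 40000000.0 - a £1bn-turnover undertaking
```

For any turnover below £437.5m – which covers you, me, and every MP – the fixed £17.5m cap is what applies.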

All very interesting and theoretical, you might think. Well, so might Matt Hancock have thought, until an Information Notice (which the Commissioner has recently uploaded to the ICO website) dropped onto his figurative doormat last year. The Notice was in relation to the Commissioner’s investigation of the leaking of CCTV images showing the former Secretary of State for Health and Social Care and his former aide enjoying each other’s company. The investigation – which was into the circumstances of the leak, and not Matt Hancock’s conduct – concluded in April of this year, with the ICO deciding that there was insufficient evidence to justify further action. But the Notice states clearly at paragraph 7 that failure to comply is, indeed, punishable with a fine of up to £17.5m (etc.).

The Matt Hancock Notice admittedly addresses him as if he were a controller (it says the ICO is looking at his compliance with the UK GDPR) although I am not sure that is correct – Matt Hancock will indeed be a controller in respect of his constituency work, and his work as an MP outside ministerial duties, but the normal approach is that a ministerial department will be the relevant controller for personal data processed in the context of that department (thus, the Department for Health and Social Care shows as a controller on the ICO register of fee payers).

Nonetheless, the ICO also issued an Information Notice to Matt Hancock’s former aide (as well as to Helen Whately MP, the Minister of State), and that one makes no mention of UK GDPR compliance or a suggestion that she was a controller, but does also “threaten” a potential £17.5m fine.

Of course, realistically, no one, not even Matt Hancock, was really ever at risk of a huge fine (section 155(3) of the DPA requires the Commissioner to have regard to various factors, including proportionality), but it strikes me as a remarkable state of affairs that you, I or any member of the public caught up in a matter that leads to ICO investigation, and who might have relevant information, is as a matter of law vulnerable to a penalty of £17.5m if they don’t comply with an Information Notice.

Even Matt Hancock.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, Data Protection Act 2018, Information Commissioner, information notice, monetary penalty notice, UK GDPR

Certainly uncertain – data protection reform developments

In recent weeks the future of data protection law in the UK has been not just hard to predict, but also hard to keep up with.

Since Brexit, the UK has had its own version of the EU’s GDPR, called, obviously enough, the “UK GDPR”. Then, on 18 July, a Data Protection and Digital Information Bill was presented in Parliament – it proposed some significant (but possibly not hugely so) changes to the current regime, but it retained the UK GDPR. It was scheduled to have its second reading in the House of Commons on 5 September, but this was postponed “to allow Ministers to consider the legislation further”.

Following this, on 22 September, the Retained EU Law (Revocation and Reform) Bill was introduced. This appeared to propose the “sunsetting” (i.e. the repeal) of multiple data and information laws, including the UK GDPR, by the end of 2023.

The next development, on the first day of the Conservative Party conference, was the announcement by the Culture Secretary, Michelle Donelan, that

we will be replacing GDPR with our own business and consumer-friendly data protection system… Many…smaller organisations and businesses only in fact employ a few people. They don’t have the resources or money to negotiate the regulatory minefield that is GDPR. Yet right now, in the main, they’re forced to follow this one-size-fits-all approach.

She also suggested that businesses had suffered from an 8% reduction in profit from GDPR. It is not immediately clear where this figure comes from, although some have suggested that an Oxford Martin School paper is the source. This paper contains some remarkably complex equations. I have no competence in assessing, and no reason to doubt, the authors’ economic and statistical prowess, but I can say (with a nod to the ageless concept of “garbage in, garbage out”) that their understanding of data protection law is so flawed as to compromise the whole paper. They say, for instance

websites are prohibited from sharing user data with third parties, without the consent from each user

and

companies that target EU residents are required to encrypt and anonymise any personal data it [sic] stores

and (probably most bizarrely)

as users incur a cost when prompted to give consent to using their data, they might reduce online purchases, leading to lower sales

To be quite clear (as politicians are fond of saying): websites are not prohibited from sharing data without the consent of “users” (if they were, most ecommerce would grind to a halt, and the internet economy would collapse); companies subject to GDPR are not required to anonymise personal data they store (if they did, they would no longer be able to operate, leading to the collapse of the economy in general); and “users” do not have to consent to the use of their data – and I am still scratching my head over why, even if they did, they would incur a cost.

If the authors base their findings on the economic cost of GDPR on these premises, then anyone reviewing their paper will have some very big questions for them.

I may have the wrong paper: I actually really hope the government will back up its 8% figure with something more sensible.

But regardless of the economic thinking in this paper, or underpinning the developments in the statutory regime, it is possible that all the developments cohere: that the Data Protection and Digital Information Bill, when it re-emerges, will have been amended so as to have the effect of removing references to “GDPR” or the “UK GDPR”, and that this will mean that, in substance, if not in name, the principles of the UK GDPR are assimilated into a new piece of domestic legislation.

But business – given that the government’s focus is on it – like nature, abhors a vacuum: many business owners (and indeed many data protection practitioners) must be hoping for a clear route forward, so that the UK’s data protection regime can be considered, and applied, with at least a degree of certainty.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under adequacy, consent, Data Protection, Data Protection Act 2018, Data Protection Bill, GDPR, parliament, UK GDPR

ICO secures court-awarded compensation

The ICO often says it can’t award compensation, but what it can do – in criminal cases – is apply for the court to make an award (separate from any fine or costs). But as far as I know, until this case last week, it had never done so:

https://www.mishcon.com/news/ico-recommends-compensation-awards-in-criminal-prosecution-case

Leave a comment

Filed under crime, damages, Data Protection, Data Protection Act 2018, Information Commissioner

High Court muddle over data protection regime

A relatively common error by those unaccustomed to the rather odd structure of the data protection statutory regime in the UK is to look first to the Data Protection Act 2018 (“DPA”) for the applicable law, instead of the UK GDPR. This is despite the fact that the very first section of the DPA instructs us in how the regime works. Section 1(2) provides that “most processing of personal data is subject to the UK GDPR”, and then sections 1(4) and (5) explain that Parts 3 and 4 of the DPA deal with those parts of the regime (law enforcement processing and intelligence services processing) which are out of the scope of the UK GDPR.

“Put me to one side” – says the DPA tactfully – “you should have picked up your copy of the UK GDPR first, and not me”.

Accordingly, the key provisions, and the basic principles, applying to most processing, are to be found in the UK GDPR.

The result of this relatively common error is that people will sometimes cite, say, section 45 of the DPA in relation to a generic subject access request, when in fact the applicable provision is Article 15 of the UK GDPR (section 45 applies to subject access requests to competent authorities for the purposes of law enforcement).

Occasionally, I have seen non-specialist lawyers make this mistake.

And now, I have seen a High Court judge do the same. In a judicial review case in the High Court of Northern Ireland, challenging the accuracy of a child’s social care records, part of the claim (which was primarily an Article 8 human rights claim) was also pleaded as a breach of Article 5(1) and (6) of the “GDPR” (the correct pleading should have been, and maybe was, by reference to the UK GDPR) and Part 1 of the DPA. Article 5(1) of the UK GDPR contains the data protection principles.

The judge, however, stated that

It seems to the court that in fact the relevant part of the 2018 Act are sections 86 to 91 which set out the six data protection principles in relation to data processing.

This is simply wrong. Sections 86 to 91 of the DPA lay out the data protection principles only in relation to intelligence services processing (i.e. processing of personal data by the Security Service, the Secret Intelligence Service or by the Government Communications Headquarters).

It isn’t clear whether there was any discussion about this in the court (quite possibly not), but it appears not to have been picked up when the judgment was circulated in draft or published to the parties. As it is, it seems very likely that nothing turns on it. This is because the Part 4 DPA principles, like the Part 3 DPA principles, effectively mirror the principles in Article 5(1) UK GDPR, and so the analysis, for the purposes of the substantive matter, was sound.

So this was an error of form, more than substance.

However, there are some differences between the UK GDPR regime, the Part 3 DPA regime and the Part 4 DPA regime, and in different circumstances an error like this could result in an outcome which is wrong, and harmful.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under accuracy, Data Protection, Data Protection Act 2018, GDPR, human rights, Ireland, judiciary, UK GDPR

Data Protection reform bill – all that? or not all that?

I’ve written an “initial thoughts” analysis on the Mishcon de Reya website of some of the key provisions of the Data Protection and Digital Information Bill:

The Data Protection and Digital Information Bill – an (mishcon.com)

Leave a comment

Filed under adequacy, Data Protection, Data Protection Act 2018, Data Protection Bill, DPO, GDPR, Information Commissioner, PECR, UK GDPR

Data protection nonsense on gov.uk

It feels like a while since I randomly picked on some wild online disinformation about data protection, but when you get an itch, you gotta scratch, and this page of government guidance for businesses – “Get your business ready to employ staff: step by step” – specifically on “Personal data an employer can keep about an employee” certainly got me itching. It starts off sensibly enough by saying that

Employers must keep their employees’ personal data safe, secure and up to date.

This is true (Article 5(1)(f) and part of 5(1)(c) UK GDPR). And the page goes on to list some information that can be “kept” (for which I charitably read “processed”) without employees’ permission, such as: name, address, date of birth, sex, education and qualifications, work experience, National Insurance number, tax code, emergency contact details, employment history with the organisation, employment terms and conditions, any accidents connected with work, any training taken, any disciplinary action. All pretty inoffensive, although I’m not sure what it’s trying to achieve. But then…oh my. Then, it says

Employers need their employees’ permission to keep certain types of ‘sensitive’ data

We could stop there really, and snigger cruelly. Consent (aka “permission”) as a condition for processing personal data is complicated and, quite frankly, to be avoided if possible. It comes laden with quite strict requirements. The Information Commissioner puts it quite well:

Consent is appropriate if you can offer people real choice and control over how you use their data, and want to build their trust and engagement. But if you cannot offer a genuine choice, consent is not appropriate. If you would still process the personal data without consent, asking for consent is misleading and inherently unfair…employers and other organisations in a position of power over individuals should avoid relying on consent unless they are confident they can demonstrate it is freely given

And let’s consider the categories of personal data the government page thinks employers should get “permission” to “keep”: race and ethnicity, religion, political membership or opinions, trade union membership, genetics [sic], biometrics, health and medical conditions, sexual history or orientation.

But how quickly would an employer’s wheels grind to a halt if it couldn’t process personal data on an employee’s health “without her permission”? It would be unable to refer her to occupational health if she didn’t “permit” it. It would be unable to keep a record of her sickness absence if she withdrew her consent (consent should be as easy to withdraw as it is to give (see Article 7(3))). During the COVID pandemic, it would have been unable to keep a record of whether she had tested positive or not, if she said she didn’t want a record kept.

It’s nonsense, of course. There’s a whole range of gateways (plus a whole Schedule of the Data Protection Act 2018) which provide conditions for processing special categories of data without having to get someone’s consent. They include pressing social imperatives, like compliance with public health law, and promotion of equality of treatment and safeguarding of children or other vulnerable people. The conditions don’t apply across the board, but the point is that employees’ permission – their consent – is rarely, if ever, required when there is another compelling reason for processing their data.

I don’t really understand what need, what gap, the government page is trying to fill, but the guidance is pretty calamitous. It is only likely to confuse business owners and employers, and risks pitting employers and employees against each other – with disputes arising – amidst the confusion.

BAH!

Now, that felt better. Like I say, sometimes it’s good to scratch that itch.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under consent, Data Protection, Data Protection Act 2018, Let's Blame Data Protection, UK GDPR

Podcast on UK data protection reforms

My Mishcon de Reya colleague Adam Rose and I have recorded a short (25 minute) podcast on the government’s recent announcement of proposed data protection reforms.

UK Data Reform – what’s being proposed? (mishcon.com)

Leave a comment

Filed under adequacy, Data Protection, Data Protection Act 2018, GDPR, UK GDPR

Data reform – hot news or hot air?

I’ve written a piece for the Mishcon de Reya website on some of the key proposals (for our client-base) in today’s data protection reform announcement.

Data protection law reform – major changes, but the (mishcon.com)

Leave a comment

Filed under adequacy, consent, cookies, Data Protection, Data Protection Act 2018, DPO, GDPR, Information Commissioner, international transfers, nuisance calls, PECR, UK GDPR

COVID booster messages and the law

GET BOOSTED NOW Every adult needs a COVID-19 booster vaccine to protect against Omicron. Get your COVID-19 vaccine or booster. See NHS website for details

On Boxing Day, this wording appears to have been sent as an SMS in effect to every mobile telephone number in the UK. The relevant government web page explains that the message is part of the national “Get Boosted Now” campaign to protect against the Omicron variant of COVID-19. The web page also thanks the Mobile Network Operators for “their assistance in helping deliver the vitally important Get Boosted Now message”.

It is inevitable that questions will be raised about the legality of the SMSs under data protection law. What is important to note is that, although – to the extent that the sending involved the processing of personal data – the GDPR (or, rather, the UK GDPR) may apply, the relevant law is actually the Privacy and Electronic Communications (EC Directive) Regulations 2003 (“PECR”). Under the doctrine of lex specialis, where two laws govern the same situation, the more specific rules prevail over the more general. Put another way, if the more specific PECR can justify the sending of the SMSs, then the sending will also be justified under the more general provisions of the UK GDPR.

Regulation 16A of PECR (inserted by a 2015 amendment) provides that where a “relevant communications provider” (in this case a Mobile Network Operator) is notified by a government minister (or certain other persons, such as chief constables) that an “emergency” has occurred, is occurring or is about to occur, and that it is expedient to use an emergency alert service, then the usual restrictions on the processing of traffic and location data can be disregarded. In this instance, given the wording on the government website, one assumes that such a notification was indeed made by a government minister under regulation 16A. (These are different from the emergency alerts proposed to be sent under the National Emergency Alert system from 2022, which will not directly involve the mobile network operators.)

“Emergency” is not defined in PECR, so presumably will take its definition here from section 1(1)(a) of the Civil Contingencies Act 2004 – “an event or situation which threatens serious damage to human welfare in a place in the United Kingdom”.

The effect of this is that, if the SMSs are legal under PECR, they will also be legal under Article 6(1)(c) and 6(1)(e) of the UK GDPR (on the grounds that processing is necessary for compliance with a legal obligation to which the controller is subject, and/or necessary for the performance of a task carried out in the public interest).

There is an interesting side note as to whether, even though the SMSs count as emergency alerts, they might also be seen as direct marketing messages under regulations 22 and 23 of PECR, thus requiring the consent of the recipient before they could be sent. Under the current guidance from the Information Commissioner (ICO), one might argue that they would be. “Direct marketing” is defined in the Data Protection Act 2018 as “the communication (by whatever means) of advertising or marketing material which is directed to particular individuals” and the ICO defines it further by saying that this “covers any advertising or marketing material, not just commercial marketing. All promotional material falls within this definition, including material promoting the aims of not-for-profit organisations”. Following that line of thought, it is possible that the Omicron SMSs were both emergency alerts and direct marketing messages. This would be an odd state of affairs (and one doubts very much that the ICO – or a judge, if the point were challenged – would actually follow that guidance and say that these SMSs were indeed direct marketing messages). The ICO is in the process of updating its direct marketing guidance, and might be well advised to consider the issue of emergency alerts (which aren’t covered in the current consultation document).

[Edited to add: I don’t think what I say above necessarily covers all the legal issues, and no doubt there are aspects of this that could have been done better, but I doubt very much there is any substantive legal challenge which can be made.]

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under communications data, consent, Data Protection, Data Protection Act 2018, GDPR, Information Commissioner, PECR, UK GDPR