Tag Archives: DPA

Tony Abbott hacking and data protection offences

The story about the hacking of Tony Abbott’s travel and other personal details, after he foolishly posted a picture of a flight boarding pass on social media, is both amusing and salutary (salutary for Abbott, and, I would suggest, for Qantas and any other airline which prints boarding passes with similar details). What is also interesting to consider is whether, had this hacking occurred in the UK, it might have constituted an offence under data protection law.

Under sections 170(1)(a) and 170(1)(c) of the Data Protection Act 2018 it is an offence for a person knowingly or recklessly to obtain or disclose personal data without the consent of the controller, and also an offence for a person knowingly or recklessly, after obtaining personal data, to retain it without the consent of the person who was the controller in relation to the personal data when it was obtained.

There is at least an argument that this would have been a knowing obtaining of personal data without the consent of the controller (whether that controller was Qantas, or Abbott himself).

There are defences to both of these where the person can prove that the obtaining, disclosure, retaining etc. was, in the particular circumstances, justified as being in the public interest.

Also, and this may be engaged here, it is a defence if the person acted for journalistic purposes, with a view to the publication by a person of any journalistic, academic, artistic or literary material, and in the reasonable belief that in the particular circumstances the obtaining, disclosing, retaining etc. was justified as being in the public interest. One does not have to be a paid journalist, or journalist by trade, to rely on this defence.

Prosecution in both cases may only be brought by the Information Commissioner, or with the consent of the Director of Public Prosecutions. The offences are triable either way, and punishable by an unlimited fine.

I write all this not to condemn the “hacker”, nor to condone Abbott. However, it is worth remembering that similar hacking, in the UK at least, is not without its risks.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, offences

An Uber-reaction in The Times

“Uber gives police private data on drivers and passengers” announces The Times(£) this morning.

In this post, much to my surprise (I have never taken an Uber, and don’t intend to – I don’t like their business model), I come to the defence of Uber.

A closer read of the Times piece reveals that what is being referred to, in documents filed with the High Court in proceedings regarding TfL’s refusal to renew Uber’s licence, are requests to Uber from the police to disclose personal data for the purposes of the prevention and detection of crime or the apprehension or prosecution of offenders.

Such requests are commonly made to thousands of public authorities and private companies. They used to be known in data protection and police circles as “section 29 requests”, after the relevant section of the now-repealed Data Protection Act 1998. The term was a bit misleading: section 29, now effectively replaced by paragraph 2 of Schedule 2 to the Data Protection Act 2018, has the effect of disapplying the provisions of data protection law which would otherwise prevent the disclosure of personal data to the police (or others), where not disclosing would be likely to prejudice the purposes of the prevention and detection of crime or the apprehension or prosecution of offenders. This is a necessary provision of data protection law, and provided that (as with all provisions) it is applied correctly and proportionately, it works very well: it gives controllers the power to disclose personal data to the police where it is necessary for criminal justice.

If Uber are dealing with police requests appropriately, it is for the public good that personal data which assists the police to investigate drug transporting and human trafficking is made available to them.

In fact, I strongly suspect that The Times itself receives such requests from the police. When the requests relate to the paper’s journalistic activities they are probably, and probably rightfully, refused, but the paper may well get requests in respect of its employees’ data, and I would be very surprised if it doesn’t sometimes – as a responsible company – comply with these.

Transport for London certainly receives such requests. Indeed, as a public authority, under its transparency measures, it has habitually made statistics on this public. The most recent publication I can find shows that from 2012 to 2017 TfL received an average of approximately 10,000 requests each year.

Will The Times now report that TfL is handing over to the police thousands of pieces of intelligence on members of the public each year?

Leave a comment

Filed under Data Protection, Data Protection Act 2018, data sharing, police

If ICO won’t regulate the law, it must reboot itself

The exercise of the right of (subject) access under Article 15 of the General Data Protection Regulation (GDPR) is the exercise of a fundamental right to be aware of and verify the lawfulness of the processing of personal data about oneself.

That this is a fundamental right is emphasised by the range of enforcement powers available to the Information Commissioner’s Office (ICO) against those controllers who fail to comply with their obligations in response to an access request. These include the power to impose administrative fines of up to €20m, but also, more prosaically, the power to order the controller to comply with the data subject’s requests to exercise his or her rights. This, surely, is a basic function of the ICO – the sort of regulatory action which underlines its existence. This, much more than operating regulatory sandboxes or publishing normative policy papers, is surely what the ICO is fundamentally there to do.

Yet read this, a letter shown to me recently which was sent by ICO to someone complaining about the handling of an access request:


Dear [data subject],

Further to my recent correspondence, I write regarding the way in which [a London Borough] (The Council) has handled your subject access request.

I have contacted the Council and from the evidence they have provided to me, as stated before, it appears that they have infringed your right to access under the GDPR by failing to comply with your SAR request. However, it does not appear as though they are willing to provide you with any further information and we have informed them of our dissatisfaction with this situation.

It is a requirement under the Data protection Act 2018 that we investigate cases to the ‘extent appropriate’ and after lengthy correspondence with the Council, it appears they are no longer willing co-operate with us to provide this information. Therefore, you may have better results if you seek independent legal advice regarding the matters raised in this particular case.

Here we have the ICO telling a data subject that it will not take action against a public authority data controller which has infringed her rights by failing to comply with an access request. Instead, the requester must seek her own legal advice (almost inevitably at her own significant cost).

Other controllers might look at this and wonder whether they should bother complying with the law, if no sanction arises for failing to do so. And other data subjects might look at it and wonder what is the point in exercising their rights, if the regulator will not enforce them.

This is the most stark single example in a collection of increasing evidence that the ICO is failing to perform its basic tasks of regulation and enforcement.

It is just one data subject, exercising her right. But it is a right which underpins data protection law: if you don’t know and can’t find out what information an organisation has about you, then your ability to exercise other rights is stopped short.

The ICO should reboot itself. It should, before and above all else, perform its first statutory duty – to monitor and enforce the application of the GDPR.

I don’t understand why it does not want to do so.

[P.S. I think the situation described here is different, although of the same species, to situations where ICO finds likely non-compliance but declines to take punitive action – such as a monetary penalty. Here, there is a simple corrective regulatory power available – an enforcement notice (essentially a “steps order”) under section 148 Data Protection Act 2018.]

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under access to information, Data Protection, GDPR, human rights, Information Commissioner

ICO – fines, what fines?

No surprise…but the ICO has issued only four notices of intent to serve a fine since GDPR came into application (and only one fine)

I made a quick Freedom of Information Act (FOIA) request a few weeks ago to the Information Commissioner’s Office (ICO), asking

since 25 May 2018
1) how many notices of intent have been given under paragraph 2(1) of schedule 16 to the Data Protection Act 2018?
2) How many notices of intent given under 1) have not resulted in a monetary penalty notice being given (after the period of 6 months specified in paragraph 2(2) of the same schedule to same Act)?

I have now (4 September) received a response, which says that only four notices of intent have been issued in that time. Three of those are well known: one was in respect of Doorstep Dispensaree (which has since received an actual fine – the only one issued under GDPR – of £275,000); two are in respect of British Airways and of Marriott Inc., which have become long-running, uncompleted sagas. The identity of the recipient of the fourth is not known at the time of writing.

The contrast with some other European data protection authorities is stark: in Spain, around 120 fines have been issued in the same time; in Italy, 26; in Germany (which has separate authorities for its individual regions), 26 also.

Once again, questions must be asked about whether the legislator’s aim, in passing GDPR, of homogenising data protection law across the EU has been anywhere near achieved.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, Data Protection Act 2018, GDPR, Information Commissioner, monetary penalty notice

Elizabeth Denham and international transfers

One question prompted by the news (original source: 2040training) that Elizabeth Denham, the Information Commissioner, is currently working from her home in Canada, is whether the files and matters she is working on, to the extent they contain or constitute personal data, are being transferred to her in accordance with Chapter 5 of the General Data Protection Regulation (GDPR).

Chapter 5’s provisions mean that personal data can only be transferred to a country outside the European Economic Area in certain circumstances. In general, these boil down to: 1) if the European Commission has made an adequacy determination in respect of the country, 2) if Commission-approved standard contractual clauses are in place, 3) if binding corporate rules are in place, 4) if Article 49 derogations for specific situations are in place.

So, can one play a distracting little parlour game looking at what international transfer mechanism Ms Denham and the Information Commissioner’s Office (ICO) in the UK have adopted? No need, says the ICO. What is going on is not an international transfer of the type envisaged by GDPR.

The ICO’s guidance on the subject introduces the not-unhelpful term “restricted transfers” to describe those transfers of personal data to which Chapter 5 of GDPR applies. However, it includes in its category of transfers which are not restricted the following example:

if you are sending personal data to someone employed by you or by your company, this is not a restricted transfer. The transfer restrictions only apply if you are sending personal data outside your organisation

So (at least to the extent that she, as Commissioner, is employed by, or embodies, the ICO) transfers of personal data to Ms Denham in Canada are not restricted transfers to which Chapter 5 of GDPR applies. There is, as it were, a corner of a foreign field that is forever Wilmslow.

The basis for the ICO’s position here, though, is not entirely easy to discern, and the position does not appear to be one that is obviously shared by other data protection authorities, or the European Data Protection Board (unless the latter’s impending guidance on international transfers proves me wrong).

And it does strike me that the ICO’s position is potentially open to abuse. If, for instance, someone decided to set up a medical data analytics company in the UK, with no UK employees but with a branch office in, say, Syria, employing hundreds of people there, to which all of the medical data the company gathered was sent for storage and further processing, would the ICO still take the view that this was not a restricted transfer? Given the intense scrutiny which the CJEU applied to the US surveillance regime in the Schrems litigation, is it really likely that it would agree with a legal approach under which data manifestly ended up in a state whose laws were deficient, yet was not protected by the Chapter 5 provisions?

A similar issue might arise with another aspect of the ICO’s guidance, which implies that a transfer to a country outside the EEA, but which is a transfer to a controller to which the GDPR extra-territorial provisions apply, is also not a restricted transfer. If that controller was in, say South Sudan, would the ICO hold its position?

None of this is to say, of course, that the fact that a transfer may not be a restricted one means that all the other GDPR obligations are set aside. They continue to apply, and, no doubt, Ms Denham and the ICO are doing all they can to comply with them.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, GDPR, Information Commissioner

Why does the UK stop students accessing their mock exam and assignments data?

UPDATE: 23.08.20 In this piece Chris Pounder identifies what the government sees as a justification for the exam scripts exemption. In a document prepared to assist adequacy discussions with the European Commission, it is said that the exemption “aims to protect the integrity of exams by ensuring that exam scripts cannot be accessed outside established processes” (on the basis that exam boards often re-use or re-purpose exam questions). However, and as Chris implies, this simply isn’t sufficient to justify the blanket exemption, nor the breadth of its scope. Moreover the ICO’s meek acceptance that it permits an interpretation which even covers assignments and, presumably, other coursework, is deeply disappointing. END UPDATE.

Domestic data protection law says that students can’t later access data recorded by themselves during an exam or assessment. Why is that? And is it compatible with the UK’s obligations under GDPR and more general human rights law?

As is well known, the General Data Protection Regulation (GDPR) has direct effect on member states of the EU. This is, however, subject to certain provisions which allow member states to legislate for specific exemptions or restrictions. An example is Article 23 of GDPR, which allows member states to restrict by way of a legislative measure the scope of certain data subject rights, including the right of access at Article 15. Such restrictions must, though, respect “the essence of the fundamental rights and freedoms” and be a “necessary and proportionate measure in a democratic society” to safeguard, among a list of things, important objectives of general public interest.

The specific UK restrictions made in respect of Article 23 lie primarily in Schedule 2 to the Data Protection Act 2018. Of particular interest at the current time is the Schedule 2, paragraph 25(1) exemption to the Article 15 right of subject access, which says that the right does “not apply to personal data consisting of information recorded by candidates during an exam” (and paragraph 25(4) says that “‘exam’ means an academic, professional or other examination used for determining the knowledge, intelligence, skill or ability of a candidate and may include an exam consisting of an assessment of the candidate’s performance while undertaking work or any other activity”).

Thus it is that guidance from the Information Commissioner’s Office (ICO) says, in relation to this year’s exam awards

The exam script exemption applies to information that has been recorded by the students themselves during an exam or assessment. Therefore students do not have a right to get copies of their answers from mock exams or assignments used to assess their performance

But why does this exemption exist? Search me. Why did it also exist in the Data Protection Act 1998? Again, search me. Also search Hansard, as I have done, and you may struggle to find out. (Please let me know if I’ve missed something.)

So in what way can the exam script exemption be said to respect the essence of the fundamental rights and freedoms and be a necessary and proportionate measure in a democratic society? Is this a case where Parliament merely nodded through a provision which it also merely nodded through 22 years ago?

Note that this is not a question as to whether information recorded by candidates during an exam is their personal data. It most certainly is, as the CJEU found in 2017 in Nowak. But note also that the court, in that case, observed that “the use of [such] information, one consequence of [the use of the information] being the candidate’s success or failure at the examination concerned, is liable to have an effect on his or her rights and interests, in that it may determine or influence, for example, the chance of entering the profession aspired to or of obtaining the post sought”. The court also noted, in holding that such information was personal data, the importance of the data subject’s rights of access, rectification and objection.

And let us remember recital 63 GDPR, which reminds us that one purpose of the right of subject access is to be able to “verify the lawfulness of the processing”. In the absence of any indication as to why the UK decided to restrict the right of access in such a way as to prevent students, especially this year’s students, accessing their own assignment and mock exam data, one must query how those students can adequately verify the lawfulness of the processing by those who determined their grades.

P.S. there is an argument that the ICO should do something about this, under its Article 57 tasks to monitor and enforce GDPR, to handle complaints from data subjects, and to advise parliament, the government, and other institutions and bodies. It has the power under Article 58 to issue an opinion to those bodies.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, Data Protection Act 2018, GDPR, Information Commissioner, subject access

BA hints at massively reduced size of ICO proposed fine

A new piece by me on the Mishcon de Reya website – BA’s parent company’s latest financial filings indicate it’s planning for (at most?) a €22m fine.


Leave a comment

Filed under Data Protection, Data Protection Act 2018, GDPR, Information Commissioner, monetary penalty notice

High Court – subject access, breach of confidence and the offence of reidentification

An interesting case is being heard in the High Court, arising from an apparent error whereby, in responding to a subject access request (SAR), the London Borough of Lambeth allowed the recipient (and now defendant) data subject to electronically manipulate the information sent to him. This in turn enabled him to remove redactions, and identify someone who had made allegations against him and his wife (about the care they were providing to their child).

This is a nightmare scenario for a controller – inadvertently disclosing extremely sensitive information while responding to a SAR. In this instance, Lambeth have now brought a claim in breach of confidence against the defendant data subject, on the grounds that: the data was provided to the data subject in circumstances where he knew it was confidential; that he breached that confidentiality by unredacting the data, retaining an unredacted copy of the file, using the evidence to write a pre-action letter to the person who made allegations against him and his wife, and threatening to bring court proceedings against them based on the information; and that it is integral to the work of Children’s Services that people who bring to its attention instances of perceived inadequate care or neglect of children are able to do so under conditions of confidentiality and can be assured that their confidentiality will be respected.

The instant proceedings were primarily concerned with a strike-out application by the defendant data subject, on the grounds of non-compliance by Lambeth with its (litigation) disclosure obligations. This application was roundly dismissed, and the matter will proceed to trial.

But of particular note is that, notwithstanding that the original error was Lambeth’s, it was revealed in the proceedings that the Information Commissioner’s Office (ICO) is also prosecuting the defendant data subject on charges of committing the offences of knowingly or recklessly re-identifying de-identified personal data, without the consent of the data controller, and knowingly or recklessly processing re-identified personal data, without the consent of the data controller. These are new offences created by sections 171(1) and 171(5) of the Data Protection Act 2018, and, when that Act was passed, it appeared that the mischief the provisions sought to address was the risk of hackers and fraudsters attempting to identify data subjects from large datasets (see the debates at Bill stage). It will be interesting to see if the ICO’s prosecution here results in a conviction. But it will also be interesting to see if ICO considers similar prosecutions in other circumstances. Although there is a public interest defence (among others) to section 171 charges, it is not an uncommon occurrence for public authorities (particularly) to inadvertently disclose or publish information with imperfect redactions. It certainly appears, on a plain reading of section 171, that someone re-identifying de-identified personal data (even if, say, for idle reasons of curiosity) might not always be able to avail themselves of the public interest defence.

And what is unsaid in the judgment is whether Lambeth are facing any sort of civil regulatory action from the ICO, arising from their error in sending the imperfectly redacted information in the first place.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under anonymisation, Data Protection, Data Protection Act 2018, Information Commissioner, local government, subject access

Yet more delays to proposed ICO BA and Marriott fines

I have this piece on the Mishcon de Reya website. More than a year since they were first proposed, ICO has still not converted its notices of intent into actual fines. Will it ever?

Leave a comment

Filed under Data Protection, Data Protection Act 2018, GDPR, Information Commissioner, monetary penalty notice

COVID-19 and ICO’s proposed fines for BA and Marriott

I have a piece on the Mishcon de Reya website, questioning whether the Coronavirus might fundamentally affect the likelihood of BA and Marriott receiving huge GDPR fines.

Leave a comment

Filed under Data Protection, Data Protection Act 2018, GDPR, Information Commissioner, monetary penalty notice