Category Archives: Data Protection Act 2018

An Uber-reaction in The Times

“Uber gives police private data on drivers and passengers” announces The Times(£) this morning.

In this post, much to my surprise (I have never taken an Uber, and don’t intend to – I don’t like their business model), I come to the defence of Uber.

A closer read of the Times piece reveals that what is being referred to, in documents filed with the High Court in proceedings over TfL’s refusal to renew Uber’s licence, are requests from the police to Uber to disclose personal data for the purposes of the prevention and detection of crime or the apprehension or prosecution of offenders.

Such requests are commonly made to thousands of public authorities and private companies. They used to be known in data protection and police circles as “section 29 requests”, after the relevant section of the now-repealed Data Protection Act 1998. The term was a bit misleading: section 29, now effectively replaced by paragraph 2 of Schedule 2 to the Data Protection Act 2018, disapplies the provisions of data protection law which would otherwise prevent the disclosure of personal data to the police (or others), where not disclosing would be likely to prejudice the prevention and detection of crime or the apprehension or prosecution of offenders. This is a necessary provision of data protection law, and provided that (as with all provisions) it is applied correctly and proportionately, it works very well: it gives controllers the power to disclose personal data to the police where that is necessary for criminal justice.

If Uber are dealing with police requests appropriately, it is for the public good that personal data which assists the police to investigate drug transporting and human trafficking is made available to them.

In fact, I strongly suspect that The Times itself receives such requests from the police. When the requests relate to the paper’s journalistic activities they are probably, and probably rightly, refused, but the paper may well get requests in respect of its employees’ data, and I would be very surprised if it doesn’t sometimes, as a responsible company, comply with these.

Transport for London certainly receives such requests. Indeed, as a public authority, under its transparency measures, it has habitually made statistics on this public. The most recent publication I can find shows that from 2012 to 2017 TfL received an average of approximately 10,000 requests each year.

Will The Times now report that TfL is handing over to the police thousands of pieces of intelligence on members of the public each year?

Filed under Data Protection, Data Protection Act 2018, data sharing, police

ICO – fines, what fines?

No surprise…but the ICO has issued only four notices of intent to serve a fine since GDPR came into application (and only one actual fine)

I made a quick Freedom of Information Act (FOIA) request a few weeks ago to the Information Commissioner’s Office (ICO), asking

since 25 May 2018
1) how many notices of intent have been given under paragraph 2(1) of schedule 16 to the Data Protection Act 2018?
2) How many notices of intent given under 1) have not resulted in a monetary penalty notice being given (after the period of 6 months specified in paragraph 2(2) of the same schedule to same Act)?

I have now (4 September) received a response, which says that only four notices of intent have been issued in that time. Three of those are well known: one was in respect of Doorstep Dispensaree (which has since received an actual fine – the only one issued under GDPR – of £275,000); two are in respect of British Airways and Marriott Inc., which have become long-running, uncompleted sagas. The identity of the recipient of the fourth is not known at the time of writing.

The contrast with some other European data protection authorities is stark: in Spain, around 120 fines have been issued in the same time; in Italy, 26; in Germany (which has separate authorities for its individual regions), 26 also.

Once again, questions must be asked about whether the aim of the legislator, in passing GDPR, to homogenise data protection law across the EU, has been anywhere near achieved.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Data Protection Act 2018, GDPR, Information Commissioner, monetary penalty notice

Why does the UK stop students accessing their mock exam and assignments data?

UPDATE: 23.08.20 In this piece Chris Pounder identifies what the government sees as a justification for the exam scripts exemption. In a document prepared to assist adequacy discussions with the European Commission, it is said that the exemption “aims to protect the integrity of exams by ensuring that exam scripts cannot be accessed outside established processes” (on the basis that exam boards often re-use or re-purpose exam questions). However, and as Chris implies, this simply isn’t sufficient to justify the blanket exemption, nor the breadth of its scope. Moreover, the ICO’s meek acceptance that it permits an interpretation which even covers assignments and, presumably, other coursework, is deeply disappointing. END UPDATE.

Domestic data protection law says that students can’t later access data recorded by themselves during an exam or assessment. Why is that? And is it compatible with the UK’s obligations under GDPR and more general human rights law?

As is well known, the General Data Protection Regulation (GDPR) has direct effect on member states of the EU. This is, however, subject to certain provisions which allow member states to legislate for specific exemptions or restrictions. An example is Article 23 of GDPR, which allows member states to restrict by way of a legislative measure the scope of certain data subject rights, including the right of access at Article 15. Such restrictions must, though, respect “the essence of the fundamental rights and freedoms” and be a “necessary and proportionate measure in a democratic society” to safeguard, among a list of things, important objectives of general public interest.

The specific UK restrictions made in respect of Article 23 lie primarily in Schedule 2 to the Data Protection Act 2018. Of particular interest at the current time is the Schedule 2, paragraph 25(1) exemption to the Article 15 right of subject access, which says that the right does “not apply to personal data consisting of information recorded by candidates during an exam” (and paragraph 25(4) says that “‘exam’ means an academic, professional or other examination used for determining the knowledge, intelligence, skill or ability of a candidate and may include an exam consisting of an assessment of the candidate’s performance while undertaking work or any other activity”).

Thus it is that guidance from the Information Commissioner’s Office (ICO) says, in relation to this year’s exam awards

The exam script exemption applies to information that has been recorded by the students themselves during an exam or assessment. Therefore students do not have a right to get copies of their answers from mock exams or assignments used to assess their performance

But why does this exemption exist? Search me. Why did it also exist in the 1998 Data Protection Act? Also, search me. Also search Hansard, like I have done, and you may struggle to find out. (Please let me know if I’ve missed something).

So in what way can the exam script exemption be said to respect the essence of the fundamental rights and freedoms, and to be a necessary and proportionate measure in a democratic society? Is this a case where Parliament merely nodded through a provision which it had also merely nodded through 22 years ago?

Note that this is not a question as to whether information recorded by candidates during an exam is their personal data. It most certainly is, as the CJEU found in 2017 in Nowak. But note also that the court, in that case, observed that “the use of [such] information, one consequence of [the use of the information] being the candidate’s success or failure at the examination concerned, is liable to have an effect on his or her rights and interests, in that it may determine or influence, for example, the chance of entering the profession aspired to or of obtaining the post sought”. The court also noted, in holding that such information was personal data, the importance of the data subject’s rights of access, rectification and objection.

And let us remember recital 63 GDPR, which reminds us that one purpose of the right of subject access is to be able to “verify the lawfulness of the processing”. In the absence of any indication as to why the UK decided to restrict the right of access in such a way as to prevent students, especially this year’s students, accessing their own assignment and mock exam data, one must query how those students can adequately verify the lawfulness of the processing by those who determined their grades.

P.S. There is an argument that the ICO should do something about this, under its Article 57 tasks to monitor and enforce GDPR, to handle complaints from data subjects, and to advise parliament, the government, and other institutions and bodies. It has the power under Article 58 to issue an opinion to those bodies.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Data Protection Act 2018, GDPR, Information Commissioner, subject access

A-levels and data protection – potential challenges?

A new post by me on the Mishcon de Reya website, looking at whether GDPR and the DPA offer the potential for challenges to A-level results.

UPDATE: 14.08.20

A rather odd statement has just been put out by the ICO, which suggests that Ofqual has told it that automated decision making didn’t take place. I’ve updated the Mishcon piece to say this:

The ICO has now issued a statement saying that “Ofqual has stated that automated decision making does not take place when the standardisation model is applied, and that teachers and exam board officers are involved in decisions on calculated grades”. This appears at odds with the statement in Ofqual’s “Privacy Impact Assessment”, which states that the process does involve “automated elements as well as human elements”. Whether this means that the Ofqual standardisation model did not involve “solely” automated decision making will no doubt be determined in the various legal challenges which are apparently currently being mounted.

Oddly, the ICO also says that concerns should be raised with exam boards first, before the ICO will get involved. This does not immediately appear to be in line with the ICO’s obligation to handle complaints, under Article 57 of GDPR (which doesn’t say anything about data subjects having to raise concerns with someone else first).

Filed under accuracy, Data Protection, Data Protection Act 2018, GDPR, Information Commissioner

BA hints at massively reduced size of ICO proposed fine

A new piece by me on the Mishcon de Reya website – BA’s parent company’s latest financial filings indicate it’s planning for (at most?) a €22m fine.

Filed under Data Protection, Data Protection Act 2018, GDPR, Information Commissioner, monetary penalty notice

High Court – subject access, breach of confidence and the offence of reidentification

An interesting case is being heard in the High Court, arising from an apparent error whereby, in responding to a subject access request (SAR), the London Borough of Lambeth allowed the recipient (and now defendant) data subject to electronically manipulate the information sent to him. This in turn enabled him to remove redactions, and identify someone who had made allegations against him and his wife (about the care they were providing to their child).

This is a nightmare scenario for a controller – to inadvertently disclose extremely sensitive information while responding to a SAR. In this instance, Lambeth have now brought a claim in breach of confidence against the defendant data subject, on the grounds that: the data was provided to the data subject in circumstances where he knew it was confidential; that he breached that confidentiality by unredacting the data, retaining an unredacted copy of the file, using the evidence to write a pre-action letter to the person who made allegations against him and his wife, and threatening to bring court proceedings against them based on the information; and that it is integral to the work of Children’s Services that people who bring to its attention instances of perceived inadequate care or neglect of children are able to do so under conditions of confidentiality, and can be assured that their confidentiality will be respected.

The instant proceedings were primarily concerned with a strike-out application by the defendant data subject, on the grounds of non-compliance by Lambeth with its (litigation) disclosure obligations. This application was roundly dismissed, and the matter will proceed to trial.

But of particular note is that, notwithstanding that the original error was Lambeth’s, it was revealed in the proceedings that the Information Commissioner’s Office (ICO) is also prosecuting the defendant data subject on charges of committing the offences of knowingly or recklessly re-identifying de-identified personal data, without the consent of the data controller, and knowingly or recklessly processing re-identified personal data, without the consent of the data controller. These are new offences created by sections 171(1) and 171(5) of the Data Protection Act 2018, and, when that Act was passed, it appeared that the mischief the provisions sought to address was the risk of hackers and fraudsters attempting to identify data subjects from large datasets (see the debates at Bill stage). It will be interesting to see if the ICO’s prosecution here results in a conviction. But it will also be interesting to see if ICO considers similar prosecutions in other circumstances. Although there is a public interest defence (among others) to section 171 charges, it is not an uncommon occurrence for public authorities (particularly) to inadvertently disclose or publish information with imperfect redactions. It certainly appears, on a plain reading of section 171, that someone re-identifying de-identified personal data (even if, say, for idle reasons of curiosity) might not always be able to avail themselves of the public interest defence.

And what is unsaid in the judgment is whether Lambeth are facing any sort of civil or regulatory action from the ICO, arising from their error in sending the imperfectly redacted information in the first place.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under anonymisation, Data Protection, Data Protection Act 2018, Information Commissioner, local government, subject access

Podcast – GDPR two years on

Here’s a podcast I recently recorded with my Mishcon de Reya colleague Adam Rose, looking at some of the issues we think are salient two years after GDPR became directly applicable in the U.K.

Filed under Data Protection, Data Protection Act 2018, GDPR

Yet more delays to proposed ICO BA and Marriott fines

I have this piece on the Mishcon de Reya website. More than a year since they were first proposed, ICO has still not converted its notices of intent into actual fines. Will it ever?

Filed under Data Protection, Data Protection Act 2018, GDPR, Information Commissioner, monetary penalty notice

COVID-19 and ICO’s proposed fines for BA and Marriott

I have a piece on the Mishcon de Reya website, questioning whether the Coronavirus might fundamentally affect the likelihood of BA and Marriott receiving huge GDPR fines.

Filed under Data Protection, Data Protection Act 2018, GDPR, Information Commissioner, monetary penalty notice

Why the big pause? ICO delay agreed re GDPR fines

On the Mishcon website: ICO agrees delay over GDPR fines with both BA and Marriott

Filed under Data Protection, Data Protection Act 2018, enforcement, GDPR, Information Commissioner, monetary penalty notice