DCMS Statement of Intent on the Data Protection Bill

Not so much a Statement of Intent, as a Statement of the Bleeding Obvious

The wait is not quite over. We don’t yet have a Data Protection Bill, but we do have a Statement of Intent from DCMS, explaining what the proposed legislation will contain. I thought it would be helpful to do a short briefing note based on my very quick assessment of the Statement. So here it is.

IT’S JUST AN ANNOUNCEMENT OF ALL THE THINGS THE UK WOULD HAVE TO IMPLEMENT ANYWAY UNDER EUROPEAN LAW

By which I mean, it proposes law changes which will be happening in May next year, when the General Data Protection Regulation becomes directly applicable, or changes made under our obligation to implement the Law Enforcement Directive. In a little more detail, here are some things of passing interest, none of which is hugely unexpected.

As predicted by many, at page 8 it is announced that the UK will legislate to require parents to give consent to children’s access to information society services (i.e. online services) where the child is under 13 (rather than GDPR’s default 16). As the UK lobbied to give member states discretion on this, it is no surprise.

Exemptions from compliance with the majority of data protection law when the processing is for the purposes of journalism will remain (page 19). The Statement says that the government

believe the existing exemptions set out in section 32 strike the right balance between privacy and freedom of expression

But of potential note is the suggestion that

The main difference will be to amend provisions relating to the ICO’s enforcement powers to strengthen the ICO’s ability to enforce the re-enacted section 32 exemptions effectively

Without further details it is impossible to know what will be proposed here, but any changes to the existing regime which might have the effect of decreasing the size of the media’s huge carve-out will no doubt be vigorously lobbied against.

There is confirmation (at pages 17 and 18) that third parties (i.e. not just criminal justice bodies) will be able to access criminal conviction information. Again, this is not unexpected – the regime for criminal records checks for employers etc was unlikely to be removed.

The Statement proposes a new criminal offence of intentionally or recklessly re-identifying individuals from anonymised or pseudonymised data, something the Commons Science and Technology Committee has called for. Those who subsequently process such data will also be guilty of an offence. The details here will be interesting to see: as with most privacy-enhancing techniques, anonymisation needs to be stress-tested in order to be robust, and such testing will not be effective if those undertaking it do so at risk of committing an offence, so presumably the forthcoming Bill will provide for this.
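To illustrate why such stress-testing matters (this is purely my own illustration, and nothing drawn from the Statement itself), here is a minimal Python sketch of how naive pseudonymisation – in this case unsalted hashing of a known identifier – can be trivially reversed by anyone able to guess or enumerate candidate identifiers. The identifiers and record are invented for the example:

```python
# Illustrative sketch only: why weak pseudonymisation invites re-identification.
import hashlib

def pseudonymise(identifier: str) -> str:
    # Unsalted hashing is a common but weak pseudonymisation technique.
    return hashlib.sha256(identifier.encode()).hexdigest()

# A "pseudonymised" record as it might appear in a released dataset.
released_record = {"id": pseudonymise("QQ123456C"), "diagnosis": "asthma"}

# An attacker who can guess or enumerate identifiers simply hashes each
# candidate and matches it against the released pseudonym.
for candidate in ["QQ123456A", "QQ123456B", "QQ123456C"]:
    if pseudonymise(candidate) == released_record["id"]:
        print(f"Re-identified: {candidate} -> {released_record['diagnosis']}")
```

It is exactly this kind of adversarial testing which a well-drafted offence would need to accommodate.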

The Bill will also introduce an offence of altering records with intent to prevent disclosure following a subject access request. This will use the current mechanism at section 77 of the Freedom of Information Act 2000. Whether that section itself will be amended (time limits for prosecutions militate against its effectiveness) remains unknown.

I also note that the existing offence of unlawfully obtaining personal data will be widened to cover those who retain personal data against the wishes of the data controller, even where it was initially obtained lawfully. This will probably cover those situations where people gather, or are sent, personal data in error, and then refuse to return it.

There is one particular howler at page 21, which suggests the government doesn’t understand what privacy by design and privacy by default mean:

The Bill will also set out to reassure citizens by promoting the concept of “privacy by default and design”. This is achieved by giving citizens the right to know when their personal data has been released in contravention of the data protection safeguards, and also by offering them a clearer right of redress

Privacy by design/default is about embedding privacy protection throughout the lifecycle of a project or process; it has nothing at all to do with notifying data subjects of breaches, nor with rights of redress. Whether this is a drafting error in the Statement or a fundamental misunderstanding, it is rather concerning that the government, which makes much of “innovation” (around which privacy by design should be emphasised), fails to get this right.

So that’s a whistle-stop tour of the Statement, ignoring all the fluff about implementing things which are required under GDPR and the Directive. I’ll update this piece in due course, if anything else emerges from a closer reading.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

A Massive Impact for the ICO?

[Edited to add: it is well worth reading the comments to this piece, especially the ones from Chris Pounder and Reuben Binns]

I needed a way to break a blogging drought, and something that was flagged up to me by a data protection colleague (thanks Simon!) provides a good opportunity to do so. It suggests that the drafting of the GDPR could lead to an enormous workload for the ICO.

The General Data Protection Regulation (GDPR), which entered into force on 24 May this year and which will apply across the European Union from 25 May 2018, mandates the completion of Data Protection Impact Assessments (DPIAs) where indicated. Article 35 of the GDPR explains that

Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data

In the UK (and indeed elsewhere) we already have the concept of “Privacy Impact Assessments”, and in many ways all that the GDPR does is embed this area of good practice as a legal obligation. However, it also contains some ancillary obligations, one of which is to consult the supervisory authority, in certain circumstances, prior to processing. And here is where I get a bit confused.

Article 36 provides that

The controller shall consult the supervisory authority prior to processing where a data protection impact assessment under Article 35 indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate the risk
[emphasis added]

A close reading of Article 36 results in this: if the data controller conducts a DPIA, and is of the view that, if mitigating measures were not in place, the processing would be high risk, it will have to consult the supervisory authority (in the UK, the Information Commissioner’s Office (ICO)). This is odd: it effectively renders any mitigating measures irrelevant. And it appears directly to contradict what recital 84 says:

Where a data-protection impact assessment indicates that processing operations involve a high risk which the controller cannot mitigate by appropriate measures in terms of available technology and costs of implementation, a consultation of the supervisory authority should take place prior to the processing [emphasis added]

So, the recital says the obligation to consult will arise where high risk is involved which the controller can’t mitigate, while the Article says the obligation will arise where high risk is involved notwithstanding any mitigation in place.

Clearly, the Article contains the specific legal obligation (the recital purports to set out the reason for the contents of the enacting terms), so the law will require data controllers in the UK to consult the ICO every time a DPIA identifies an inherently high risk processing activity, even if the data controller has measures in place fully to mitigate and contain the risk.

For example, let us imagine the following processing activity: the collection and storage of customer financial data for the purposes of fulfilling a web transaction. The controller might have robust data security measures in place, but Article 36 requires it to ask “what if those robust measures were not in place? Would the processing be high risk?” To which the answer would have to be “yes”, because the customer data would be completely unprotected.
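To make the distinction concrete (again, purely my own illustrative sketch, not anything drawn from the GDPR text, and the function and variable names are mine), the two readings can be expressed as a pair of conditions:

```python
# Illustrative sketch of the two competing readings of when prior
# consultation with the supervisory authority (in the UK, the ICO) arises.

def must_consult_article_36(inherently_high_risk: bool,
                            mitigations_contain_risk: bool) -> bool:
    # Article 36 reading: consult wherever the processing would be high risk
    # "in the absence of measures taken by the controller to mitigate the risk",
    # i.e. regardless of whether effective mitigations are actually in place.
    return inherently_high_risk

def must_consult_recital_84(inherently_high_risk: bool,
                            mitigations_contain_risk: bool) -> bool:
    # Recital 84 reading: consult only where the high risk is one the
    # controller "cannot mitigate by appropriate measures".
    return inherently_high_risk and not mitigations_contain_risk

# The web-transaction example above: inherently high risk, but fully mitigated.
print(must_consult_article_36(True, True))   # True  -> duty to consult the ICO
print(must_consult_recital_84(True, True))   # False -> no duty to consult
```

On the first reading almost everything ends up at the ICO’s door; on the second, only genuinely unmitigable risks do.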

In fact, I would submit, if Article 36 is given its plain meaning, virtually any processing activity involving personal data would, in the absence of mitigating measures, be high risk, and so create a duty to consult the ICO.

What this will mean in practice remains to be seen, but unless I am missing something (and I’d be delighted to be corrected if so), the GDPR is setting the ICO and other supervisory authorities up for a massive influx of work. With questions already raised about the ICO’s funding going forward, that is the last thing they are likely to need.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
