Category Archives: Data Protection

Hospital episode data – confidential data uploaded by mistake

Rather hidden away in the new IIGOP annual report is a worrying and revealing account of a serious data breach involving hospital episode data

In February last year Tim Kelsey, NHS England’s National Director for Patients and Information, and vocal cheerleader for the care.data initiative, assured the public, in an interview on the Radio 4 Today programme, that in the twenty-five years that Hospital Episode Statistics (HES) have been shared with other organisations

the management of the hospital episode database…there has never been a single example of that data being compromised, the privacy of patients being compromised…

However, as Sir Nick Partridge’s Review of Data Releases by the NHS Information Centre in June of last year revealed, there had been

lapses in the strict arrangements that were supposed to be in place to ensure that people’s personal data would never be used improperly

As I said at the time

One waits with interest to see whether the [Information Commissioner’s Office (ICO)] will take any enforcement action, but I think it’s important that they consider doing so, because, even though Sir Nick makes nine very sensible recommendations to HSCIC, one could be forgiven – having been given clear assurances previously, by the likes of Tim Kelsey and others – for having reservations as to future governance of our confidential medical data

Now, with the launch of the first annual report of the Independent Information Governance Oversight Panel (IIGOP), chaired by Dame Fiona Caldicott and established at the request of the Secretary of State to “advise, challenge and report on the state of information governance across the health and care system in England”, we see further evidence of HES data “being compromised, the privacy of patients being compromised”. The report informs us of an incident whereby

New inspection procedures introduced by the HSCIC had uncovered a number of organisations which were sending HES data and failing to follow data dictionary standards. This meant they were inadvertently enabling personal confidential data to enter the data base. Following an alert to the Information Commissioners’ Office this was understood as a large scale problem, although having a low level potential impact, as the affected data fields were unknown to either senders or receivers of HES data. The relevant organisations were contacted to gain their cooperation in closing the breach, without alerting any unfriendly observer to the location of the confidential details. This was important to preserve the general ignorance of the detail of the breach and continue to protect individuals’ privacy. Trusts and others were encouraged to provide named contacts who would then start cleaning up their data flows to the HSCIC. In order to manage any untoward reporting in the media, trade titles were informed and briefed about the importance of restricting their reporting to avoid any risk of leading people towards this confidential data.

Now this to me seems pretty serious: the failure to “follow data dictionary standards” by data controller organisations who were sending HES data sounds very likely to be a contravention of those data controllers’ obligation, under section 4(4) of the Data Protection Act 1998 (DPA), to comply with the seventh data protection principle, which requires that they take

Appropriate technical and organisational measures…against unauthorised or unlawful processing of personal data

Serious contraventions, of a kind likely to cause substantial damage or substantial distress, can result in the ICO serving a monetary penalty notice, under section 55A of the DPA, to a maximum of £500,000.

So, what does one make of these incidents? It’s hard to avoid the conclusion that they would be held to be “serious”, and if the data in question had been misused, there would have been the potential for substantial damage and substantial distress – public disclosure of hospital record data could have a multitude of pernicious effects – and this much is evidenced by the fact that (successful) attempts had to be made to avoid the errors coming to light, including asking journalists to avoid reporting. But were they contraventions likely to cause these things? IIGOP suggests that they had a “low level potential impact” because the data was hidden within large amounts of non-offensive data, and I think it is probably the case that the incidents would not be held to have been likely to cause substantial damage or substantial distress (in Niebel, the leading case on monetary penalty notices, Wikeley J in the Upper Tribunal accepted that the word “likely” in s55A DPA took the same meaning attributed to it by Munby J, in R (Lord) v Secretary of State for the Home Department [2003] EWHC 2073 (Admin), namely “‘likely’ meant something more than ‘a real risk’, i.e. a significant risk, ‘even if the risk falls short of being more probable than not'”).

But a monetary penalty notice is not the only action open to the ICO. He has the power to serve enforcement notices, under s40 DPA, to require data controllers to do, or refrain from doing, specified actions, or to take informal action such as requiring the signing of undertakings (to similar effect). Given that we have heard about these incidents from IIGOP, and in an annual report, it seems unlikely that any ICO enforcement action will be forthcoming. Perhaps that’s correct as a matter of law and as a matter of the exercise of discretion, but in my view the ICO has not been vocal enough about the profound issues raised by the amalgamation and sharing of health data, and the concerns raised by incidents of potentially inappropriate or excessive processing. Care.data of course remains on the agenda, and the IIGOP report is both revealing and encouragingly critical of what has taken place so far, but one would not want a situation to emerge where the ICO took a back seat and allowed IIGOP (which lacks regulatory and enforcement powers) to deal with the issue.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under care.data, Data Protection, data sharing, Information Commissioner, NHS

Data protection implications of sale of Tesco Clubcard company


News that Tesco is considering selling its loyalty card business Dunnhumby raises questions about what might happen to cardholders’ personal data

In 1995 the then Chairman of Tesco, Lord MacLaurin, reportedly said to the creators of the Tesco Clubcard scheme

What scares me about this is that you know more about my customers after three months than I know after 30 years.

Since then the sophistication and power of data analytics have increased exponentially and Dunnhumby claims it analyses data from 770 million-plus customers, about 16.5 million of whom are – it seems – Tesco Clubcard members. Dunnhumby, as a data processor for Tesco, processes the personal data of those millions of members, so what happens if the business is sold? Does the customer database also get sold? If so, what are the data protection implications?

Sales of customer databases can be effected lawfully and in compliance with the Data Protection Act 1998 (DPA), as the Information Commissioner’s Office explains in helpful guidance

When a database is sold, the seller must make sure that the buyer understands that they can only use the information for the purposes for which it was collected. Any use of this personal information should be within the reasonable expectations of the individuals concerned. So, when a database is sold, its use should stay the same or similar. For example, if the database contains information obtained for insurance, the database should only be sold to another insurance-based business providing similar insurance products. Selling it to a business for a different use is likely to be incompatible  with the original purpose and likely to go beyond the expectations of the individuals.

The operative words there are, I suggest, “expectations of the individuals concerned”. “Reasonable expectations” are strongly linked to the first principle in Schedule One of the DPA, which requires that “personal data shall be processed fairly and lawfully…”. The interpretative provisions in Part II of Schedule One explain that broadly, for processing to be fair, data subjects should be told who is doing the processing, and why. These provisions are the genesis of the “privacy notices” and “privacy policies” which so few of us take the time to read. But their Clubcard privacy policy is where things might become problematic for Tesco in the event that they propose to sell Dunnhumby and cardholders’ data. As Twitter user @NoDPISigma points out, the Customer Charter says

We would like to reassure you that your personal details are safe with us and will never be released to companies outside the Tesco Group for their marketing purposes

and the separate Privacy and Cookies Policy also says

Your personal information is safe with us and will never be released to companies outside the Tesco Group for their marketing purposes

Although at first blush it is difficult to see that as anything other than an unequivocal promise that cardholders’ personal data will never be sold, the rub is in the phrase “for their marketing purposes”. If the sale of Dunnhumby and cardholders’ data is to another company in order that that other company can continue to operate the Clubcard scheme on behalf of Tesco then, as long as that was all that the data continued to be used for, I don’t think it would be a release of personal data to a company for that company’s marketing purposes. If, however, the purchasing company intended to use the data for its own marketing purposes, then the sale might be a breach of the charter promise – and, in that event, it would be strongly arguable that the sale could give rise to a serious contravention of Tesco’s obligation (at section 4(4) of the DPA) to comply with the fairness principle.

And among those 16.5 million Clubcard holders there are likely to be some awkward so-and-sos who might bring legal challenges in those circumstances.

[This post was edited because in its first draft it failed properly to consider the issue of data controller/processor. Thanks to Rich Greenhill for prompting me into a redraft]

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, marketing

Should victims of “revenge porn” be granted anonymity?

I got into an interesting twitter discussion a few days ago with a journalist who had run a story* about a woman convicted under the Malicious Communications Act 1988 (MCA) for uploading a sex tape involving a former friend of hers. The story named the offender, but also the victim, and I asked Luke Traynor, the Mirror journalist, whether he had considered not naming the latter, who was the victim of what I described as a “sexual crime”.  To his credit, Luke replied, saying that he’d “Checked the law, and she’s not a sexual crime victim, but a victim of malicious communication”.

I think Luke is partly correct – a victim of a section 1 MCA offence is not classed as a victim of a specified sexual offence pursuant to section 2 of the Sexual Offences (Amendment) Act 1992, and is not, therefore, automatically granted lifetime anonymity from the press under section 1. This is the case even where – as here – the crime was a targeted attempt to embarrass or damage the victim on the basis of their sexual behaviour. The Mirror even described this case as one of “Revenge Porn” and, indeed, moves are currently being made to create a specific offence of disclosing private sexual photographs and films with intent to cause distress (clause 33 of the Criminal Justice and Courts Bill refers). If that Bill is passed, I would argue that serious thought should be given to awarding anonymity to victims of this offence.

But merely because statutory anonymity was not available to the victim of the offence reported by the Mirror it does not mean that it was right to name her, and (as you might expect from me) I think that data protection law is in play. Information relating to an identifiable individual’s sexual life is her sensitive personal data, afforded particular protection under the Data Protection Directive 95/46 and the UK Data Protection Act 1998 (DPA) to which it gives domestic effect. Publication of sensitive personal data without one of the conditions in Schedule 3 of the DPA being met (and I cannot see which would be met in this instance) is as a general rule unlawful. There is though, at section 32 of the DPA, as I have written about recently, an effective exemption from most of the Act for personal data processed only for the purposes of journalism. I suspect The Mirror, or any other media outlet naming the victim in this case, would claim this exemption, but it is important to note that, as broad as the exemption is, it can only be claimed if

the data controller reasonably believes that, having regard in particular to the special importance of the public interest in freedom of expression, publication would be in the public interest, and…the data controller reasonably believes that, in all the circumstances, compliance with that provision is incompatible with [journalism]

I invited Luke to explain whether he thought that publication of the victim’s name was in the public interest, but his reply

It was said in a public court, in accordance with the law, which takes into account ethics and public interest

did not really deal with the section 32 point – just because something was said in public court it does not mean that it is in the public interest to publish it. And unless Luke (or, rather, the Mirror, as data controller) reasonably believed that it was so, the exemption falls away.

Of course, in the absence of any complaint from the individual, all of this might seem otiose. But I think it raises further important issues about the extent of the section 32 exemption, as well as whether there should be some clearer right to privacy for victims of certain types of communications offences.

And, as Tim Turner pointed out, this sort of story shows why some might want to exercise a “right to be forgotten” – if unnecessary and unfair information is published about them on the internet, can some people be blamed for wanting it removed, or made less prominent?

*I have avoided linking directly to the article in question for reasons which should be obvious, given the content of this post. However, it is not difficult to find. That, of course, is the problem. 

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.



Filed under communications offence, Data Protection, Privacy

Are we all journalists?

The ICO has said that Global Witness can claim the data protection exemption for journalism, regarding their investigations into BSGR. This fascinating case continues to raise difficult and important questions.

Data protection law rightly gives strong protection to journalism; this is something that the 2012 Leveson inquiry dealt with in considerable detail, but, as the inquiry’s terms of reference were expressly concerned with “the press”, with “commercial journalism”, it didn’t really grapple with the rather profound question of “what is journalism?” But the question does need to be asked, because in the balancing exercise between privacy and freedom of expression too much weight afforded to one side can result in detriment to the other. If personal privacy is given too much weight, freedom of expression is weakened, but equally if “journalism” is construed too widely, and the protection afforded to journalism is consequently too wide, then privacy rights of individuals will suffer.

In 2008 the Court of Justice of the European Union (CJEU) was asked, in the Satamedia case, to consider the extent of the exemption from a large part of data protection law for processing of personal data for “journalistic” purposes. Article 9 of the European Data Protection Directive (the Directive) provides that

Member States shall provide for exemptions or derogations…for the processing of personal data carried out solely for journalistic purposes or the purpose of artistic or literary expression only if they are necessary to reconcile the right to privacy with the rules governing freedom of expression.

and recital 37 says

Whereas the processing of personal data for purposes of journalism or for purposes of literary or artistic expression, in particular in the audiovisual field, should qualify for exemption from the requirements of certain provisions of this Directive in so far as this is necessary to reconcile the fundamental rights of individuals with freedom of information and notably the right to receive and impart information

In Satamedia one of the questions the CJEU was asked to consider was whether the publishing of public-domain taxpayer data by two Finnish companies could be “regarded as the processing of personal data carried out solely for journalistic purposes within the meaning of Article 9 of the directive”. To this, the Court replied “yes”

Article 9 of Directive 95/46 is to be interpreted as meaning that the activities [in question], must be considered as activities involving the processing of personal data carried out ‘solely for journalistic purposes’, within the meaning of that provision, if the sole object of those activities is the disclosure to the public of information, opinions or ideas [emphasis added]

One can see that, to the extent that Article 9 is transposed effectively in domestic legislation, it affords significant and potentially wide protection for “journalism”. In the UK it is transposed as section 32 of the Data Protection Act 1998 (DPA). This provides that

Personal data which are processed only for the special purposes are exempt from any provision to which this subsection relates if—

(a)the processing is undertaken with a view to the publication by any person of any journalistic, literary or artistic material,

(b)the data controller reasonably believes that, having regard in particular to the special importance of the public interest in freedom of expression, publication would be in the public interest, and

(c)the data controller reasonably believes that, in all the circumstances, compliance with that provision is incompatible with the special purposes.

where “the special purposes” are one or more of “the purposes of journalism”, “artistic purposes”, and “literary purposes”. Section 32 DPA exempts data processed for the special purposes from all of the data protection principles (save the 7th, data security, principle) and, importantly, from the provisions of sections 7 and 10. Section 7 is the “subject access” provision, and normally requires a data controller, upon receipt of a written request by an individual, to inform them if their personal data is being processed, and, if it is, to give the particulars and to “communicate” the data to the individual. Section 10 broadly allows a data subject to object to processing which is likely to cause substantial damage or substantial distress, and to require the data controller to cease (or not begin) processing (and the data controller must either comply or state reasons why it will not). Personal data processed for the special purposes are, therefore, exempt from subject access and from the right to prevent processing likely to cause damage or distress. It is not difficult to see why – if the subject of, say, investigative journalism, could find out what a journalist was doing, and prevent her from doing it, freedom of expression would be inordinately harmed.

The issue of the extent of the journalistic data protection exemption came into sharp focus towards the end of last year, when Benny Steinmetz and three other claimants employed by or associated with mining and minerals group Benny Steinmetz Group Resources (BSGR) brought proceedings in the High Court under the DPA seeking orders that would require campaigning group Global Witness to comply with subject access requests by the claimants, and to cease processing their data. The BSGR claimants had previously asked the Information Commissioner’s Office (ICO), pursuant to the latter’s duties under section 42 DPA, to assess the likelihood of the lawfulness of Global Witness’s processing, and the ICO had determined that it was unlikely that Global Witness were complying with their obligations under the DPA.

However, under section 32(4) DPA, if, in any relevant proceedings, the data controller claims (or it appears to the court) that the processing in question was for the special purposes and with a view to publication, the court must stay the proceedings in order for the ICO to consider whether to make a specific “special purposes” determination. Such a determination would be (under section 45 DPA) that the processing was not for the special purposes nor was it with a view to publication, and it would result in a “special information notice”. Such a stay was applied to the BSGR proceedings and, on 15 December, after some considerable wait, the ICO conveyed to the parties that it was “satisfied that Global Witness is only processing the personal data requested … for the purposes of journalism”. Accordingly, no special information notice was served, and the proceedings remain stayed. Although media reports (e.g. Guardian and Financial Times) talk of appeals and tribunals, no direct appeal right exists for a data subject in these circumstances, so, if as seems likely, BSGR want to revive the proceedings, they will presumably either have to apply to have the stay lifted and/or issue judicial review proceedings against the ICO.

The case remains fascinating. It is easy to applaud a decision in which a plucky environmental campaign group claims journalistic data protection exemption regarding its investigations of a huge mining group. But would people be so quick to support, say, a fascist group which decided to investigate and publish private information about anti-fascist campaigners? Could that group also gain data protection exemption claiming that the sole object of their processing was the disclosure to the public of information, opinions or ideas? Global Witness say that

The ruling confirms that the Section 32 exemption for journalism in the Data Protection Act applies to anyone engaged in public-interest reporting, not just the conventional media

but it is not immediately clear from where they import the “public-interest” aspect – this does not appear, at least not in explicit terms, in either the Directive or the DPA. It is possible that it can be inferred, when one considers that processing for special purposes which is not in the public interest might constitute an interference with respect for data subjects’ fundamental rights and freedoms (per recital 2 of the Directive). And, of course, with talk about public interest journalism, we walk straight back into the arguments provoked by the Leveson inquiry.

Furthermore, one notes that the Directive talks about exemption for processing of personal data carried out solely for journalistic purposes, and the DPA says “personal data which are processed only for the special purposes are exempt…”. This was why I emphasised the words in the Satamedia judgment quoted above, which talks similarly of the exemption applying if the “sole object of those activities is the disclosure to the public of information, opinions or ideas”. One might ask whether a campaigning group’s sole or only purpose for processing personal data is for journalism. Might they not, in processing the data, be trying to achieve further ends? Might, in fact, one say that the people who engage solely in the disclosure to public of information, opinions or ideas are in fact those we more traditionally think of in these terms…the press, the commercial journalists?

P.S. Global Witness have uploaded a copy of the ICO’s decision letter. This clarifies that the latter was satisfied that the former was processing for the special purposes because it was part of “campaigning journalism” even though the proposed future publication of the information “forms part of a wider campaign to promote a particular cause”. This chimes with the ICO’s data protection guidance for the media, but it will be interesting if it is challenged on the basis that it doesn’t support a view that the processing is “only” or “solely” for the special purposes.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Directive 95/46/EC, Information Commissioner, journalism, Leveson

Hidden data in FOI disclosures

The Hackney Gazette reports that details of 15,000 residents have been published on the internet after Hackney Council apparently inadvertently disclosed the data when responding to a Freedom of Information (FOI) request made using the WhatDoTheyKnow site.

This is not the first time that such apparently catastrophic inadvertent disclosures have happened through WhatDoTheyKnow, and, indeed, in 2012 MySociety, who run the site, issued a statement following a similar incident with Islington Council. As that made clear

responses sent via WhatDoTheyKnow are automatically published online without any human intervention – this is the key feature that makes this site both valuable and popular

It is clearly the responsibility of the authorities in question to ensure that no hidden or exempt information is included in FOI disclosures via WhatDoTheyKnow, or indeed, in FOI disclosures in general. A failure to have appropriate organisational and technical safeguards in place can lead to enforcement action by the Information Commissioner’s Office for contraventions of the Data Protection Act 1998 (DPA): Islington ended up with a monetary penalty notice of £70,000 for their incident, which involved 2,000 people. Although the number of data subjects involved is not the only factor the ICO will take into account when deciding what action to take, it is certainly a relevant one: 15,000 affected individuals is a hell of a lot.

What concerns me is that this sort of thing keeps happening. We don’t know the details of this incident yet, but with such large numbers of data subjects involved it seems likely that it will have involved some sort of dataset, and I would not be at all surprised if it involved purportedly masked or hidden data, such as in a pivot table [EDIT – I’m given to understand that this incident involved cached data in MS Excel]. Around the time of the Islington incident the ICO’s Head of Policy Steve Wood published a blog post drawing attention to the risks. A warning also takes the form of a small piece on a generic page about request handling, which says

take care when using pivot tables to anonymise data in a spreadsheet. The spreadsheet will usually still contain the detailed source data, even if this is hidden and not immediately visible at first glance. Consider converting the spreadsheet to a plain text format (such as CSV) if necessary.
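The ICO’s warning can be made concrete. An .xlsx file is, under the hood, a zip archive, and pivot caches – which hold a copy of the full source data – are stored as separate parts inside it. A minimal sketch of a pre-release check (the helper name and the simulated workbook are my own illustration, not any official tool) that scans a workbook package for pivot-cache parts before it is disclosed:

```python
import io
import zipfile

# Pivot caches in an .xlsx package live under this path and
# typically contain the full, unaggregated source data.
RISKY_PREFIXES = ("xl/pivotCache/",)

def risky_parts(xlsx_bytes):
    """Return the names of workbook parts that may leak hidden source data."""
    with zipfile.ZipFile(io.BytesIO(xlsx_bytes)) as zf:
        return [name for name in zf.namelist() if name.startswith(RISKY_PREFIXES)]

# Simulate a minimal .xlsx package (a zip archive) containing a pivot cache.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("xl/worksheets/sheet1.xml", "<worksheet/>")
    zf.writestr("xl/pivotCache/pivotCacheRecords1.xml", "<pivotCacheRecords/>")

print(risky_parts(buf.getvalue()))
```

A check like this is no substitute for the ICO’s advice to export to a plain-text format such as CSV – which by its nature carries only the visible cell values – but it illustrates how readily the hidden data can be found by anyone who looks.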

This is fine, but does it go far enough? Last year I wrote on the Guardian web site, and called for greater efforts to be made to highlight the issue. I think that what I wrote then still holds

The ICO must work with the government to offer advice direct to chief executives and those responsible for risk at councils and NHS bodies (and perhaps other bodies, but these two sectors are probably the highest risk ones). So far these disclosure errors do not appear to have led to harm to those individuals whose private information was compromised, but, without further action, I fear it is only a matter of time.

Time will tell whether this Hackney incident results in a finding of DPA contravention, and ICO enforcement, but in the interim I wish the word would get spread around about how to avoid disclosing hidden data in spreadsheets.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Freedom of Information, Information Commissioner, monetary penalty notice

FOI disclosure of personal data: balancing of interests

In June this year I blogged about the case of AB v A Chief Constable (Rev 1) [2014] EWHC 1965 (QB). In that case, Mr Justice Cranston had held that, when determining whether personal data is being or has been processed “fairly” (pursuant to the first principle of Schedule One of the Data Protection Act 1998 (DPA))

assessing fairness involves a balancing of the interests of the data subject in non-disclosure against the public interest in disclosure [¶75]

I was surprised by this reading in of an interests balance to the first principle, and said so in my post. Better people than I disagreed, and I certainly am even less sure now than I was of the correctness of my view.

In any case, the binding authority of the High Court rather trumps my meanderings, and it is cited in a recent decision of the First-tier Tribunal (Information Rights) in support of a ruling that the London Borough of Merton Council must disclose, under the Freedom of Information Act 2000 (FOIA), an email sent to a cabinet member of that council by Stephen Hammond MP. The Tribunal, in overturning the decision of the Information Commissioner, considered the private interests of Mr Hammond, including the fact that he had objected to the disclosure, but felt that these did not carry much weight:

we do not consider anything in the requested information to be particularly private or personal and that [sic] this substantially weakens the weight of interest in nondisclosure…We accept that Mr Hammond has objected to the disclosure, which in itself carries some weight as representing his interests. However, asides from an expectation of a general principle of non-disclosure of MP correspondence, we have not been given any reason for this. We have been given very little from the Commissioner to substantiate why Members of Parliament would have an expectation that all their correspondence in relation to official work remain confidential

and balanced against these were the public interests in disclosure, including

no authority had been given for the statement [in the ICO’s decision notice] that MPs expect that all correspondence to remain confidential…[;]…withholding of the requested information was not compatible with the principles of accountability and openness, whereby MPs should subject themselves to public scrutiny, and only withhold information when the wider public interest requires it…[;]…the particular circumstances of this case [concerning parking arrangements in the applicant’s road] made any expectation of confidentiality unreasonable and strongly indicated that disclosure would be fair

The arguments weighed, said the Tribunal, strongly in favour of disclosure.

A further point fell to be considered, however: for processing of personal data to be fair and lawful (per the first data protection principle) there must be met, beyond any general considerations, a condition in Schedule Two DPA. The relevant one, condition 6(1) requires that

The processing is necessary for the purposes of legitimate interests pursued by the data controller or by the third party or parties to whom the data are disclosed, except where the processing is unwarranted in any particular case by reason of prejudice to the rights and freedoms or legitimate interests of the data subject

It has to be noted that “necessary” here in the DPA imports a human rights proportionality test and it “is not synonymous with ‘indispensable’…[but] it implies the existence of a ‘pressing social need'” (The Sunday Times v United Kingdom (1979) 2 EHRR 245). The Tribunal, in what effectively was a reiteration of the arguments about general “fairness”, accepted that the condition would be met in this case, citing the applicant’s arguments, which included the fact that

disclosure is necessary to meet the public interest in making public what Mr Hammond has said to the Council on the subject of parking in Wimbledon Village, and that as an elected MP, accountable to his constituents, disclosure of such correspondence cannot constitute unwarranted prejudice to his interests.

With the exception of certain names within the requested information, the Tribunal ordered disclosure. Assessing “fairness” now, following Mr Justice Cranston, and not following me, clearly does involve balancing the interests of the data subject against the public interest in disclosure.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

1 Comment

Filed under Data Protection, Freedom of Information, Information Commissioner, Information Tribunal

The wrong test for anonymisation?

UPDATE: 23.01.15 The ICO has responded [.doc file] to my request for a review of their decision. I drew their attention to the arguments on this page but they don’t even mention them, let alone provide a counter-analysis, in dismissing my complaints (“Having reviewed the matter, I agree with the explanations provided”). I am invited by the ICO to consider taking my own legal action. I understand that the ICO and I might have differing views on a DPA matter, but what I find difficult to accept is the refusal even to enter into a discussion with me about the detailed arguments I’ve made. END UPDATE

In February this year I asked the Information Commissioner’s Office (ICO) to investigate reports that Hospital Episode Statistics (HES) data had apparently been sold to an actuarial society by the NHS Information Centre (NHSIC), the predecessor to the Health and Social Care Information Centre (HSCIC). Specifically I requested, as a data subject can under s42 of the Data Protection Act 1998 (DPA), that the ICO assess whether it was likely or not that the processing of my personal data by NHSIC and others had been in compliance with the DPA.

Nine months later, I was still awaiting the outcome. But a clue to how the assessment would turn out was contained in the text of Sir Nick Partridge’s six month review of various data releases by NHSIC (his original report in June seemed to me to point to multiple potential DPA contraventions). In the review document he says

Six investigations have been separately instigated by the HSCIC or Information Commissioner’s Office (ICO) and shared with both parties as these focussed on whether individuals were at risk of being identified. In the cases it has investigated, the ICO has upheld the HSCIC approach and informed us that it has “seen no evidence to suggest that re-identification has occurred or is reasonably likely to occur.”

And sure enough, after chasing the ICO for the outcome of my nine-month wait, I received this (in oddly formatted text, which rather whiffed of a lot of cutting-and-pasting)

Following the recent issue regarding HSCIC, PA Consulting, and Google we investigated the issue of whether HES data could be considered personal data. Furthermore we also investigated whether this information had been linked to other data to produce “personal data” which was subject to the provisions of the Act. We have no evidence that there has been any re-identification either on the part of PA Consulting or Google. We also noted that HSCIC have stated that the HES dataset does not include individual level patient data even at a pseudonymised level. Our view is that the data extracted and provided to PA Consulting did not identify any individuals and there was no reasonable likelihood that re-identification would be possible.
I have added the emphasis to the words “reasonable likelihood” above. They appear in similar terms in the Partridge Review, and they struck me as rather odd. An awful lot of analysis has taken and continues to take place on the subject of when personal data can be “rendered fully anonymous in the sense that it is information from which the data subject is no longer identifiable” (Lord Hope’s dicta in Common Services Agency v Scottish Information Commissioner [2008] UKHL 47). Some of that analysis has been academic, some takes the form of “soft law” guidance, for instance Opinion 05/2014 of the Article 29 Working Party, and the ICO Anonymisation Code of Practice. The former draws on the Data Protection Directive 95/46/EC, and notes that

Recital 26 signifies that to anonymise any data, the data must be stripped of sufficient elements such that the data subject can no longer be identified. More precisely, that data must be processed in such a way that it can no longer be used to identify a natural person by using “all the means likely reasonably to be used”

Anonymisation has also been subject to judicial analysis, notably in the Common Services Agency case, but, even more importantly, in the judgment of Mr Justice Cranston in Department of Health v Information Commissioner ([2011] EWHC 1430). The latter case, involving the question of disclosure of late-term abortion statistics, is by no means an easy judgment to parse (ironically so, given that it makes roughly the same observation about the Common Services Agency case). The judge held that the First-tier Tribunal had been wrong to say that the statistics in question were personal data, but that it had on the evidence been entitled to say that “the possibility of identification by a third party from these statistics was extremely remote”. The fact that the possibility of identification by a third party was extremely remote meant that “the requested statistics were fully anonymised” (¶55). I draw from this that for personal data to be anonymised in statistical format the possibility of identification of individuals by a third party must be extremely remote. The ICO’s Anonymisation Code, however, says of the case:

The High Court in the Department of Health case above stated that the risk of identification must be greater than remote and reasonably likely for information to be classed as personal data under the DPA [emphasis added]

But this seems to me to be an impermissible description of the case – the High Court did not state what the ICO says it stated – the phrases “greater than remote” and “reasonably likely” do not appear in the judgment. And that phrase “reasonably likely” is one that, as I say, makes its way into the Partridge Review, and the ICO’s assessment of the lawfulness of HES data “sale”.

I begin to wonder if the ICO has taken the phrase from recital 26 of the Directive, which talks about the need to consider “all the means likely reasonably to be used” to identify an individual, and transformed it into a position from which, if identification is not reasonably likely, it will accept that data are anonymised. This cannot be right: there is a world of difference between a test which considers whether the possibility of identification is “extremely remote” and one which considers whether it is “reasonably likely”.

I do not have a specific right to a review of the section 42 assessment decision that the processing of my personal data was likely in compliance with NHSIC’s obligations under the DPA, but I have asked for one. I am aware of course that others complained (après moi, le déluge), notably, in March, FIPR, MedConfidential and Big Brother Watch. I suspect they will also be pursuing this.

In October this year I attended an event at which the ICO’s Iain Bourne spoke. Iain was a key figure in the drawing up of the ICO’s Anonymisation Code, and I took the rather cheeky opportunity to ask about the HES investigations. He said that his initial view was that NHSIC had been performing good anonymisation practice. This reassured me at the time, but now, after considering this question of whether the Anonymisation Code (and the ICO) adopts the wrong test on the risks of identification, I am less reassured. Maybe “reasonably likely that an individual can be identified” is an appropriate test for determining when data is no longer anonymised, and becomes personal data, but it does not seem to me that the authorities support it.

Postscript Back in August of this year I alerted the ICO to the fact that a local authority had published open data sets which enabled individuals to be identified (for instance, social care and housing clients). More than four months later the data is still up (despite the ICO saying they would raise the issue with the council): is this perhaps because the council has argued that the risk of identification is not “reasonably likely”?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

3 Comments

Filed under anonymisation, care.data, Data Protection, Directive 95/46/EC, Information Commissioner, NHS

Russell Brand and the domestic purposes exemption in the Data Protection Act

Was a now-deleted tweet by Russell Brand, revealing a journalist’s private number, caught by data protection law?

Data protection law applies to anyone who “processes” (which includes “disclosure…by transmission”) “personal data” (data relating to an identifiable living individual) as a “data controller” (the person who determines the purposes for which and the manner in which the processing occurs). Rather dramatically, in strict terms, this means that most individuals actually and regularly process personal data as data controllers. And nearly everyone would be caught by the obligations under the Data Protection Act 1998 (DPA), were it not for the exemption at section 36. This provides that

Personal data processed by an individual only for the purposes of that individual’s personal, family or household affairs (including recreational purposes) are exempt from the data protection principles and the provisions of Parts II and III

Data protection nerds will spot that exemption from the data protection principles and Parts II and III of the DPA is effectively an exemption from the whole Act. So in general terms individuals who restrict their processing of personal data to domestic purposes are outwith the DPA’s ambit.

The extent of this exemption in terms of publication of information on the internet is subject to some disagreement. On one side is the Information Commissioner’s Office (ICO) who say in their guidance that it applies when an individual uses an online forum purely for domestic purposes, and on the other side are the Court of Justice of the European Union (and me) who said in the 2003 Lindqvist case that

The act of referring, on an internet page, to various persons and identifying them by name or by other means, for instance by giving their telephone number…constitutes ‘the processing of personal data…[and] is not covered by any of the exceptions in Article 3(2) of Directive 95/46 [section 36 of the DPA transposes Article 3(2) into domestic law]

Nonetheless, it is clear that publishing personal data on the internet for reasons not purely domestic constitutes an act of processing to which the DPA applies (let us assume that the act of publishing was a deliberate one, determined by the publisher). So when the comedian Russell Brand today decided to tweet a picture of a journalist’s business card, with an arrow pointing towards the journalist’s mobile phone number (which was not, for what it’s worth, already in the public domain – I checked with a Google search) he was processing that journalist’s personal data (note that data relating to an individual’s business life is still their personal data). Can he avail himself of the DPA domestic purposes exemption? No, says the CJEU, of course, following Lindqvist. But no, also, would surely say the ICO: this act by Brand was not purely domestic. Brand has 8.7 million twitter followers – I have no doubt that some will have taken the tweet as an invitation to call the journalist. It is quite possible that some of those calls will be offensive, or abusive, or even threatening.

Whilst I have been drafting this blog post, Brand has deleted the tweet: that is to his credit. But of course, when you have so many millions of followers, the damage is already done – the picture is saved to hard drives, is mirrored by other sites, is emailed around. And, I am sure, the journalist will have to change his number, and maybe not much harm will have been caused, but the tweet was nasty, and unfair (although I have no doubt Brand was provoked in some way). If it was unfair (and lacking a legal basis for the publication) it was in contravention of the first data protection principle, which requires that personal data be processed fairly and lawfully and with an appropriate legitimating condition. And because – as I submit – Brand cannot plead the domestic purposes exemption, it was in contravention of the DPA. However, whether the journalist will take any private action, and whether the ICO will take any enforcement action, I doubt.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

2 Comments

Filed under Data Protection, Directive 95/46/EC, Information Commissioner, journalism, social media

Naming and shaming the innocent

Around this time last year I wrote two blog posts about two separate police forces’ decision to tweet the names of drivers charged (but not – yet, at least – convicted) of drink driving offences. In the latter example Staffordshire police were actually using a hashtag #drinkdriversnamedontwitter, and I argued that

If someone has merely been charged with an offence, it is contrary to the ancient and fundamental presumption of innocence to shame them for that fact. Indeed, I struggle to understand how it doesn’t constitute contempt of court to do so, or to suggest that someone who has not been convicted of drink-driving is a drink driver. Being charged with an offence does not inevitably lead to conviction. I haven’t been able to find statistics relating to drink-driving acquittals, but in 2010 16% of all defendants dealt with by magistrates’ courts were either acquitted or not proceeded against

The Information Commissioner’s Office investigated whether there had been a breach of the first principle of Schedule One of the Data Protection Act 1998 (DPA), which requires that processing of personal data be “fair and lawful”, but decided to take no action after Staffs police agreed not to use the hashtag again, saying

Our concern was that naming people who have only been charged alongside the label ‘drink-driver’ strongly implies a presumption of guilt for the offence. We have received reassurances from Staffordshire Police the hashtag will no longer be used in this way and are happy with the procedures they have in place. As a result, we will be taking no further action.

But my first blog post had raised questions about whether the mere naming of those charged was in accordance with the same DPA principle. Newspaper articles talked of naming and “shaming”, but where is the shame in being charged with an offence? I wondered why Sussex police didn’t correct those newspapers who attributed the phrase to them.

And this year, Sussex police, as well as neighbouring Surrey, and Avon and Somerset, are doing the same thing: naming drivers charged with drink driving offences on twitter or elsewhere online. The media happily describe this as a “naming and shaming” tactic, and I have not seen the police disabusing them, although Sussex police did at least enter into a dialogue with me and others on twitter, in which they assured us that their actions were in pursuit of open justice, and that they were not intending to shame people. However, this doesn’t appear to tally with the understanding of the Sussex Police and Crime Commissioner who said earlier this year

I am keen to find out if the naming and shaming tactic that Sussex Police has adopted is actually working

But I also continue to question whether the practice is in accordance with police forces’ obligations under the DPA. Information relating to the commission or alleged commission by a person of an offence is that person’s sensitive personal data, and for processing to be fair and lawful a condition in both Schedule Two and, particularly, Schedule Three must be met. And I struggle to see which Schedule Three condition applies – the closest is probably

The processing is necessary…for the administration of justice

But “necessary”, in the DPA, imports a proportionality test of the kind required by human rights jurisprudence. The High Court, in the MPs’ expenses case, cited the European Court of Human Rights in The Sunday Times v United Kingdom (1979) 2 EHRR 245 to the effect that

while the adjective “necessary”, within the meaning of article 10(2) [of the European Convention on Human Rights] is not synonymous with “indispensable”, neither has it the flexibility of such expressions as “admissible”, “ordinary”, “useful”, “reasonable” or “desirable” and that it implies the existence of a “pressing social need.”

and went on to hold, therefore, that “necessary” in the DPA

should reflect the meaning attributed to it by the European Court of Human Rights when justifying an interference with a recognised right, namely that there should be a pressing social need and that the interference was both proportionate as to means and fairly balanced as to ends

So is there a pressing social need to interfere with the rights of people charged with (and not convicted of) an offence, in circumstances where the media and others portray the charge as a source of shame? Is it proportionate and fairly balanced to do so? One consideration might be whether the same police forces name all people charged with an offence. If the intent is to promote open justice, then it is difficult to see why one charging decision should merit online naming, and others not. But is the intent really to promote open justice? Or is it to dissuade others from drink-driving? Supt Richard Corrigan of Avon and Somerset police says

This is another tool in our campaign to stop people driving while under the influence of drink or drugs. If just one person is persuaded not to take to the road as a result, then it is worthwhile as far as we are concerned.

and Sussex police’s Chief Inspector Natalie Moloney says

I hope identifying all those who are to appear in court because of drink or drug driving will act as a deterrent and make Sussex safer for all road users

which firstly fails to use the word “alleged” before “drink or drug driving”, and secondly – like Supt Corrigan’s comments – suggests the purpose of naming is not to promote open justice, but rather to deter drink drivers.

Deterring drink driving is certainly a worthy public aim (and I stress that I have no sympathy whatsoever with those convicted of such offences) but should the sensitive personal data of those who have not been convicted of any offence be used to their detriment in pursuance of that aim?

I worry that unless such naming practices are scrutinised, and challenged when they are unlawful and unfair, the practice will spread, and social “shame” will be encouraged to be visited on the innocent. I hope the Information Commissioner investigates.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

3 Comments

Filed under Data Protection, human rights, Information Commissioner, Open Justice, police, social media

ICO confirm they are considering enforcement action over #samaritansradar app

FOI response from ICO refuses disclosure of correspondence with Samaritans because it could prejudice ongoing investigations

On 12 November I asked the Information Commissioner’s Office to disclose to me, under the Freedom of Information Act (FOIA) information relating to their assessment of the legality of the “Samaritans Radar” app (see blog posts passim).

The ICO have now responded to me, refusing to disclose because of the FOIA exemption for “law enforcement”. As the ICO say

The exemption at section 31(1)(g) of the FOIA refers to circumstances where the disclosure of information “would, or would be likely to, prejudice – … the exercise by any public authority of its functions for any of the purposes specified in subsection (2).”

The purposes referred to in sections 31(2)(a) and (c) are –

“(a) the purpose of ascertaining whether any person has failed to comply with the law” and

“(c) the purpose of ascertaining whether circumstances which would justify regulatory action in pursuance of any enactment exist or may arise…”

Clearly, these purposes apply when the Information Commissioner is considering whether or not an organisation has breached the Data Protection Act

But the exemption is subject to a public interest test, and the ICO acknowledge that there is public interest in the matter, particularly in how Samaritans have responded to their enquiries. Nonetheless, as the investigation is ongoing, and as no decision has apparently been made about whether enforcement action should be taken, the balance in the public interest test falls on the side of non-disclosure.

The question of potential enforcement action is an interesting one. Although the ICO have power to serve monetary penalty notices (to a maximum of £500,000) they can also issue enforcement notices, requiring organisations (who are data controllers, as I maintain Samaritans were for the app) to cease or not begin processing personal data for specific purposes. They can also ask data controllers to sign undertakings to take or not take specific action. This is of interest because Samaritans have indicated that they might want to launch a reworked version of the app.

It is by no means certain that enforcement action will result – the ICO are likely to be reluctant to enforce against a generally admirable charity – but the fact that it is being considered is in itself of interest.

The ICO acknowledge that the public interest in maintaining this particular exemption wanes once the specific investigation has been completed. Consequently I have asked them, outwith FOIA, to commit to disclosing this information proactively once the investigation has finished. They have no obligation to do so, but it would be to the benefit of public transparency, which their office promotes, if they did.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

3 Comments

Filed under Data Protection, enforcement, Freedom of Information, Information Commissioner