
The ICO and records management

“The Tribunal is in an unusual position in respect of this Appeal…”

The Freedom of Information Act 2000 (FOIA) requires a public authority, when someone makes a request for information, to say whether or not it holds it, and if it does, to disclose that information to the requester (subject to the application of any exemption). But what if it doesn’t know whether it holds it or not? What if, after it has said it can’t find the information, and after the Information Commissioner’s Office (ICO) has accepted this and issued a decision notice upholding the authority’s approach, it then discovers it held it all along? This is the situation the First-tier Tribunal (FTT) recently found itself faced with.

The facts of the case are relatively complex, but the issues turned on whether briefing notes, prepared for the Mayor of Doncaster Metropolitan Borough Council (DMBC) in the lead-up to a decision to withdraw funding for DMBC’s United Nations Day, could be found. The ICO had determined, in Decision Notice FS50503811 that

Ultimately the Commissioner had to decide whether a set of briefing notes were held by the Council. His decision, on the balance of probabilities, is that it does not

The requester appealed to the FTT, which, after initially considering the matter on the papers, ordered an oral hearing because of some apparent inconsistencies in DMBC’s evidence (I have to be frank: what exactly these were is not really clear from the FTT’s judgment, at paragraph 27). However, prior to that oral hearing DMBC located the briefing notes in question, so

the focus of the oral hearing was limited simply to establishing whether, at the time of the information request by the Appellant, DMBC knew that it held the information in the light of the searches that it had made in response to the Information Commissioner’s enquiries prior to his issuing the Decision Notice

In determining that it was satisfied that DMBC did not know, at the time of the request, that it held the information, the FTT was swayed by the fact that, “even during the Information Commissioner’s enquiries, DMBC had maintained it had nothing to gain from ‘hiding’ the briefing notes”, but also by the fact that DMBC owned up to poor records management practice in the period leading up to the request

In many senses it is more embarrassing for DMBC now to admit the truth that it had, historically, an unreliable and ineffective Records Management system than to continue to maintain that it could not find the requested information

It doesn’t surprise me that the FTT found as it did. What does surprise me, however, is that records management is not given a greater focus by the ICO. Although FOIA is not, primarily, a records management act, it does contain provisions relating to records management. Powers exist to help improve practice both generally (through guidance) and specifically (through the use of practice recommendations). As I’ve written before

section 46 of FOIA [requires] the Lord Chancellor to issue a code of practice for management of records. Section 9 of that Code deals with the need to keep records in systems that enable records to be stored and retrieved as necessary, and section 10 with the need to know what records are held and where they are.

Under section 47 of FOIA the [ICO] must promote the following of good practice by public authorities and perform his functions so as to promote the observance by authorities of the section 46 Code, as well as the requirements of the Act in general. And under section 48 he may issue a “practice recommendation” if it appears to him that the authority has not conformed with the section 46 Code. In investigating compliance with the Code he has the power (section 51) to issue an “information notice” requiring the authority to furnish him with the information. Failure to comply with an information notice can, ultimately, constitute contempt of court.

I appreciate that the ICO has a lot on its hands, but good records management is so very integral not just to good FOIA compliance, but also to good compliance with the other major statute the ICO oversees – the Data Protection Act 1998. Greater focus on records management could drive better overall compliance with information rights law.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Freedom of Information, Information Commissioner, records management

The monetary penalty notice is in the post

UPDATE: 29.01.15 The BBC now reports that files relating to the role of the police in the deaths of two other members of the public have apparently been “lost in the post”. This starts to look very serious.  END UPDATE

I once heard a rumour that the famous lost HMRC disks of 2007 were not in fact lost after all: the person tasked with posting the disks had, so the rumour went, forgotten to do so, and, when the intended recipient, the National Audit Office, complained, used the time-honoured excuse “they must be lost in the post”, thinking that this was better than owning up, and that no one would be particularly bothered. I have no idea whether this is true (quite possibly not – the subsequent Poynter report was comprehensive and might have been expected to flush something like that out), but what I think is interesting is that, even if it were, it would not have excused HMRC. The Data Protection Act 1998 (DPA) – which largely languished unloved at the time – does not require (by virtue of the seventh principle in Schedule One) a data controller to prevent specific instances of data loss, but, rather, requires it to take appropriate organisational and technical measures to safeguard against such loss – a contravention of the Act lies in the failure to have these measures in place, not (necessarily) in the failure to prevent a specific incident. The fact that HMRC operated procedures which allowed the sending of huge and excessive amounts of sensitive personal data by post, without encryption measures being used, meant that HMRC were manifestly in contravention of the DPA.

Fast forward seven years or so to the present, and, we hear, the Ministry of Justice (MoJ) appear to have lost a highly sensitive computer disk in the post. The Mail on Sunday reports that

The Government has been hit by a new data security scandal after a secret file on the fatal shooting of Mark Duggan by police went missing.

A computer disk containing details of the case which triggered Britain’s worst riots in a generation is thought to have been lost in the post by the Ministry of Justice.

Details are, of course, relatively scant at the moment, but it is worth noting that there is no mention of whether the disk in question was encrypted. If it wasn’t, it would be extremely hard for the MoJ to argue that it was in compliance with its DPA obligations: the view of the Information Commissioner (ICO) is that

portable and mobile devices including magnetic media, used to store and transmit personal information, the loss of which could cause damage or distress to individuals, should be protected using approved encryption software which is designed to guard against the compromise of information.

and

where such losses occur and where encryption software has not been used to protect the data, regulatory action may be pursued.
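It is worth being concrete about what this sort of measure involves. Purely as an illustration – a minimal sketch using the Python cryptography library, with an invented file name, and not a stand-in for whatever “approved encryption software” a department is actually required to use – encrypting a file before it is ever written to a disk, with the key sent by a separate channel, might look like this:

```python
# Minimal sketch only (my illustration, not the ICO's "approved encryption
# software"): symmetric encryption of a file before it is copied to portable
# media, using the Python cryptography library. "case_file.zip" is an
# invented file name.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # the key must travel separately from the disk
fernet = Fernet(key)

with open("case_file.zip", "rb") as infile:
    ciphertext = fernet.encrypt(infile.read())

with open("case_file.zip.enc", "wb") as outfile:
    outfile.write(ciphertext)

# Only the .enc file goes in the post; without the key its contents are
# unreadable, so a loss in transit does not expose the underlying personal data.
```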

The data protection regulatory landscape was very different in 2007, and the ICO did not then have powers to serve monetary penalty notices. A serious contravention of the DPA of a kind likely to cause substantial damage or substantial distress can now result in a “fine” of up to £500,000.

The ICO is, we are told, “examining the case”. He will, no doubt, be wanting to know not only about encryption measures, but, more simply, what procedures were in place which allowed such sensitive data to be sent by post. He will also, again no doubt, bear in mind that in the last eighteen months he has already served on the MoJ two monetary penalty notices totalling £320,000 for not dissimilar failures to have appropriate safeguards in place to protect sensitive personal data.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Information Commissioner, Ministry of Justice, monetary penalty notice

UKIP Dartford and data protection compliance

The Telegraph recently highlighted a rather bizarre incident involving the sending of a letter by the secretary of UKIP’s Dartford branch. The letter purports to be from a Simon Blanchard in his capacity as, or as a representative of, UKIP Dartford. It appears that Mr Blanchard had taken offence at what he said was a verbal insult directed at him by the recipient of the letter, a Mr Kemp, and chose to write expressing his annoyance at this, and also his rather extraordinary interpretation of the effect of European Union laws on the UK. But Mr Blanchard did something else – he sent copies of the letter to Mr Kemp’s neighbours. In doing so it is questionable whether Mr Blanchard, and UKIP Dartford, have complied with their obligations under the Data Protection Act 1998 (DPA).

I am presuming that UKIP Dartford is the local constituency association for UKIP. As such, to the extent that it processes the personal data of identifiable individuals, and determines the purposes for which and the manner in which the processing occurs, it is a data controller. Constituency associations of political parties are distinct from their national parties (they are often at odds with them) and many Labour and Conservative constituency associations recognise this, by registering their processing with the Information Commissioner’s Office (ICO). Indeed, as data controllers not otherwise exempt, they have a legal obligation (section 18 DPA) to do so, and failing to do so, in circumstances where they are processing personal data and cannot avail themselves of an exemption, is a criminal offence (section 21 DPA). I note that UKIP Dartford don’t have an entry on the ICO’s online register – this (and the broader issue of constituency association registration) might be something the ICO should consider investigating.

Furthermore, if it is a data controller, UKIP Dartford will have a statutory obligation (section 4(4) DPA) to comply with the data protection principles. The first of these is that personal data should be processed “fairly and lawfully”. It is not immediately obvious how Blanchard came to have Mr Kemp’s name and address, but, assuming they were gathered lawfully, the sending of the letter itself may well have been fair and lawful. But problems would be more likely to emerge, I would suggest, in the sending by Blanchard of copies of the letter – containing as it did Mr Kemp’s personal data – to neighbours. “Fairness” in the DPA depends a lot on data subjects’ expectations, and it is hard to believe that the recipient of such a letter would have expected it to be circulated among his neighbours.

It is possible that Mr Blanchard came by the name and address details under regulation 105 of the Representation of the People (England and Wales) Regulations 2001 (as amended), whereby local constituency parties may apply for a copy of the full electoral register. It is important to note, however, that, by regulation 105(4), the register can only be used for “electoral purposes or the purposes of electoral registration”. Although one can see that “electoral purposes” might be construed broadly, it is difficult to construct an argument that the sending of the copy-letters, containing the original recipient’s personal data, could possibly have been for electoral purposes. For these reasons, a contravention of the second DPA principle – which restricts further processing of personal data in a manner incompatible with the purposes for which they were obtained – would appear to be likely.

It may be that there is more to this story than is immediately apparent. Perhaps Mr Blanchard and UKIP Dartford acquired Mr Kemp’s data in a different manner. Perhaps they thought they had consent to send it to his neighbours (although given that Mr Kemp’s wife complained – and received the peremptory response “There was no error made on the envelope and hope your neighbours had a good read as well” – this seems unlikely). If more details emerge I will update this post, but in the interim, I can say that the story certainly raises questions about DPA compliance.

The forthcoming general election is likely to see battles fought in many fields (I’ve already drawn attention to the possibility that the legal boundaries of electronic marketing may get pushed to the point of breach on these battlegrounds). One hopes that the ICO will be robust enough to deal with the data protection issues which will emerge, which might include excessive or disproportionate use of people’s personal electoral data.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Information Commissioner

Hospital episode data – confidential data uploaded by mistake

Rather hidden away in the new IIGOP annual report is a worrying and revealing report of a serious data breach involving hospital episode data

In February last year Tim Kelsey, NHS England’s National Director for Patients and Information, and vocal cheerleader for the care.data initiative, assured the public, in an interview on the Radio 4 Today programme, that in the twenty five years that Hospital Episode Statistics (HES) have been shared with other organisations

the management of the hospital episode database…there has never been a single example of that data being compromised, the privacy of patients being compromised…

However, as Sir Nick Partridge’s Review of Data Releases by the NHS Information Centre in June of last year revealed, there had been

lapses in the strict arrangements that were supposed to be in place to ensure that people’s personal data would never be used improperly

As I said at the time

One waits with interest to see whether the [Information Commissioner’s Office (ICO)] will take any enforcement action, but I think it’s important that they consider doing so, because, even though Sir Nick makes nine very sensible recommendations to HSCIC, one could be forgiven – having been given clear assurances previously, by the likes of Tim Kelsey and others – for having reservations as to future governance of our confidential medical data

Now, with the launch of the first annual report of the Independent Information Governance Oversight Panel (IIGOP), chaired by Dame Fiona Caldicott and established at the request of the Secretary of State to “advise, challenge and report on the state of information governance across the health and care system in England”, we see further evidence of HES data “being compromised, the privacy of patients being compromised”. The report informs us of an incident whereby

New inspection procedures introduced by the HSCIC had uncovered a number of organisations which were sending HES data and failing to follow data dictionary standards. This meant they were inadvertently enabling personal confidential data to enter the data base. Following an alert to the Information Commissioners’ Office this was understood as a large scale problem, although having a low level potential impact, as the affected data fields were unknown to either senders or receivers of HES data. The relevant organisations were contacted to gain their cooperation in closing the breach, without alerting any unfriendly observer to the location of the confidential details. This was important to preserve the general ignorance of the detail of the breach and continue to protect individuals’ privacy. Trusts and others were encouraged to provide named contacts who would then start cleaning up their data flows to the HSCIC. In order to manage any untoward reporting in the media, trade titles were informed and briefed about the importance of restricting their reporting to avoid any risk of leading people towards this confidential data.

Now this to me seems pretty serious: the failure to “follow data dictionary standards” by data controller organisations sending HES data sounds very likely to be a contravention of those data controllers’ obligation, under section 4(4) of the Data Protection Act 1998 (DPA), to comply with the seventh data protection principle, which requires that they take

Appropriate technical and organisational measures…against unauthorised or unlawful processing of personal data
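To make the “technical measures” point a little more concrete: the failure described by IIGOP was, essentially, a failure to validate outgoing submissions against the agreed data dictionary. Purely by way of illustration – a minimal, hypothetical sketch with invented field names, not the actual HES submission process – such a check might look like this:

```python
# Hypothetical illustration only: a simplified stand-in for the HES data
# dictionary. The real dictionary defines far more fields; these names are
# invented for the sketch.
PERMITTED_FIELDS = {"episode_id", "provider_code", "admission_date", "diagnosis_code"}


def validate_submission(records):
    """Flag any record containing fields outside the agreed data dictionary,
    so that stray columns (which might carry personal confidential data)
    can be stopped before the submission leaves the organisation."""
    errors = []
    for index, record in enumerate(records):
        unexpected = set(record) - PERMITTED_FIELDS
        if unexpected:
            errors.append((index, sorted(unexpected)))
    return errors


if __name__ == "__main__":
    sample = [{
        "episode_id": "E1",
        "provider_code": "P01",
        "admission_date": "2014-01-01",
        "diagnosis_code": "A00",
        "patient_name": "J Smith",  # a stray identifying field
    }]
    for index, fields in validate_submission(sample):
        print(f"Record {index}: unexpected fields {fields} - submission blocked")
```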

Serious contraventions, of a kind likely to cause substantial damage or substantial distress, can result in the ICO serving a monetary penalty notice, under section 55A of the DPA, to a maximum of £500,000.

So, what does one make of these incidents? It’s hard to avoid the conclusion that they would be held to be “serious”, and if the data in question had been misused, there would have been the potential for substantial damage and substantial distress – public disclosure of hospital record data could have a multitude of pernicious effects – and this much is evidenced by the fact that (successful) attempts had to be made to avoid the errors coming to light, including asking journalists to avoid reporting. But were they contraventions likely to cause these things? IIGOP suggests that they had a “low level potential impact” because the data was hidden within large amounts of non-offensive data, and I think it is probably the case that the incidents would not be held to have been likely to cause substantial damage or substantial distress (in Niebel, the leading case on monetary penalty notices, Wikeley J in the Upper Tribunal accepted that “likely” in s55A DPA took the same meaning attributed to it by Munby J, in R (Lord) v Secretary of State for the Home Department [2003] EWHC 2073 (Admin), namely “‘likely’ meant something more than ‘a real risk’, i.e. a significant risk, ‘even if the risk falls short of being more probable than not'”).

But a monetary penalty notice is not the only action open to the ICO. He has the power to serve enforcement notices, under s40 DPA, to require data controllers to do, or refrain from doing, specified actions, or to take informal action such as requiring the signing of undertakings (to similar effect). Given that we have heard about these incidents from IIGOP, and in an annual report, it seems unlikely that any ICO enforcement action will be forthcoming. Perhaps that’s correct as a matter of law and as a matter of the exercise of discretion, but in my view the ICO has not been vocal enough about the profound issues raised by the amalgamation and sharing of health data, and the concerns raised by incidents of potentially inappropriate or excessive processing. Care.data of course remains on the agenda, and the IIGOP report is both revealing and encouragingly critical of what has taken place so far, but one would not want a situation to emerge where the ICO took a back seat and allowed IIGOP (which lacks regulatory and enforcement powers) to deal with the issue.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under care.data, Data Protection, data sharing, Information Commissioner, NHS

Data protection implications of sale of Tesco Clubcard company

 

News that Tesco is considering selling its loyalty card business Dunnhumby raises questions about what might happen to cardholders’ personal data

In 1995 the then Chairman of Tesco, Lord MacLaurin, reportedly said to the creators of the Tesco Clubcard scheme

What scares me about this is that you know more about my customers after three months than I know after 30 years.

Since then the sophistication and power of data analytics have increased exponentially and Dunnhumby claims it analyses data from 770 million-plus customers, about 16.5 million of whom are – it seems – Tesco Clubcard members. Dunnhumby, as a data processor for Tesco, processes the personal data of those millions of members, so what happens if the business is sold? Does the customer database also get sold? If so, what are the data protection implications?

Sales of customer databases can be effected lawfully and in compliance with the Data Protection Act 1998 (DPA), as the Information Commissioner’s Office explains in helpful guidance

When a database is sold, the seller must make sure that the buyer understands that they can only use the information for the purposes for which it was collected. Any use of this personal information should be within the reasonable expectations of the individuals concerned. So, when a database is sold, its use should stay the same or similar. For example, if the database contains information obtained for insurance, the database should only be sold to another insurance-based business providing similar insurance products. Selling it to a business for a different use is likely to be incompatible  with the original purpose and likely to go beyond the expectations of the individuals.

The operative words there are, I suggest, “expectations of the individuals concerned”. “Reasonable expectations” are strongly linked to the first principle in Schedule One of the DPA, which requires that “personal data shall be processed fairly and lawfully…”. The interpretative provisions in Part II of Schedule One explain that broadly, for processing to be fair, data subjects should be told who is doing the processing, and why. These provisions are the genesis of the “privacy notices” and “privacy policies” which so few of us take the time to read. But the Clubcard privacy policy is where things might become problematic for Tesco in the event that they propose to sell Dunnhumby and cardholders’ data. As Twitter user @NoDPISigma points out, the Customer Charter says

We would like to reassure you that your personal details are safe with us and will never be released to companies outside the Tesco Group for their marketing purposes

and the separate Privacy and Cookies Policy also says

Your personal information is safe with us and will never be released to companies outside the Tesco Group for their marketing purposes

Although at first blush it is difficult to see that as anything other than an unequivocal promise that cardholders’ personal data will never be sold, the rub is in the phrase “for their marketing purposes”. If the sale of Dunnhumby and cardholders’ data is to another company in order that that other company can continue to operate the Clubcard scheme on behalf of Tesco then, as long as that was all that the data continued to be used for, I don’t think it would be a release of personal data to a company for that company’s marketing purposes. If, however, the purchasing company intended to use the data for its own marketing purposes, then the sale might be a breach of the charter promise – and, in that event, it would be strongly arguable that the sale could give rise to a serious contravention of Tesco’s obligation (at section 4(4) of the DPA) to comply with the fairness principle.

And among those 16.5 million Clubcard holders there are likely to be some awkward so-and-sos who might bring legal challenges in those circumstances.

[This post was edited because in its first draft it failed properly to consider the issue of data controller/processor. Thanks to Rich Greenhill for prompting me into a redraft]

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, marketing

Should victims of “revenge porn” be granted anonymity?

I got into an interesting twitter discussion a few days ago with a journalist who had run a story* about a woman convicted under the Malicious Communications Act 1988 (MCA) for uploading a sex tape involving a former friend of hers. The story named the offender, but also the victim, and I asked Luke Traynor, the Mirror journalist, whether he had considered not naming the latter, who was the victim of what I described as a “sexual crime”.  To his credit, Luke replied, saying that he’d “Checked the law, and she’s not a sexual crime victim, but a victim of malicious communication”.

I think Luke is partly correct – a victim of a section 1 MCA offence is not classed as a victim of a specified sexual offence pursuant to section 2 of the Sexual Offences (Amendment) Act 1992, and is not, therefore, automatically granted lifetime anonymity from the press under section 1. This is the case even where – as here – the crime was a targeted attempt to embarrass or damage the victim on the basis of their sexual behaviour. The Mirror even described this case as one of “Revenge Porn” and, indeed, moves are currently being made to create a specific offence of disclosing private sexual photographs and films with intent to cause distress (clause 33 of the Criminal Justice and Courts Bill refers). If that Bill is passed, I would argue that serious thought should be given to awarding anonymity to victims of this offence.

But merely because statutory anonymity was not available to the victim of the offence reported by the Mirror, it does not mean that it was right to name her, and (as you might expect from me) I think that data protection law is in play. Information relating to an identifiable individual’s sexual life is her sensitive personal data, afforded particular protection under the Data Protection Directive 95/46 and the UK Data Protection Act 1998 (DPA), which gives it domestic effect. Publication of sensitive personal data without one of the conditions in Schedule 3 of the DPA being met (and I cannot see which would be met in this instance) is as a general rule unlawful. There is though, at section 32 of the DPA, as I have written about recently, an effective exemption from most of the Act for personal data processed only for the purposes of journalism. I suspect The Mirror, or any other media outlet naming the victim in this case, would claim this exemption, but it is important to note that, as broad as the exemption is, it can only be claimed if

the data controller reasonably believes that, having regard in particular to the special importance of the public interest in freedom of expression, publication would be in the public interest, and…the data controller reasonably believes that, in all the circumstances, compliance with that provision is incompatible with [journalism]

I invited Luke to explain whether he thought that publication of the victim’s name was in the public interest, but his reply

It was said in a public court, in accordance with the law, which takes into account ethics and public interest

did not really deal with the section 32 point – just because something was said in public court it does not mean that it is in the public interest to publish it. And unless Luke (or, rather, the Mirror, as data controller) reasonably believed that it was so, the exemption falls away.

Of course, in the absence of any complaint from the individual, all of this might seem otiose. But I think it raises further important issues about the extent of the section 32 exemption, as well as whether there should be some clearer right to privacy for victims of certain types of communications offences.

And, as Tim Turner pointed out, this sort of story shows why some might want to exercise a “right to be forgotten” – if unnecessary and unfair information is published about them on the internet, can some people be blamed for wanting it removed, or made less prominent?

*I have avoided linking directly to the article in question for reasons which should be obvious, given the content of this post. However, it is not difficult to find. That, of course, is the problem. 

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

 


Filed under communications offence, Data Protection, Privacy

Are we all journalists?

The ICO has said that Global Witness can claim the data protection exemption for journalism in respect of their investigations into BSGR. This fascinating case continues to raise difficult and important questions.

Data protection law rightly gives strong protection to journalism; this is something that the 2012 Leveson inquiry dealt with in considerable detail, but, as the inquiry’s terms of reference were expressly concerned with “the press”, with “commercial journalism”, it didn’t really grapple with the rather profound question of “what is journalism?” But the question does need to be asked, because in the balancing exercise between privacy and freedom of expression too much weight afforded to one side can result in detriment to the other. If personal privacy is given too much weight, freedom of expression is weakened, but equally if “journalism” is construed too widely, and the protection afforded to journalism is consequently too wide, then privacy rights of individuals will suffer.

In 2008 the Court of Justice of the European Union (CJEU) was asked, in the Satamedia case, to consider the extent of the exemption from a large part of data protection law for processing of personal data for “journalistic” purposes. Article 9 of the European Data Protection Directive (the Directive) provides that

Member States shall provide for exemptions or derogations…for the processing of personal data carried out solely for journalistic purposes or the purpose of artistic or literary expression only if they are necessary to reconcile the right to privacy with the rules governing freedom of expression.

and recital 37 says

Whereas the processing of personal data for purposes of journalism or for purposes of literary or artistic expression, in particular in the audiovisual field, should qualify for exemption from the requirements of certain provisions of this Directive in so far as this is necessary to reconcile the fundamental rights of individuals with freedom of information and notably the right to receive and impart information

In Satamedia one of the questions the CJEU was asked to consider was whether the publishing of public-domain taxpayer data by two Finnish companies could be “regarded as the processing of personal data carried out solely for journalistic purposes within the meaning of Article 9 of the directive”. To this, the Court replied “yes”

Article 9 of Directive 95/46 is to be interpreted as meaning that the activities [in question], must be considered as activities involving the processing of personal data carried out ‘solely for journalistic purposes’, within the meaning of that provision, if the sole object of those activities is the disclosure to the public of information, opinions or ideas [emphasis added]

One can see that, to the extent that Article 9 is transposed effectively in domestic legislation, it affords significant and potentially wide protection for “journalism”. In the UK it is transposed as section 32 of the Data Protection Act 1998 (DPA). This provides that

Personal data which are processed only for the special purposes are exempt from any provision to which this subsection relates if—

(a) the processing is undertaken with a view to the publication by any person of any journalistic, literary or artistic material,

(b) the data controller reasonably believes that, having regard in particular to the special importance of the public interest in freedom of expression, publication would be in the public interest, and

(c) the data controller reasonably believes that, in all the circumstances, compliance with that provision is incompatible with the special purposes.

where “the special purposes” are one or more of “the purposes of journalism”, “artistic purposes”, and “literary purposes”. Section 32 DPA exempts data processed for the special purposes from all of the data protection principles (save the 7th, data security, principle) and, importantly, from the provisions of sections 7 and 10. Section 7 is the “subject access” provision, and normally requires a data controller, upon receipt of a written request by an individual, to inform them if their personal data is being processed, and, if it is, to give the particulars and to “communicate” the data to the individual. Section 10 broadly allows a data subject to object to processing which is likely to cause substantial damage or substantial distress, and to require the data controller to cease (or not begin) processing (and the data controller must either comply or state reasons why it will not). Personal data processed for the special purposes are, therefore, exempt from subject access and from the right to prevent processing likely to cause damage or distress. It is not difficult to see why – if the subject of, say, investigative journalism, could find out what a journalist was doing, and prevent her from doing it, freedom of expression would be inordinately harmed.

The issue of the extent of the journalistic data protection exemption came into sharp focus towards the end of last year, when Benny Steinmetz and three other claimants employed by or associated with mining and minerals group Benny Steinmetz Group Resources (BSGR) brought proceedings in the High Court under the DPA seeking orders that would require campaigning group Global Witness to comply with subject access requests by the claimants, and to cease processing their data. The BSGR claimants had previously asked the Information Commissioner’s Office (ICO), pursuant to the latter’s duties under section 42 DPA, to assess the likelihood of the lawfulness of Global Witness’s processing, and the ICO had determined that it was unlikely that Global Witness were complying with their obligations under the DPA.

However, under section 32(4) DPA, if, in any relevant proceedings, the data controller claims (or it appears to the court) that the processing in question was for the special purposes and with a view to publication, the court must stay the proceedings in order for the ICO to consider whether to make a specific “special purposes” determination. Such a determination would be (under section 45 DPA) that the processing was not for the special purposes nor was it with a view to publication, and it would result in a “special information notice”. Such a stay was applied to the BSGR proceedings and, on 15 December, after some considerable wait, the ICO conveyed to the parties that it was “satisfied that Global Witness is only processing the personal data requested … for the purposes of journalism”. Accordingly, no special information notice was served, and the proceedings remain stayed. Although media reports (e.g. Guardian and Financial Times) talk of appeals and tribunals, no direct appeal right exists for a data subject in these circumstances, so, if, as seems likely, BSGR want to revive the proceedings, they will presumably have to apply to have the stay lifted, issue judicial review proceedings against the ICO, or both.

The case remains fascinating. It is easy to applaud a decision in which a plucky environmental campaign group claims journalistic data protection exemption regarding its investigations of a huge mining group. But would people be so quick to support, say, a fascist group which decided to investigate and publish private information about anti-fascist campaigners? Could that group also gain data protection exemption claiming that the sole object of their processing was the disclosure to the public of information, opinions or ideas? Global Witness say that

The ruling confirms that the Section 32 exemption for journalism in the Data Protection Act applies to anyone engaged in public-interest reporting, not just the conventional media

but it is not immediately clear from where they import the “public-interest” aspect – this does not appear, at least not in explicit terms, in either the Directive or the DPA. It is possible that it can be inferred, when one considers that processing for special purposes which is not in the public interest might constitute an interference with respect for data subjects’ fundamental rights and freedoms (per recital 2 of the Directive). And, of course, with talk about public interest journalism, we walk straight back into the arguments provoked by the Leveson inquiry.

Furthermore, one notes that the Directive talks about exemption for processing of personal data carried out solely for journalistic purposes, and the DPA says “personal data which are processed only for the special purposes are exempt…”. This was why I emphasised the words in the Satamedia judgment quoted above, which talks similarly of the exemption applying if the “sole object of those activities is the disclosure to the public of information, opinions or ideas”. One might ask whether a campaigning group’s sole or only purpose for processing personal data is for journalism. Might they not, in processing the data, be trying to achieve further ends? Might, in fact, one say that the people who engage solely in the disclosure to the public of information, opinions or ideas are in fact those we more traditionally think of in these terms…the press, the commercial journalists?

P.S. Global Witness have uploaded a copy of the ICO’s decision letter. This clarifies that the latter was satisfied that the former was processing for the special purposes because it was part of “campaigning journalism”, even though the proposed future publication of the information “forms part of a wider campaign to promote a particular cause”. This chimes with the ICO’s data protection guidance for the media, but it will be interesting to see whether it is challenged on the basis that it doesn’t support a view that the processing is “only” or “solely” for the special purposes.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Directive 95/46/EC, Information Commissioner, journalism, Leveson

Hidden data in FOI disclosures

The Hackney Gazette reports that details of 15,000 residents have been published on the internet after Hackney Council apparently inadvertently disclosed the data when responding to a Freedom of Information (FOI) request made using the WhatDoTheyKnow site.

This is not the first time that such apparently catastrophic inadvertent disclosures have happened through WhatDoTheyKnow, and, indeed, in 2012 MySociety, who run the site, issued a statement following a similar incident with Islington Council. As that made clear

responses sent via WhatDoTheyKnow are automatically published online without any human intervention – this is the key feature that makes this site both valuable and popular

It is clearly the responsibility of the authorities in question to ensure that no hidden or exempt information is included in FOI disclosures via WhatDoTheyKnow, or indeed, in FOI disclosures in general. A failure to have appropriate organisational and technical safeguards in place can lead to enforcement action by the Information Commissioner’s Office for contraventions of the Data Protection Act 1998 (DPA): Islington ended up with a monetary penalty notice of £70,000 for their incident, which involved 2,000 people. Although the number of data subjects involved is not the only factor the ICO will take into account when deciding what action to take, it is certainly a relevant one: 15,000 affected individuals is a hell of a lot.

What concerns me is that this sort of thing keeps happening. We don’t know the details of this incident yet, but with such large numbers of data subjects involved it seems likely that it will have involved some sort of dataset, and I would not be at all surprised if it involved purportedly masked or hidden data, such as in a pivot table [EDIT – I’m given to understand that this incident involved cached data in MS Excel]. Around the time of the Islington incident the ICO’s Head of Policy Steve Wood published a blog post drawing attention to the risks. A warning also appears in a small piece on a generic page about request handling, which says

take care when using pivot tables to anonymise data in a spreadsheet. The spreadsheet will usually still contain the detailed source data, even if this is hidden and not immediately visible at first glance. Consider converting the spreadsheet to a plain text format (such as CSV) if necessary.

This is fine, but does it go far enough? Last year I wrote on the Guardian web site, and called for greater efforts to be made to highlight the issue. I think that what I wrote then still holds

The ICO must work with the government to offer advice direct to chief executives and those responsible for risk at councils and NHS bodies (and perhaps other bodies, but these two sectors are probably the highest risk ones). So far these disclosure errors do not appear to have led to harm to those individuals whose private information was compromised, but, without further action, I fear it is only a matter of time.

Time will tell whether this Hackney incident results in a finding of DPA contravention, and ICO enforcement, but in the interim I wish the word would get spread around about how to avoid disclosing hidden data in spreadsheets.
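For what it is worth, one simple way of doing that – offered here only as a minimal sketch, assuming the openpyxl library and an invented file name, rather than as definitive guidance – is to re-export only the visible cell values of a workbook to a plain-text format, leaving pivot caches, hidden sheets and formulas behind:

```python
# Minimal sketch (my own, not ICO guidance): export only the visible,
# calculated cell values of a workbook to CSV, so that hidden sheets,
# pivot caches and formulas are not carried over into an FOI disclosure.
# "response.xlsx" is an invented file name.
import csv
from openpyxl import load_workbook

workbook = load_workbook("response.xlsx", data_only=True)  # cell values, not formulas

for sheet in workbook.worksheets:
    if sheet.sheet_state != "visible":  # skip hidden and "very hidden" sheets
        continue
    with open(f"{sheet.title}.csv", "w", newline="", encoding="utf-8") as outfile:
        writer = csv.writer(outfile)
        for row in sheet.iter_rows(values_only=True):
            writer.writerow(row)

# Note: values sitting in hidden rows or columns of a visible sheet will still
# be exported, so the CSV files should still be checked manually before release.
```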

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Freedom of Information, Information Commissioner, monetary penalty notice

FOI disclosure of personal data: balancing of interests

In June this year I blogged about the case of AB v A Chief Constable (Rev 1) [2014] EWHC 1965 (QB). In that case, Mr Justice Cranston had held that, when determining whether personal data is being or has been processed “fairly” (pursuant to the first principle of Schedule One of the Data Protection Act 1998 (DPA))

assessing fairness involves a balancing of the interests of the data subject in non-disclosure against the public interest in disclosure [¶75]

I was surprised by this reading in of an interests balance to the first principle, and said so in my post. Better people than I disagreed, and I certainly am even less sure now than I was of the correctness of my view.

In any case, the binding authority of the High Court rather trumps my meanderings, and it is cited in a recent decision of the First-tier Tribunal (Information Rights) in support of a ruling that the London Borough of Merton Council must disclose, under the Freedom of Information Act 2000 (FOIA), an email sent to a cabinet member of that council by Stephen Hammond MP. The Tribunal, in overturning the decision of the Information Commissioner, considered the private interests of Mr Hammond, including the fact that he had objected to the disclosure, but felt that these did not carry much weight:

we do not consider anything in the requested information to be particularly private or personal and that [sic] this substantially weakens the weight of interest in nondisclosure…We accept that Mr Hammond has objected to the disclosure, which in itself carries some weight as representing his interests. However, asides from an expectation of a general principle of non-disclosure of MP correspondence, we have not been given any reason for this. We have been given very little from the Commissioner to substantiate why Members of Parliament would have an expectation that all their correspondence in relation to official work remain confidential

and balanced against these were the public interests in disclosure, including

no authority had been given for the statement [in the ICO’s decision notice] that MPs expect that all correspondence to remain confidential…[;]…withholding of the requested information was not compatible with the principles of accountability and openness, whereby MPs should subject themselves to public scrutiny, and only withhold information when the wider public interest requires it…[;]…the particular circumstances of this case [concerning parking arrangements in the applicant’s road] made any expectation of confidentiality unreasonable and strongly indicated that disclosure would be fair

The arguments weighed, said the Tribunal, strongly in favour of disclosure.

A further point fell to be considered, however: for processing of personal data to be fair and lawful (per the first data protection principle), a condition in Schedule Two DPA must also be met, beyond any general considerations. The relevant one, condition 6(1), requires that

The processing is necessary for the purposes of legitimate interests pursued by the data controller or by the third party or parties to whom the data are disclosed, except where the processing is unwarranted in any particular case by reason of prejudice to the rights and freedoms or legitimate interests of the data subject

It has to be noted that “necessary” here in the DPA imports a human rights proportionality test and it “is not synonymous with ‘indispensable’…[but] it implies the existence of a ‘pressing social need'” (The Sunday Times v United Kingdom (1979) 2 EHRR 245). The Tribunal, in what effectively was a reiteration of the arguments about general “fairness”, accepted that the condition would be met in this case, citing the applicant’s arguments, which included the fact that

disclosure is necessary to meet the public interest in making public what Mr Hammond has said to the Council on the subject of parking in Wimbledon Village, and that as an elected MP, accountable to his constituents, disclosure of such correspondence cannot constitute unwarranted prejudice to his interests.

With the exception of certain names within the requested information, the Tribunal ordered disclosure.  Assessing “fairness” now, following Mr Justice Cranston, and not following me, clearly does involve balancing the interests of the data subject against the public interest in disclosure.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Freedom of Information, Information Commissioner, Information Tribunal

The wrong test for anonymisation?

UPDATE: 23.01.15 The ICO has responded [.doc file] to my request for a review of their decision. I drew their attention to the arguments on this page but they don’t even mention them, let alone provide a counter-analysis, in dismissing my complaints (“Having reviewed the matter, I agree with the explanations provided”). I am invited by the ICO to consider taking my own legal action. I understand that the ICO and I might have differing views on a DPA matter, but what I find difficult to accept is the refusal even to enter into a discussion with me about the detailed arguments I’ve made. END UPDATE

In February this year I asked the Information Commissioner’s Office (ICO) to investigate reports that Hospital Episode Statistics (HES) data had apparently been sold to an actuarial society by the NHS Information Centre (NHSIC), the predecessor to the Health and Social Care Information Centre (HSCIC). Specifically I requested, as a data subject can under s42 of the Data Protection Act 1998 (DPA), that the ICO assess whether it was likely or not that the processing of my personal data by NHSIC and others had been in compliance with the DPA.

Nine months later, I was still awaiting the outcome. But a clue to how the assessment would turn out was contained in the text of Sir Nick Partridge’s six month review of various data releases by NHSIC (his original report in June seemed to me to point to multiple potential DPA contraventions). In the review document he says

Six investigations have been separately instigated by the HSCIC or Information Commissioner’s Office (ICO) and shared with both parties as these focussed on whether individuals were at risk of being identified. In the cases it has investigated, the ICO has upheld the HSCIC approach and informed us that it has “seen no evidence to suggest that re-identification has occurred or is reasonably likely to occur.”

And sure enough, after chasing the ICO for the outcome of my nine-month wait, I received this (in oddly formatted text, which rather whiffed of a lot of cutting-and-pasting)

Following the recent issue regarding HSCIC, PA Consulting, and Google we investigated the issue of whether HES data could be considered personal data. This detailed work involved contacting HSCIC, PA Consulting, and Google and included the analysis of the processes for the extraction and disclosure of HES data both generally and in that case in particular. We concluded that we did not consider that the HES dataset constitutes personal data. Furthermore we also investigated whether this information had been linked to other data to produce “personal data” which was subject to the provisions of the Act. We have no evidence that there has been any re-identification either on the part of PA Consulting or Google. We also noted that HSCIC have stated that the HES dataset does not include individual level patient data even at a pseudonymised level. Our view is that the data extracted and provided to PA Consulting did not identify any individuals and there was no reasonable likelihood that re-identification would be possible.
I have added the emphasis to the words “reasonable likelihood” above. They appear in similar terms in the Partridge Review, and they struck me as rather odd. An awful lot of analysis has taken place, and continues to take place, on the subject of when personal data can be “rendered fully anonymous in the sense that it is information from which the data subject is no longer identifiable” (Lord Hope’s dicta in Common Services Agency v Scottish Information Commissioner [2008] UKHL 47). Some of that analysis has been academic, some takes the form of “soft law” guidance, for instance Opinion 05/2014 of the Article 29 Working Party, and the ICO Anonymisation Code of Practice. The former draws on the Data Protection Directive 95/46/EC, and notes that

Recital 26 signifies that to anonymise any data, the data must be stripped of sufficient elements such that the data subject can no longer be identified. More precisely, that data must be processed in such a way that it can no longer be used to identify a natural person by using “all the means likely reasonably to be used”

Anonymisation has also been subject to judicial analysis, notably in the Common Services Agency case, but, even more importantly, in the judgment of Mr Justice Cranston in Department of Health v Information Commissioner ([2011] EWHC 1430). The latter case, involving the question of disclosure of late-term abortion statistics, is by no means an easy judgment to parse (ironically so, given that it makes roughly the same observation about the Common Services Agency case). The judge held that the First-tier Tribunal had been wrong to say that the statistics in question were personal data, but that it had on the evidence been entitled to say that “the possibility of identification by a third party from these statistics was extremely remote”. The fact that the possibility of identification by a third party was extremely remote meant that “the requested statistics were fully anonymised” (¶55). I draw from this that for personal data to be anonymised in statistical format the possibility of identification of individuals by a third party must be extremely remote. The ICO’s Anonymisation Code, however, says of the case:

The High Court in the Department of Health case above stated that the risk of identification must be greater than remote and reasonably likely for information to be classed as personal data under the DPA [emphasis added]

But this seems to me to be an impermissible description of the case – the High Court did not state what the ICO says it stated – the phrases “greater than remote” and “reasonably likely” do not appear in the judgment. And that phrase “reasonably likely” is one that, as I say, makes its way into the Partridge Review, and the ICO’s assessment of the lawfulness of HES data “sale”.

I begin to wonder if the ICO has taken the phrase from recital 26 of the Directive, which talks about the need to consider “all the means likely reasonably to be used” to identify an individual, and transformed it into a position from which, if identification is not reasonably likely, it will accept that data are anonymised. This cannot be right: there is a world of difference between a test which considers whether the possibility of identification is “extremely remote” and one which considers whether it is “reasonably likely”.

I do not have a specific right to a review of the section 42 assessment decision that the processing of my personal data was likely in compliance with NHSIC’s obligations under the DPA, but I have asked for one. I am aware of course that others complained (après moi, le déluge), notably, in March, FIPR, MedConfidential and Big Brother Watch. I suspect they will also be pursuing this.

In October this year I attended an event at which the ICO’s Iain Bourne spoke. Iain was a key figure in the drawing up of the ICO’s Anonymisation Code, and I took the rather cheeky opportunity to ask about the HES investigations. He said that his initial view was that NHSIC had been performing good anonymisation practice. This reassured me at the time, but now, after considering this question of whether the Anonymisation Code (and the ICO) adopts the wrong test on the risks of identification, I am less reassured. Maybe “reasonably likely that an individual can be identified” is an appropriate test for determining when data is no longer anonymised, and becomes personal data, but it does not seem to me that the authorities support it.

Postscript: Back in August of this year I alerted the ICO to the fact that a local authority had published open data sets which enabled individuals to be identified (for instance, social care and housing clients). More than four months later the data is still up (despite the ICO saying they would raise the issue with the council): is this perhaps because the council has argued that identification is not “reasonably likely”?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under anonymisation, care.data, Data Protection, Directive 95/46/EC, Information Commissioner, NHS