FOI vs Transparency debate

Yesterday, after attending a fascinating and in-depth briefing from Network Rail on their journey towards being subject to the Freedom of Information Act 2000, I was privileged to appear on a panel debating “In a world of Freedom of Information, does voluntary transparency still matter?” Although rather daunted by the illustrious fellow panel members – the Campaign for Freedom of Information’s Maurice Frankel, the Guardian’s Jane Dudman and Sir Alex Allan KCB [1] – I delivered a short address on the subject (as did those others). Perhaps unsurprisingly, the panel were unanimous in feeling that voluntary transparency does still matter in a world of FOI, but, just as importantly, that voluntary transparency does not and should not make FOI redundant. This is broadly what I said, with added hyperlinks:

A very wise man called Tim Turner once wrote: “The point of FOI is that you get to ask about what YOU want to know, not what The Nice Man Wants To Tell You”. And this, I think, is the key point which distinguishes the access rights afforded to individuals under Freedom of Information and related legislation from the transparency agenda which has led to the UK government again this week being pronounced the most open and transparent in the world, by Tim Berners-Lee’s World Wide Web Foundation.

At the same time as that first place was announced, cynics amongst us might have pointed to the fact that in the 2013 Global Right to Information Ratings compiled by Access Info and the Canadian Centre for Law and Democracy, the UK was in 29th place, behind countries like Kyrgyzstan and Sierra Leone.

There’s clearly a gap in perception there, and one that is not simply explained away by questions about methodology.

In 2012 Francis Maude said “I’d like to make Freedom of Information redundant, by pushing out so much data that people won’t have to ask for it”. While this is in some ways a laudable aim, it is simply never going to wash: there will always be some information which Mr Maude doesn’t want disclosed, but which I, or you, or someone else, does (to illustrate this one only has to look at how regularly the Cabinet Office claims FOI exemptions and refuses to disclose).

By the same token, Network Rail, who have disclosed an impressive amount of valuable data over recent years, would not, I am sure, pretend that, when they come under the Act’s coverage in a few months, they expect only ever to disclose information in response to FOI requests. There will clearly be information which they will not be able to disclose (and for perfectly valid reasons).

The transparency agenda cannot simply sweep away concerns about disclosure of commercially sensitive information, or of personal data, or of information which might prejudice national security. But there will always be people who want this information, and there will always be the need for a legal framework to arbitrate disputes about disclosure, and particularly about whether the public interest favours disclosure or not.

And, as a brief aside, I think there’s an inherent risk in an aggressive, or, rather, enthusiastic, approach to publication under a transparency agenda – sometimes information which shouldn’t be published does get published. I have seen some nasty erroneous, and even deliberate, disclosures of personal data within Open Datasets. The framework of FOI should, in principle at least, provide a means of error-checking before disclosure.

When FOI was in its infancy we were assured that effective and robust publication schemes would ultimately reduce the amount of time spent dealing with FOI requests – “Point them to the publication scheme”, we were told… While I am sure that, on some level, this did transpire, no one I have spoken to really feels that proactive publication via a publication scheme has led to a noticeable decrease in FOI requests. And I think the same applies with the Transparency Agenda – as much as Mr Maude would like to think it will make FOI redundant, it has, and will continue to have, only a minor effect on the (necessary) burden that FOI places on public authorities.

I do not think we are going to see either the Transparency Agenda dispense with FOI, or FOI dispense with the Transparency Agenda: they are, if not two sides of the same coin, at least two different coins in the same purse. And we should always bear in mind that public scrutiny of public authorities is not just about what the Nice Man Wants To Tell You, but is equally about what the Nasty Man Doesn’t Want To Tell You.

[1] I’m delighted to see from his Wikipedia entry that Sir Alex is a huge Grateful Dead fan, and that further research suggests that this isn’t just Wikipedian inaccuracy.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Freedom of Information, transparency

UKIP Dartford and data protection compliance

The Telegraph recently highlighted a rather bizarre incident involving the sending of a letter by the secretary of UKIP’s Dartford branch. The letter purports to be from a Simon Blanchard in his capacity as, or as a representative of, UKIP Dartford. It appears that Mr Blanchard had taken offence at what he said was a verbal insult directed at him by the recipient of the letter, a Mr Kemp, and chose to write expressing his annoyance at this, as well as his rather extraordinary interpretation of the effect of European Union laws on the UK. But Mr Blanchard did something else – he sent copies of the letter to Mr Kemp’s neighbours. In doing so, it is questionable whether Mr Blanchard, and UKIP Dartford, have complied with their obligations under the Data Protection Act 1998 (DPA).

I am presuming that UKIP Dartford is the local constituency association for UKIP. As such, to the extent that it processes the personal data of identifiable individuals, and determines the purposes for which and the manner in which that processing occurs, it is a data controller. Constituency associations are distinct from their national parties (they are often at odds with them), and many Labour and Conservative constituency associations recognise this by registering their processing with the Information Commissioner’s Office (ICO). Indeed, as data controllers not otherwise exempt, they have a legal obligation to do so (section 18 DPA), and failing to do so, in circumstances where they are processing personal data and cannot avail themselves of an exemption, is a criminal offence (section 21 DPA). I note that UKIP Dartford don’t have an entry on the ICO’s online register – this (and the broader issue of constituency association registration) might be something the ICO should consider investigating.

Furthermore, if it is a data controller, UKIP Dartford will have a statutory obligation (section 4(4) DPA) to comply with the data protection principles. The first of these is that personal data should be processed “fairly and lawfully”. It is not immediately obvious how Mr Blanchard came to have Mr Kemp’s name and address, but, assuming they were gathered lawfully, the sending of the letter itself may well have been fair and lawful. Where problems are more likely to emerge, I would suggest, is in the sending by Mr Blanchard of copies of the letter – containing as it did Mr Kemp’s personal data – to neighbours. “Fairness” in the DPA depends a lot on data subjects’ expectations, and it is hard to believe that the recipient of such a letter would have expected it to be circulated among his neighbours.

It is possible that Mr Blanchard came by the name and address details under regulation 105 of the Representation of the People (England and Wales) Regulations 2001 (as amended), whereby local constituency parties may apply for a copy of the full electoral register. It is important to note, however, that, by regulation 105(4), the register can only be used for “electoral purposes or the purposes of electoral registration”. Although one can see that “electoral purposes” might be construed broadly, it is difficult to construct an argument that the sending of the copy-letters, containing the original recipient’s personal data, could possibly have been for electoral purposes. For these reasons, a contravention of the second DPA principle, which restricts further processing of personal data in a manner incompatible with the original purposes, would appear likely.

It may be that there is more to this story than is immediately apparent. Perhaps Mr Blanchard and UKIP Dartford acquired Mr Kemp’s data in a different manner. Perhaps they thought they had consent to send it to his neighbours (although, given that Mr Kemp’s wife complained – and received the peremptory response “There was no error made on the envelope and hope your neighbours had a good read as well” – this seems unlikely). If more details emerge I will update this post, but in the interim, I can say that the story certainly raises questions about DPA compliance.

The forthcoming general election is likely to see battles fought in many fields (I’ve already drawn attention to the possibility that the legal boundaries of electronic marketing may get pushed to the point of breach on these battlegrounds). One hopes that the ICO will be robust enough to deal with the data protection issues which will emerge, which might include excessive or disproportionate use of people’s personal electoral data.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Information Commissioner

Sensitive personal data exposed in Open Datasets

Since August last year I’ve been inviting the ICO to consider the issue of deliberate wholesale exposure of sensitive personal data in local authority open data. It’s still online.

UPDATE: 16.02.15 Well, I was wrong. The ICO says this is not personal data:

The data sets in question are clearly personal data in the hands of the [redacted] because it will retain the full original dataset containing the identifying details of individuals. However, the question is whether the information is still personal data post-publication. In our view it is not.

Although the data relates to particular living individuals, it does not in itself identify any of them and so in of themselves the data sets do not contain personal data.

The issue then is whether it is likely that a third party will come into the possession of other information that will allow an individual to be identified. To do so, such a person would already need prior knowledge of any given individual in order to identify them. However, we believe the publication of the information to have a low risk of individuals being re-identified because only someone with considerable prior knowledge would be able to perform this task.

We note that you have not identified anybody and the [redacted] has stated that it is unaware of any cases of re-identification as a result of publication.

I honestly struggle to fathom this. I accept the ICO’s further point that

the Information Commissioner only has to give his view about the likelihood that there has been a breach [of the DPA]. This view is made on the balance of probabilities and the Commissioner is under no obligation to prove this beyond doubt

but their assessment doesn’t seem to tally with my understanding of the techniques described in the ICO’s own Anonymisation Code. I’m no expert on that subject, but I wouldn’t dream of publishing the datasets in question, in the form they have been published. If anyone has any observations I’d be really interested to hear them.

And I’m still not linking to the datasets – I think they can identify individuals, and their sensitive personal data.

END UPDATE.

Imagine, if you will, a public authority which decides to publish as Open Data a spreadsheet of 6000 individual records of adults receiving social services support. Each row tells us an individual service user’s client group (e.g. “dementia” or “learning disability”), age range (18-64, 65-84, 85 and over), the council ward they live in, the service they’re receiving (e.g. “day care” or “direct payment” or “home care”), their gender and their ethnicity. If, by burrowing into that data, one could establish that one, and only one, Bangladeshi man in the Blankety ward aged 18-64 with a learning disability is in receipt of direct payments, most data protection professionals (and many other people besides) would recognise that this is an identifiable individual, if not to you or me, then almost certainly to some of his neighbours or family or acquaintances.

Similarly, imagine the same public authority decides to publish as Open Data a spreadsheet of nearly 7000 individual records of council housing tenants who have received Notices of Seeking Possession or Notices to Quit. Each row tells us the date the individual tenant was served the notice, the council ward, the duration of the tenancy, whether it was joint or sole, the age of the tenant(s) in years, their gender, their ethnicity (if recorded), their disability status (if recorded) and their vulnerability status (if recorded). If, by burrowing into that data, one could establish that one, and only one, 40-year-old Asian Indian male sole tenant with a tenancy 2.94 years old was served a Notice of Seeking Possession in June 2006, most data protection professionals (and many other people besides) would recognise that this is an identifiable individual, if not to you or me, then almost certainly to some of his neighbours or family or acquaintances.
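To make the “burrowing” concrete, here is a minimal sketch (in Python with pandas, using hypothetical column names, since I am not linking to or reproducing the actual datasets) of the sort of check anyone with the published spreadsheet could run: count how many records share each combination of quasi-identifiers, and list the combinations that occur exactly once. Every row in that list is, on the face of it, a single identifiable person.

```python
# A minimal re-identification risk check. This is a sketch only: the column
# names are hypothetical and are not taken from the published datasets.
import pandas as pd

# Quasi-identifiers: individually innocuous, but potentially identifying in combination.
QUASI_IDENTIFIERS = ["ward", "client_group", "age_range", "service", "gender", "ethnicity"]

def singleton_combinations(csv_path: str) -> pd.DataFrame:
    """Return the quasi-identifier combinations that match exactly one record."""
    df = pd.read_csv(csv_path)
    counts = df.groupby(QUASI_IDENTIFIERS, dropna=False).size().reset_index(name="count")
    return counts[counts["count"] == 1]

if __name__ == "__main__":
    singletons = singleton_combinations("adult_social_care_open_data.csv")
    print(f"{len(singletons)} combinations describe exactly one service user")
    print(singletons.head())
```

If a query like that returns anything at all, then, whatever the headline columns look like, the dataset has arguably not been effectively anonymised in the sense described in the ICO’s own Anonymisation Code.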

If these individuals are identifiable (and, trust me, these are only two examples from hundreds, in many, many spreadsheets), then this is their sensitive personal data, being processed by the public authority in question (which I am not identifying, for obvious reasons). For the processing to be fair and lawful it needs a legal basis, which means meeting at least one of the conditions in Schedule 2 and one in Schedule 3 of the Data Protection Act 1998 (DPA).

And try as I might, I cannot find one which legitimises this processing, not even in the 2000 Order, which significantly added to the Schedule 3 conditions. And this was why, when the datasets in question were drawn to my attention, I flagged my concerns with the public authority:

Hi – I notice you’ve uploaded huge amounts of data…some of it at a very high level of granularity – ie with multiple and specific identifiers. According to the definitions in recital 26 and Article 2 of Directive 95/46/EC, s1(1) of the Data Protection Act 1998, and the Information Commissioner’s Office guidance (eg “Determining What is Personal Data” and the Code of Practice on Anonymisation) this is very likely to be personal data and in many cases sensitive personal data. I’m curious to know why you are publishing such datasets in such form, and what the legal basis is to do so

Not receiving any reply, I then contacted the Information Commissioner’s Office, saying

It seems to me that they are processing (including disclosing) large amounts of sensitive personal data…I’m happy to elaborate to ICO if you want, but presume I wouldn’t need to explain exactly why I am concerned.

However, when I received the ICO case worker’s reply, I was rather dumbfounded

You have raised concerns that [redacted] is disclosing large amounts of sensitive personal data on…its website. For information to be personal data it has to relate to a living individual and allow that individual to be identified from the information. I have looked over some of the information…and it appears to be sharing generic data and figures. I could not see any information that identifies any individuals. In order to consider your concerns further it would be extremely helpful if you could provide some examples of where the sensitive personal data can be found and possibly provide a couple of screenshots.

Nonetheless, I replied, giving the two examples above, and the case worker further replied

I have now looked at the examples you have provided and agree that there is the potential for individuals to be identified from the information that [they are] publishing. We will now write to [them] about this matter to obtain some further information about its information rights practices. As this matter does not concern your personal data and relates to third party information we do not intend to write to you again about this matter

I thought the last sentence was a bit odd (nothing prevented them from keeping me informed) but took reassurance that the data would be removed or appropriately anonymised.

But nothing seemed to happen. So I chased the ICO at the end of November. No response. And now I’ve been forced to raise it with the ICO as a complaint:

I understand that you said you would not contact me again about this, but I note that the sensitive personal data is still online. I advise several public sector clients about the online publishing of datasets, with reference to the law and ICO guidance, and the lack of action on this…leaves me quite bemused – do I now advise clients that they are free to publish datasets with such specific and so many identifiers that individuals can be identified? If so, what legal basis do I point to to legitimise the processing?

Public authorities are increasingly being encouraged, as part of the transparency agenda, to make their data publicly available, and to make it available in reusable format, so that it can be subjected to analysis and further use. The ICO has produced generally helpful guidance on successful anonymisation, which enables personal data to be removed from datasets before publication. If public authorities fail to follow this guidance, and instead disclose sensitive personal data within those reusable datasets, they are potentially exposing individuals to considerable and various risks of harm. Moreover, much of the data in question is gathered pursuant to the public authority’s statutory duties – in other words, data subjects have no ability to opt out, or to refuse to give consent to the processing.

One has to ask what this does for the confidence of data subjects in Open Data and the transparency agenda.

I asked the ICO’s always very helpful press office if they wanted to comment, and an ICO spokesperson said: “This is an open case, and we continue to work with the council to explain our concerns about the amount of information being published.” Which raises interesting questions – if they have concerns (and I think I have amply explained here why those concerns are justified) why not take enforcement action to get the data taken down?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Uncategorized

Hospital episode data – confidential data uploaded by mistake

Rather hidden away in the new IIGOP annual report is a worrying and revealing report of a serious data breach involving hospital episode data

In February last year Tim Kelsey, NHS England’s National Director for Patients and Information, and vocal cheerleader for the care.data initiative, assured the public, in an interview on the Radio 4 Today programme, that in the twenty-five years that Hospital Episode Statistics (HES) have been shared with other organisations

the management of the hospital episode database…there has never been a single example of that data being compromised, the privacy of patients being compromised…

However, as Sir Nick Partridge’s Review of Data Releases by the NHS Information Centre in June of last year revealed, there had been

lapses in the strict arrangements that were supposed to be in place to ensure that people’s personal data would never be used improperly

As I said at the time

One waits with interest to see whether the [Information Commissioner’s Office (ICO)] will take any enforcement action, but I think it’s important that they consider doing so, because, even though Sir Nick makes nine very sensible recommendations to HSCIC, one could be forgiven – having been given clear assurances previously, by the likes of Tim Kelsey and others – for having reservations as to future governance of our confidential medical data

Now, with the launch of the first annual report of the Independent Information Governance Oversight Panel (IIGOP), chaired by Dame Fiona Caldicott and established at the request of the Secretary of State to “advise, challenge and report on the state of information governance across the health and care system in England”, we see further evidence of HES data “being compromised, the privacy of patients being compromised”. The report informs us of an incident whereby

New inspection procedures introduced by the HSCIC had uncovered a number of organisations which were sending HES data and failing to follow data dictionary standards. This meant they were inadvertently enabling personal confidential data to enter the data base. Following an alert to the Information Commissioners’ Office this was understood as a large scale problem, although having a low level potential impact, as the affected data fields were unknown to either senders or receivers of HES data. The relevant organisations were contacted to gain their cooperation in closing the breach, without alerting any unfriendly observer to the location of the confidential details. This was important to preserve the general ignorance of the detail of the breach and continue to protect individuals’ privacy. Trusts and others were encouraged to provide named contacts who would then start cleaning up their data flows to the HSCIC. In order to manage any untoward reporting in the media, trade titles were informed and briefed about the importance of restricting their reporting to avoid any risk of leading people towards this confidential data.

Now this to me seems pretty serious: the failure to “follow data dictionary standards” by data controller organisations sending HES data sounds very likely to be a contravention of those data controllers’ obligation, under section 4(4) of the Data Protection Act 1998 (DPA), to comply with the seventh data protection principle, which requires that they take

Appropriate technical and organisational measures…against unauthorised or unlawful processing of personal data

Serious contraventions, of a kind likely to cause substantial damage or substantial distress, can result in the ICO serving a monetary penalty notice, under section 55A of the DPA, to a maximum of £500,000.

So, what does one make of these incidents? It’s hard to avoid the conclusion that they would be held to be “serious”, and if the data in question had been misused, there would have been the potential for substantial damage and substantial distress – public disclosure of hospital record data could have a multitude of pernicious effects – and this much is evidenced by the fact that (successful) attempts had to be made to avoid the errors coming to light, including asking journalists to avoid reporting. But were they contraventions likely to cause these things? IIGOP suggests that they had a “low level potential impact” because the data was hidden within large amounts of non-offensive data, and I think it is probably the case that the incidents would not be held to have been likely to cause substantial damage or substantial distress (in Niebel, the leading case on monetary penalty notices, Wikeley J in the Upper Tribunal accepted that “likely” in s55A DPA took the same meaning attributed to it by Munby J in R (Lord) v Secretary of State for the Home Department [2003] EWHC 2073 (Admin), namely “‘likely’ meant something more than ‘a real risk’, i.e. a significant risk, ‘even if the risk falls short of being more probable than not'”).

But a monetary penalty notice is not the only action open to the ICO. He has the power to serve enforcement notices, under s40 DPA, to require data controllers to do, or refrain from doing, specified actions, or to take informal action such as requiring the signing of undertakings (to similar effect). Given that we have heard about these incidents from IIGOP, and in an annual report, it seems unlikely that any ICO enforcement action will be forthcoming. Perhaps that’s correct as a matter of law and as a matter of the exercise of discretion, but in my view the ICO has not been vocal enough about the profound issues raised by the amalgamation and sharing of health data, and the concerns raised by incidents of potentially inappropriate or excessive processing. Care.data of course remains on the agenda, and the IIGOP report is both revealing and encouragingly critical of what has taken place so far, but one would not want a situation to emerge where the ICO took a back seat and allowed IIGOP (which lacks regulatory and enforcement powers) to deal with the issue.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under care.data, Data Protection, data sharing, Information Commissioner, NHS

Data protection implications of sale of Tesco Clubcard company

News that Tesco is considering selling its loyalty card business Dunnhumby raises questions about what might happen to cardholders’ personal data

In 1995 the then Chairman of Tesco, Lord MacLaurin, reportedly said to the creators of the Tesco Clubcard scheme

What scares me about this is that you know more about my customers after three months than I know after 30 years.

Since then the sophistication and power of data analytics have increased exponentially and Dunnhumby claims it analyses data from 770 million-plus customers, about 16.5 million of whom are – it seems – Tesco Clubcard members. Dunnhumby, as a data processor for Tesco, processes the personal data of those millions of members, so what happens if the business is sold? Does the customer database also get sold? If so, what are the data protection implications?

Sales of customer databases can be effected lawfully and in compliance with the Data Protection Act 1998 (DPA), as the Information Commissioner’s Office explains in helpful guidance

When a database is sold, the seller must make sure that the buyer understands that they can only use the information for the purposes for which it was collected. Any use of this personal information should be within the reasonable expectations of the individuals concerned. So, when a database is sold, its use should stay the same or similar. For example, if the database contains information obtained for insurance, the database should only be sold to another insurance-based business providing similar insurance products. Selling it to a business for a different use is likely to be incompatible  with the original purpose and likely to go beyond the expectations of the individuals.

The operative words there are, I suggest, “expectations of the individuals concerned”. “Reasonable expectations” are strongly linked to the first principle in Schedule One of the DPA, which requires that “personal data shall be processed fairly and lawfully…”. The interpretative provisions in Part II of Schedule One explain that, broadly, for processing to be fair, data subjects should be told who is doing the processing, and why. These provisions are the genesis of the “privacy notices” and “privacy policies” which so few of us take the time to read. But the Clubcard privacy policy is where things might become problematic for Tesco in the event that they propose to sell Dunnhumby and cardholders’ data. As Twitter user @NoDPISigma points out, the Customer Charter says

We would like to reassure you that your personal details are safe with us and will never be released to companies outside the Tesco Group for their marketing purposes

and the separate Privacy and Cookies Policy also says

Your personal information is safe with us and will never be released to companies outside the Tesco Group for their marketing purposes

Although at first blush it is difficult to see that as anything other than an unequivocal promise that cardholders’ personal data will never be sold, the rub is in the phrase “for their marketing purposes”. If the sale of Dunnhumby and cardholders’ data is to another company in order that that other company can continue to operate the Clubcard scheme on behalf of Tesco then, as long as that was all that the data continued to be used for, I don’t think it would be a release of personal data to a company for that company’s marketing purposes. If, however, the purchasing company intended to use the data for its own marketing purposes, then the sale might be a breach of the charter promise – and, in that event, it would be strongly arguable that the sale could give rise to a serious contravention of Tesco’s obligation (at section 4(4) of the DPA) to comply with the fairness principle.

And among those 16.5 million Clubcard holders there are likely to be some awkward so-and-sos who might bring legal challenges in those circumstances.

[This post was edited because in its first draft it failed properly to consider the issue of data controller/processor. Thanks to Rich Greenhill for prompting me into a redraft]

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, marketing

Should victims of “revenge porn” be granted anonymity?

I got into an interesting Twitter discussion a few days ago with a journalist who had run a story* about a woman convicted under the Malicious Communications Act 1988 (MCA) for uploading a sex tape involving a former friend of hers. The story named the offender, but also the victim, and I asked Luke Traynor, the Mirror journalist, whether he had considered not naming the latter, who was the victim of what I described as a “sexual crime”. To his credit, Luke replied, saying that he’d “Checked the law, and she’s not a sexual crime victim, but a victim of malicious communication”.

I think Luke is partly correct – a victim of a section 1 MCA offence is not classed as a victim of a specified sexual offence pursuant to section 2 of the Sexual Offences (Amendment) Act 1992, and is not, therefore, automatically granted lifetime anonymity from the press under section 1. This is the case even where – as here – the crime was a targeted attempt to embarrass or damage the victim on the basis of their sexual behaviour. The Mirror even described this case as one of “Revenge Porn” and, indeed, moves are currently being made to create a specific offence of disclosing private sexual photographs and films with intent to cause distress (clause 33 of the Criminal Justice and Courts Bill refers). If that Bill is passed, I would argue that serious thought should be given to awarding anonymity to victims of this offence.

But merely because statutory anonymity was not available to the victim of the offence reported by the Mirror, it does not mean that it was right to name her, and (as you might expect from me) I think that data protection law is in play. Information relating to an identifiable individual’s sexual life is her sensitive personal data, afforded particular protection under the Data Protection Directive 95/46 and the UK Data Protection Act 1998 (DPA), which gives it domestic effect. Publication of sensitive personal data without one of the conditions in Schedule 3 of the DPA being met (and I cannot see which would be met in this instance) is, as a general rule, unlawful. There is, though, at section 32 of the DPA, as I have written about recently, an effective exemption from most of the Act for personal data processed only for the purposes of journalism. I suspect the Mirror, or any other media outlet naming the victim in this case, would claim this exemption, but it is important to note that, as broad as the exemption is, it can only be claimed if

the data controller reasonably believes that, having regard in particular to the special importance of the public interest in freedom of expression, publication would be in the public interest, and…the data controller reasonably believes that, in all the circumstances, compliance with that provision is incompatible with [journalism]

I invited Luke to explain whether he thought that publication of the victim’s name was in the public interest, but his reply

It was said in a public court, in accordance with the law, which takes into account ethics and public interest

did not really deal with the section 32 point – just because something was said in public court it does not mean that it is in the public interest to publish it. And unless Luke (or, rather, the Mirror, as data controller) reasonably believed that it was so, the exemption falls away.

Of course, in the absence of any complaint from the individual, all of this might seem otiose. But I think it raises further important issues about the extent of the section 32 exemption, as well as whether there should be some clearer right to privacy for victims of certain types of communications offences.

And, as Tim Turner pointed out, this sort of story shows why some might want to exercise a “right to be forgotten” – if unnecessary and unfair information is published about them on the internet, can some people be blamed for wanting it removed, or made less prominent?

*I have avoided linking directly to the article in question for reasons which should be obvious, given the content of this post. However, it is not difficult to find. That, of course, is the problem. 

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under communications offence, Data Protection, Privacy

Are we all journalists?

The ICO has said that Global Witness can claim the data protection exemption for journalism regarding their investigations into BSGR. This fascinating case continues to raise difficult and important questions.

Data protection law rightly gives strong protection to journalism; this is something that the 2012 Leveson inquiry dealt with in considerable detail, but, as the inquiry’s terms of reference were expressly concerned with “the press”, with “commercial journalism”, it didn’t really grapple with the rather profound question of “what is journalism?” But the question does need to be asked, because in the balancing exercise between privacy and freedom of expression too much weight afforded to one side can result in detriment to the other. If personal privacy is given too much weight, freedom of expression is weakened, but equally if “journalism” is construed too widely, and the protection afforded to journalism is consequently too wide, then privacy rights of individuals will suffer.

In 2008 the Court of Justice of the European Union (CJEU) was asked, in the Satamedia case, to consider the extent of the exemption from a large part of data protection law for processing of personal data for “journalistic” purposes. Article 9 of the European Data Protection Directive (the Directive) provides that

Member States shall provide for exemptions or derogations…for the processing of personal data carried out solely for journalistic purposes or the purpose of artistic or literary expression only if they are necessary to reconcile the right to privacy with the rules governing freedom of expression.

and recital 37 says

Whereas the processing of personal data for purposes of journalism or for purposes of literary or artistic expression, in particular in the audiovisual field, should qualify for exemption from the requirements of certain provisions of this Directive in so far as this is necessary to reconcile the fundamental rights of individuals with freedom of information and notably the right to receive and impart information

In Satamedia one of the questions the CJEU was asked to consider was whether the publishing of public-domain taxpayer data by two Finnish companies could be “regarded as the processing of personal data carried out solely for journalistic purposes within the meaning of Article 9 of the directive”. To this, the Court replied “yes”

Article 9 of Directive 95/46 is to be interpreted as meaning that the activities [in question], must be considered as activities involving the processing of personal data carried out ‘solely for journalistic purposes’, within the meaning of that provision, if the sole object of those activities is the disclosure to the public of information, opinions or ideas [emphasis added]

One can see that, to the extent that Article 9 is transposed effectively in domestic legislation, it affords significant and potentially wide protection for “journalism”. In the UK it is transposed as section 32 of the Data Protection Act 1998 (DPA). This provides that

Personal data which are processed only for the special purposes are exempt from any provision to which this subsection relates if—

(a) the processing is undertaken with a view to the publication by any person of any journalistic, literary or artistic material,

(b) the data controller reasonably believes that, having regard in particular to the special importance of the public interest in freedom of expression, publication would be in the public interest, and

(c) the data controller reasonably believes that, in all the circumstances, compliance with that provision is incompatible with the special purposes.

where “the special purposes” are one or more of “the purposes of journalism”, “artistic purposes”, and “literary purposes”. Section 32 DPA exempts data processed for the special purposes from all of the data protection principles (save the 7th, data security, principle) and, importantly, from the provisions of sections 7 and 10. Section 7 is the “subject access” provision, and normally requires a data controller, upon receipt of a written request by an individual, to inform them if their personal data is being processed, and, if it is, to give the particulars and to “communicate” the data to the individual. Section 10 broadly allows a data subject to object to processing which is likely to cause substantial damage or substantial distress, and to require the data controller to cease (or not begin) processing (and the data controller must either comply or state reasons why it will not). Personal data processed for the special purposes are, therefore, exempt from subject access and from the right to prevent processing likely to cause damage or distress. It is not difficult to see why – if the subject of, say, investigative journalism, could find out what a journalist was doing, and prevent her from doing it, freedom of expression would be inordinately harmed.

The issue of the extent of the journalistic data protection exemption came into sharp focus towards the end of last year, when Benny Steinmetz and three other claimants employed by or associated with mining and minerals group Benny Steinmetz Group Resources (BSGR) brought proceedings in the High Court under the DPA seeking orders that would require campaigning group Global Witness to comply with subject access requests by the claimants, and to cease processing their data. The BSGR claimants had previously asked the Information Commissioner’s Office (ICO), pursuant to the latter’s duties under section 42 DPA, to assess the likelihood of the lawfulness of Global Witness’s processing, and the ICO had determined that it was unlikely that Global Witness were complying with their obligations under the DPA.

However, under section 32(4) DPA, if, in any relevant proceedings, the data controller claims (or it appears to the court) that the processing in question was for the special purposes and with a view to publication, the court must stay the proceedings in order for the ICO to consider whether to make a specific “special purposes” determination. Such a determination would be (under section 45 DPA) that the processing was not for the special purposes nor was it with a view to publication, and it would result in a “special information notice”. Such a stay was applied to the BSGR proceedings and, on 15 December, after some considerable wait, the ICO conveyed to the parties that it was “satisfied that Global Witness is only processing the personal data requested … for the purposes of journalism”. Accordingly, no special information notice was served, and the proceedings remain stayed. Although media reports (e.g. Guardian and Financial Times) talk of appeals and tribunals, no direct appeal right exists for a data subject in these circumstances, so, if, as seems likely, BSGR want to revive the proceedings, they will presumably have to apply to have the stay lifted and/or issue judicial review proceedings against the ICO.

The case remains fascinating. It is easy to applaud a decision in which a plucky environmental campaign group claims journalistic data protection exemption regarding its investigations of a huge mining group. But would people be so quick to support, say, a fascist group which decided to investigate and publish private information about anti-fascist campaigners? Could that group also gain data protection exemption claiming that the sole object of their processing was the disclosure to the public of information, opinions or ideas? Global Witness say that

The ruling confirms that the Section 32 exemption for journalism in the Data Protection Act applies to anyone engaged in public-interest reporting, not just the conventional media

but it is not immediately clear from where they import the “public-interest” aspect – this does not appear, at least not in explicit terms, in either the Directive or the DPA. It is possible that it can be inferred, when one considers that processing for special purposes which is not in the public interest might constitute an interference with respect for data subjects’ fundamental rights and freedoms (per recital 2 of the Directive). And, of course, with talk about public interest journalism, we walk straight back into the arguments provoked by the Leveson inquiry.

Furthermore, one notes that the Directive talks about exemption for processing of personal data carried out solely for journalistic purposes, and the DPA says “personal data which are processed only for the special purposes are exempt…”. This was why I emphasised the words in the Satamedia judgment quoted above, which talks similarly of the exemption applying if the “sole object of those activities is the disclosure to the public of information, opinions or ideas”. One might ask whether a campaigning group’s sole or only purpose for processing personal data is journalism. Might they not, in processing the data, be trying to achieve further ends? Might, in fact, one say that the people who engage solely in the disclosure to the public of information, opinions or ideas are in fact those we more traditionally think of in these terms…the press, the commercial journalists?

P.S. Global Witness have uploaded a copy of the ICO’s decision letter. This clarifies that the latter was satisfied that the former was processing for the special purposes because it was part of “campaigning journalism”, even though the proposed future publication of the information “forms part of a wider campaign to promote a particular cause”. This chimes with the ICO’s data protection guidance for the media, but it will be interesting to see whether it is challenged on the basis that it doesn’t support a view that the processing is “only” or “solely” for the special purposes.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Directive 95/46/EC, Information Commissioner, journalism, Leveson

Chris Graham and the cost of FOI tribunals

When Information Commissioner (IC) Christopher Graham speaks, people listen. And so they should: he is the statutory regulator of the Freedom of Information Act 2000 (FOIA) whose role is “to uphold information rights in the public interest”. A speech by Graham is likely to be examined carefully, to see if it gives indications of future developments, and this is the reason I am slightly concerned by a particular section of his recent speech at an event in Scotland looking at ten years of the Scottish FOI Act.

The section in question dealt with his envy of his Scottish counterparts. They, he observed, have relatively greater resources, and the Scottish Information Commissioner, unlike him, has a constitutional status that bolsters her independence. But he also envied

the simple and straightforward appeals mechanism in the Scottish legislation. The Scottish Commissioner’s decision is final, subject only to an appeal to the Court of Session on a point of law.

By contrast, in England, Wales and Northern Ireland, under section 57 of FOIA, there is a right of appeal to a tribunal (the First-tier Tribunal (Information Rights)). Under section 58(2) the Tribunal may review any finding of fact by the IC – this means that the Tribunal is able to substitute its own view for that of the commissioner. In Scotland, as Graham indicates, the commissioner’s decision can be overturned only if it was wrong as a matter of law.

But there is another key difference arising from the different appellate systems: an appeal to the Tribunal is free, whereas in Scotland an application to the Court of Session requires a fee to be paid (currently £202). Moreover, a court is a different creature to a tribunal: the latter aims to “adopt procedures that are less complicated and more informal” and, as Sir Andrew Leggatt noted in his key 2001 report Tribunals for Users: One System, One Service

Tribunals are intended to provide a simple, accessible system of justice where users can represent themselves

It is very much easier for a litigant to represent herself in the Information Tribunal than it would be in a court.

Clearly, the situation as it currently obtains in England, Wales and Northern Ireland – a free right of appeal to a Tribunal which can take a merits view of the case – will lead to more appeals, but isn’t that rather the point? There should be a straightforward way of challenging the decisions of a regulator on access to information matters. Graham bemoans that he is “having to spend too much of my very limited resources on Tribunals and lawyers”, and I could have more sympathy if this were purely wasted expenditure – if the appeals made were futile and changed nothing – but the figures don’t bear this out. Graham says that this year there have been 179 appeals; I don’t know where his figures are from, but from a rough totting-up of the cases listed on the Tribunal’s website I calculated that there have been about 263 decisions promulgated this year, of which 42 were successful. So, very far from showing an appeal to be a futile exercise, these figures suggest that roughly 1 in 6 appeals was successful (at least in the first instance). What is also notable, though, is the small but significant number of consent orders – nine this year. A consent order will result where the parties no longer contest the proceedings, and agree on terms to conclude them. It is speculation on my part, but I would be very interested to know how many of those nine orders resulted from the IC deciding, on the arguments submitted, that his position was no longer sustainable.

What I’m getting at is that the IC doesn’t always get things right in the first instance; therefore, a right of appeal to an independent fact-finding tribunal is a valuable one for applicants. I think it is something we should be proud of, and we should feel sorry for FOI applicants in Scotland who are forced into court litigation (and proving an error of law) in order to challenge a decision there.

Ultimately, the clue to Graham’s disapproval of the right of appeal to Tribunal lies in the words “limited resources”. I do sympathise with his position – FOI regulation is massively underfunded by the government, and I rather suspect that, with better resourcing, Graham would take a different view. But I think his speech was particularly concerning because the issue of whether there should be a fee for bringing a case in the Tribunal was previously raised by the government, in its response to post-legislative scrutiny of FOIA. Things have gone rather quiet on this since, but might Graham’s speech herald the revival of such proposals?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under access to information, Freedom of Information, Information Commissioner, Information Tribunal

Hidden data in FOI disclosures

The Hackney Gazette reports that details of 15,000 residents have been published on the internet after Hackney Council apparently inadvertently disclosed the data when responding to a Freedom of Information (FOI) request made using the WhatDoTheyKnow site.

This is not the first time that such apparently catastrophic inadvertent disclosures have happened through WhatDoTheyKnow, and, indeed, in 2012 MySociety, who run the site, issued a statement following a similar incident with Islington Council. As that made clear

responses sent via WhatDoTheyKnow are automatically published online without any human intervention – this is the key feature that makes this site both valuable and popular

It is clearly the responsibility of the authorities in question to ensure that no hidden or exempt information is included in FOI disclosures via WhatDoTheyKnow, or indeed in FOI disclosures in general. A failure to have appropriate organisational and technical safeguards in place can lead to enforcement action by the Information Commissioner’s Office for contraventions of the Data Protection Act 1998 (DPA): Islington ended up with a monetary penalty notice of £70,000 for their incident, which involved 2,000 people. Although the number of data subjects involved is not the only factor the ICO will take into account when deciding what action to take, it is certainly a relevant one: 15,000 affected individuals is a hell of a lot.

What concerns me is that this sort of thing keeps happening. We don’t know the details of this incident yet, but with such large numbers of data subjects involved it seems likely that it will have involved some sort of dataset, and I would not be at all surprised if it involved purportedly masked or hidden data, such as in a pivot table [EDIT – I’m given to understand that this incident involved cached data in MS Excel]. Around the time of the Islington incident the ICO’s Head of Policy Steve Wood published a blog post drawing attention to the risks. A warning also takes the form of a small piece on a generic page about request handling, which says

take care when using pivot tables to anonymise data in a spreadsheet. The spreadsheet will usually still contain the detailed source data, even if this is hidden and not immediately visible at first glance. Consider converting the spreadsheet to a plain text format (such as CSV) if necessary.
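For those preparing disclosures, a rough illustration of the kind of pre-publication check that addresses both problems mentioned above (hidden or “very hidden” sheets travelling with the file, and source data sitting behind a pivot table or cached in the workbook) might look something like the sketch below. It is written in Python using openpyxl and pandas; the file and sheet names are hypothetical, and this is an illustration of the principle rather than a complete safeguard.

```python
# Sketch: inspect a workbook before FOI disclosure, then export only the cell
# values of the intended sheet to CSV. The file and sheet names are hypothetical.
import pandas as pd
from openpyxl import load_workbook

SOURCE = "foi_response.xlsx"   # the workbook you intend to disclose
INTENDED_SHEET = "Summary"     # the only sheet that should be released
OUTPUT = "foi_response.csv"    # plain-text output containing cell values only

wb = load_workbook(SOURCE)

# 1. List every sheet and its visibility. Hidden and "very hidden" sheets travel
#    with the file even though they are not shown on screen.
for name in wb.sheetnames:
    state = wb[name].sheet_state  # 'visible', 'hidden' or 'veryHidden'
    if name != INTENDED_SHEET:
        print(f"NOTE: sheet '{name}' ({state}) will not be exported; "
              "check whether it holds source data that must not be released")

# 2. Export only the cell values of the intended sheet. A CSV file has no concept
#    of hidden sheets, pivot caches or embedded objects, so none of them survive.
pd.read_excel(SOURCE, sheet_name=INTENDED_SHEET).to_csv(OUTPUT, index=False)
print(f"Wrote {OUTPUT}; review it by eye before uploading")
```

Even then, the exported CSV needs checking by a human: conversion strips hidden structure, but it will not catch personal data sitting in plain sight in the wrong column.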

This is fine, but does it go far enough? Last year I wrote on the Guardian website, calling for greater efforts to be made to highlight the issue. I think that what I wrote then still holds

The ICO must work with the government to offer advice direct to chief executives and those responsible for risk at councils and NHS bodies (and perhaps other bodies, but these two sectors are probably the highest risk ones). So far these disclosure errors do not appear to have led to harm to those individuals whose private information was compromised, but, without further action, I fear it is only a matter of time.

Time will tell whether this Hackney incident results in a finding of DPA contravention, and ICO enforcement, but in the interim I wish word would spread about how to avoid disclosing hidden data in spreadsheets.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under Data Protection, Freedom of Information, Information Commissioner, monetary penalty notice

The Twelve Days of FOI Christmas

For fans of contrived, awful-punning seasonal blog posts that take 20 times longer to write than you imagined when you started, I present…

On the first day of Xmas FOI revealed to me…cartridges for the army

On the second day of Xmas FOI revealed to me two turtle docs and cartridges for the army

On the third day of Xmas FOI revealed to me 3 pinched hens*, two turtle docs and cartridges for the army

On the fourth day of Xmas FOI revealed to me four NADPO nerds, 3 pinched hens, two turtle docs and cartridges for the army

On the fifth day of Christmas FOI revealed to me FIVE GOLD THINGS, four NADPO nerds, 3 pinched hens, two turtle docs and cartridges for the army

On the sixth day of Christmas FOI revealed to me Six Tree Inspections, FIVE GOLD THINGS, four NADPO nerds, 3 pinched hens, two turtle docs and cartridges for the army

On the seventh day of Christmas FOI revealed to me Seven Dons-a-Sinning, Six Tree Inspections, FIVE GOLD THINGS, four NADPO nerds, 3 pinched hens, two turtle docs and cartridges for the army

On the eighth day of Christmas FOI revealed to me Eight-year-olds Bilking, Seven Dons-a-Sinning, Six Tree Inspections, FIVE GOLD THINGS, four NADPO nerds, 3 pinched hens, two turtle docs and cartridges for the army

On the ninth day of Christmas FOI revealed to me Nine Babies’ chances, Eight-year-olds Bilking, Seven Dons-a-Sinning, Six Tree Inspections, FIVE GOLD THINGS, four NADPO nerds, 3 pinched hens, two turtle docs and cartridges for the army

On the tenth day of Christmas FOI revealed to me Ten Lords-a-Judging, Nine Babies’ chances, Eight-year-olds Bilking, Seven Dons-a-Sinning, Six Tree Inspections, FIVE GOLD THINGS, four NADPO nerds, 3 pinched hens, two turtle docs and cartridges for the army

On the eleventh day of Christmas FOI revealed to me Eleven-plus deciding, Ten Lords-a-Judging, Nine Babies’ chances, Eight-year-olds Bilking, Seven Dons-a-Sinning, Six Tree Inspections, FIVE GOLD THINGS, four NADPO nerds, 3 pinched hens, two turtle docs and cartridges for the army

On the twelfth day of Christmas FOI revealed to me Twelve-Tonne Containers, Eleven-plus deciding, Ten Lords-a-Judging, Nine Babies’ chances, Eight-year-olds Bilking, Seven Dons-a-Sinning, Six Tree Inspections, FIVE GOLD THINGS, four NADPO nerds, 3 pinched hens, two turtle docs and cartridges for the army

*3 large maram hens, page 9, if you were wondering

Filed under Freedom of Information, nonsense