FOI disclosure of personal data: balancing of interests

In June this year I blogged about the case of AB v A Chief Constable (Rev 1) [2014] EWHC 1965 (QB). In that case, Mr Justice Cranston had held that, when determining whether personal data is being or has been processed “fairly” (pursuant to the first principle of Schedule One of the Data Protection Act 1998 (DPA))

assessing fairness involves a balancing of the interests of the data subject in non-disclosure against the public interest in disclosure [¶75]

I was surprised by this reading of an interests balance into the first principle, and said so in my post. Better people than I disagreed, and I am certainly even less sure now of the correctness of my view than I was then.

In any case, the binding authority of the High Court rather trumps my meanderings, and it is cited in a recent decision of the First-tier Tribunal (Information Rights) in support of a ruling that the London Borough of Merton Council must disclose, under the Freedom of Information Act 2000 (FOIA), an email sent to a cabinet member of that council by Stephen Hammond MP. The Tribunal, in overturning the decision of the Information Commissioner, considered the private interests of Mr Hammond, including the fact that he had objected to the disclosure, but felt that these did not carry much weight:

we do not consider anything in the requested information to be particularly private or personal and that [sic] this substantially weakens the weight of interest in nondisclosure…We accept that Mr Hammond has objected to the disclosure, which in itself carries some weight as representing his interests. However, asides from an expectation of a general principle of non-disclosure of MP correspondence, we have not been given any reason for this. We have been given very little from the Commissioner to substantiate why Members of Parliament would have an expectation that all their correspondence in relation to official work remain confidential

and balanced against these were the public interests in disclosure, including

no authority had been given for the statement [in the ICO’s decision notice] that MPs expect that all correspondence to remain confidential…[;]…withholding of the requested information was not compatible with the principles of accountability and openness, whereby MPs should subject themselves to public scrutiny, and only withhold information when the wider public interest requires it…[;]…the particular circumstances of this case [concerning parking arrangements in the applicant’s road] made any expectation of confidentiality unreasonable and strongly indicated that disclosure would be fair

The arguments weighed, said the Tribunal, strongly in favour of disclosure.

A further point fell to be considered, however: for processing of personal data to be fair and lawful (per the first data protection principle), a condition in Schedule Two DPA must also be met, beyond any general considerations. The relevant one, condition 6(1), requires that

The processing is necessary for the purposes of legitimate interests pursued by the data controller or by the third party or parties to whom the data are disclosed, except where the processing is unwarranted in any particular case by reason of prejudice to the rights and freedoms or legitimate interests of the data subject

It has to be noted that “necessary” here in the DPA imports a human rights proportionality test and it “is not synonymous with ‘indispensable’…[but] it implies the existence of a ‘pressing social need'” (The Sunday Times v United Kingdom (1979) 2 EHRR 245). The Tribunal, in what effectively was a reiteration of the arguments about general “fairness”, accepted that the condition would be met in this case, citing the applicant’s arguments, which included the fact that

disclosure is necessary to meet the public interest in making public what Mr Hammond has said to the Council on the subject of parking in Wimbledon Village, and that as an elected MP, accountable to his constituents, disclosure of such correspondence cannot constitute unwarranted prejudice to his interests.

With the exception of certain names within the requested information, the Tribunal ordered disclosure.  Assessing “fairness” now, following Mr Justice Cranston, and not following me, clearly does involve balancing the interests of the data subject against the public interest in disclosure.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Making an FOI request to oneself…

Can the executive of a local authority make an FOI request to itself?

The Brighouse Echo reveals that Stephen Baines (no relation, of course), the Leader of Calderdale Council, resorted to submitting a Freedom of Information (FOI) request in exasperation, after apparently failing to get answers from officers at the Council

I asked officers on November 10 if there was any truth in these allegations [about officers ignoring warnings about the legality of a parking scheme], and I hadn’t received a reply, and last Friday I’d had enough – I finally lost it and put in a Freedom of Information request. It’s highly probable that I’m the first council leader to have done this, but I was just getting so frustrated.

But did he need to make an FOI request? In fact, could he even make an FOI request?

I would say that it is strongly arguable that in a council operating executive arrangements – as Calderdale does – under section 9C(3) of the Local Government Act 2000 (LGA 2000), whereby a Leader with a Leader-appointed Cabinet constitute the executive, the executive are deemed generally to be in control of information relating to the council’s functions. So in general terms, the Leader and Cabinet are “the Council”. Section 9D(3) of LGA 2000 provides that “any function of the local authority which is not specified in regulations…is to be the responsibility of an executive of the authority under executive arrangements” (the regulations in question are The Local Authorities (Functions and Responsibilities) (England) Regulations 2000 (as amended)). Put another way, the executive are the ones who should take any decision on access to documents, rather than officers (other than officers who have had that decision delegated to them). The exceptions to this general principle would be where the documents relate to functions which are not the responsibility of the executive. Effectively, the executive will be the possessors/controllers of all council information for which the executive has the functional responsibility.

I feel bolstered in this suggestion by Part 5 of The Local Authorities (Executive Arrangements) (Meetings and Access to Information) (England) Regulations 2012. This gives “Additional rights of [access of] members of the local authority and of members of overview and scrutiny committees”, and regulations 16 and 17 talk in terms of the right of a member, or a member of an overview and scrutiny committee, to inspect certain documents which are “in the possession or under the control of the executive of a local authority”. No interpretative guidance is given as to what “in the possession or under the control of the executive of a local authority” means, but it is clear that there must be a category of documents which are “in the possession or under the control of the executive of a local authority”. That being the case, one might ask “which documents are not ‘in the possession or under the control of the executive of a local authority’?” To which I am tempted to answer “those which do not relate to the functions for which the executive has responsibility”.

Take, for instance, the function of a local authority to provide library services (section 7 of the Public Libraries and Museums Act 1964). This function is the responsibility of the executive (because the regulations do not specify otherwise). Delivery of the function will normally be by delegation to officers, but I cannot see how those officers, or others, could then restrict a member of the executive from seeing a document relating to the exercise of executive functions. And if, as I understand is the case, civil enforcement of parking contraventions is also an executive function (surely delegated to officers), one wonders whether officers can restrict a Leader from seeing a document relating to the exercise of that specific function.

So, my argument goes, a leader of a council cannot make an FOI request to the council for information about the exercise of an executive function, because in that regard he is the council. Comments welcomed!

And n.b. I have not even begun to consider where a councillor’s, or a leader’s, common law right to know fits into this…

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


The wrong test for anonymisation?

UPDATE: 23.01.15 The ICO has responded [.doc file] to my request for a review of their decision. I drew their attention to the arguments on this page but they don’t even mention them, let alone provide a counter-analysis, in dismissing my complaints (“Having reviewed the matter, I agree with the explanations provided”). I am invited by the ICO to consider taking my own legal action. I understand that the ICO and I might have differing views on a DPA matter, but what I find difficult to accept is the refusal even to enter into a discussion with me about the detailed arguments I’ve made. END UPDATE

In February this year I asked the Information Commissioner’s Office (ICO) to investigate reports that Hospital Episode Statistics (HES) data had apparently been sold to an actuarial society by the NHS Information Centre (NHSIC), the predecessor to the Health and Social Care Information Centre (HSCIC). Specifically I requested, as a data subject can under s42 of the Data Protection Act 1998 (DPA), that the ICO assess whether it was likely or not that the processing of my personal data by NHSIC and others had been in compliance with the DPA.

Nine months later, I was still awaiting the outcome. But a clue to how the assessment would turn out was contained in the text of Sir Nick Partridge’s six month review of various data releases by NHSIC (his original report in June seemed to me to point to multiple potential DPA contraventions). In the review document he says

Six investigations have been separately instigated by the HSCIC or Information Commissioner’s Office (ICO) and shared with both parties as these focussed on whether individuals were at risk of being identified. In the cases it has investigated, the ICO has upheld the HSCIC approach and informed us that it has “seen no evidence to suggest that re-identification has occurred or is reasonably likely to occur.”

And sure enough, after chasing the ICO for the outcome of my nine-month wait, I received this (in oddly formatted text, which rather whiffed of a lot of cutting-and-pasting):

Following the recent issue regarding HSCIC, PA Consulting, and Google we investigated the issue of whether HES data could be considered personal data. This detailed work involved contacting HSCIC, PA Consulting, and Google and included the analysis of the processes for the extraction and disclosure of HES data both generally and in that case in particular. We concluded that we did not consider that the HES dataset constitutes personal data.Furthermore we also investigated whether this information had been linked to other data to produce “personal data” which was subject to the provisions of the Act. We have no evidence that there has been any re-identification either on the part of PA Consulting or Google. We also noted that HSCIC have stated that the HES dataset does not include individual level patient data even at a pseudonymised level. Our view is that the data extracted and provided to PA Consulting did not identify any individuals and there was no reasonable likelihood that re-identification would be possible.

I have added the emphasis to the words “reasonably likely” above. They appear in similar terms in the Partridge Review, and they struck me as rather odd. An awful lot of analysis has taken place, and continues to take place, on the subject of when personal data can be “rendered fully anonymous in the sense that it is information from which the data subject is no longer identifiable” (Lord Hope’s dictum in Common Services Agency v Scottish Information Commissioner [2008] UKHL 47). Some of that analysis has been academic, and some takes the form of “soft law” guidance, for instance Opinion 05/2014 of the Article 29 Working Party, and the ICO Anonymisation Code of Practice. The former draws on the Data Protection Directive 95/46/EC, and notes that

Recital 26 signifies that to anonymise any data, the data must be stripped of sufficient elements such that the data subject can no longer be identified. More precisely, that data must be processed in such a way that it can no longer be used to identify a natural person by using “all the means likely reasonably to be used”

Anonymisation has also been subject to judicial analysis, notably in the Common Services Agency case but, even more pertinently, in the judgment of Mr Justice Cranston in Department of Health v Information Commissioner [2011] EWHC 1430 (Admin). The latter case, involving the question of disclosure of late-term abortion statistics, is by no means an easy judgment to parse (ironically so, given that it makes roughly the same observation about the Common Services Agency case). The judge held that the First-tier Tribunal had been wrong to say that the statistics in question were personal data, but that it had on the evidence been entitled to say that “the possibility of identification by a third party from these statistics was extremely remote”. The fact that the possibility of identification by a third party was extremely remote meant that “the requested statistics were fully anonymised” (¶55). I draw from this that, for personal data to be anonymised in statistical format, the possibility of identification of individuals by a third party must be extremely remote. The ICO’s Anonymisation Code, however, says of the case:

The High Court in the Department of Health case above stated that the risk of identification must be greater than remote and reasonably likely for information to be classed as personal data under the DPA [emphasis added]

But this seems to me to be an impermissible description of the case – the High Court did not state what the ICO says it stated: the phrases “greater than remote” and “reasonably likely” do not appear in the judgment. And that phrase “reasonably likely” is one that, as I say, makes its way into the Partridge Review, and into the ICO’s assessment of the lawfulness of the HES data “sale”.

I begin to wonder whether the ICO has taken the phrase from recital 26 of the Directive, which talks about the need to consider “all the means likely reasonably to be used” to identify an individual, and transformed it into a position under which, if identification is not reasonably likely, it will accept that data are anonymised. This cannot be right: there is a world of difference between a test which asks whether the possibility of identification is “extremely remote” and one which asks whether it is “reasonably likely”.

I do not have a specific right to a review of the section 42 assessment decision that the processing of my personal data was likely to have been in compliance with NHSIC’s obligations under the DPA, but I have asked for one. I am aware, of course, that others complained (après moi, le déluge) – notably, in March, FIPR, MedConfidential and Big Brother Watch. I suspect they will also be pursuing this.

In October this year I attended an event at which the ICO’s Iain Bourne spoke. Iain was a key figure in the drawing up of the ICO’s Anonymisation Code, and I took the rather cheeky opportunity to ask about the HES investigations. He said that his initial view was that NHSIC had been performing good anonymisation practice. This reassured me at the time, but now, after considering this question of whether the Anonymisation Code (and the ICO) adopts the wrong test on the risks of identification, I am less reassured. Maybe “reasonably likely that an individual can be identified” is an appropriate test for determining when data is no longer anonymised, and becomes personal data, but it does not seem to me that the authorities support it.

Postscript: Back in August of this year I alerted the ICO to the fact that a local authority had published open data sets which enabled individuals to be identified (for instance, social care and housing clients). More than four months later the data is still up (despite the ICO saying they would raise the issue with the council): is this perhaps because the council has argued that identification is not “reasonably likely”?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Russell Brand and the domestic purposes exemption in the Data Protection Act

Was a now-deleted tweet by Russell Brand, revealing a journalist’s private number, caught by data protection law?

Data protection law applies to anyone who “processes” (which includes “disclosure…by transmission”) “personal data” (data relating to an identifiable living individual) as a “data controller” (the person who determines the purposes for which and the manner in which the processing occurs). Rather dramatically, in strict terms, this means that most individuals actually and regularly process personal data as data controllers. And nearly everyone would be caught by the obligations under the Data Protection Act 1998 (DPA), were it not for the exemption at section 36. This provides that

Personal data processed by an individual only for the purposes of that individual’s personal, family or household affairs (including recreational purposes) are exempt from the data protection principles and the provisions of Parts II and III

Data protection nerds will spot that exemption from the data protection principles and Parts II and III of the DPA is effectively an exemption from the whole Act. So, in general terms, individuals who restrict their processing of personal data to domestic purposes are outwith the DPA’s ambit.

The extent of this exemption, in terms of publication of information on the internet, is subject to some disagreement. On one side is the Information Commissioner’s Office (ICO), who say in their guidance that it applies when an individual uses an online forum purely for domestic purposes; on the other side is the Court of Justice of the European Union (and me), which said in the 2003 Lindqvist case that

The act of referring, on an internet page, to various persons and identifying them by name or by other means, for instance by giving their telephone number…constitutes ‘the processing of personal data…[and] is not covered by any of the exceptions in Article 3(2) of Directive 95/46 [section 36 of the DPA transposes Article 3(2) into domestic law]

Nonetheless, it is clear that publishing personal data on the internet for reasons not purely domestic constitutes an act of processing to which the DPA applies (let us assume that the act of publishing was a deliberate one, determined by the publisher). So when the comedian Russell Brand today decided to tweet a picture of a journalist’s business card, with an arrow pointing towards the journalist’s mobile phone number (which was not, for what it’s worth, already in the public domain – I checked with a Google search), he was processing that journalist’s personal data (note that data relating to an individual’s business life is still their personal data). Can he avail himself of the DPA domestic purposes exemption? No, says the CJEU, of course, following Lindqvist. But no, also, would surely say the ICO: this act by Brand was not purely domestic. Brand has 8.7 million twitter followers – I have no doubt that some will have taken the tweet as an invitation to call the journalist. It is quite possible that some of those calls will be offensive, or abusive, or even threatening.

Whilst I have been drafting this blog post Brand has deleted the tweet: that is to his credit. But of course, when you have so many millions of followers, the damage is already done – the picture is saved to hard drives, is mirrored by other sites, is emailed around. And, I am sure, the journalist will have to change his number, and maybe not much harm will have been caused, but the tweet was nasty, and unfair (although I have no doubt Brand was provoked in some way). If it was unfair (and lacking a legal basis for the publication) it was in contravention of the first data protection principle which requires that personal data be processed fairly and lawfully and with an appropriate legitimating condition. And because – as I submit –  Brand cannot plead the domestic purposes exemption, it was in contravention of the DPA. However, whether the journalist will take any private action, and whether the ICO will take any enforcement action, I doubt.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Naming and shaming the innocent

Around this time last year I wrote two blog posts about two separate police forces’ decisions to tweet the names of drivers charged with (but not – yet, at least – convicted of) drink driving offences. In the latter example Staffordshire police were actually using the hashtag #drinkdriversnamedontwitter, and I argued that

If someone has merely been charged with an offence, it is contrary to the ancient and fundamental presumption of innocence to shame them for that fact. Indeed, I struggle to understand how it doesn’t constitute contempt of court to do so, or to suggest that someone who has not been convicted of drink-driving is a drink driver. Being charged with an offence does not inevitably lead to conviction. I haven’t been able to find statistics relating to drink-driving acquittals, but in 2010 16% of all defendants dealt with by magistrates’ courts were either acquitted or not proceeded against

The Information Commissioner’s Office investigated whether there had been a breach of the first principle of Schedule One of the Data Protection Act 1998 (DPA), which requires that processing of personal data be “fair and lawful”, but decided to take no action after Staffs police agreed not to use the hashtag again, saying

Our concern was that naming people who have only been charged alongside the label ‘drink-driver’ strongly implies a presumption of guilt for the offence. We have received reassurances from Staffordshire Police the hashtag will no longer be used in this way and are happy with the procedures they have in place. As a result, we will be taking no further action.

But my first blog post had raised questions about whether the mere naming of those charged was in accordance with the same DPA principle. Newspaper articles talked of naming and “shaming”, but where is the shame in being charged with an offence? I wondered why Sussex police didn’t correct those newspapers who attributed the phrase to them.

And this year Sussex police, as well as neighbouring Surrey, and Avon and Somerset, are doing the same thing: naming drivers charged with drink driving offences on twitter or elsewhere online. The media happily describe this as a “naming and shaming” tactic, and I have not seen the police disabusing them, although Sussex police did at least enter into a dialogue with me and others on twitter, in which they assured us that their actions were in pursuit of open justice, and that they were not intending to shame people. However, this doesn’t appear to tally with the understanding of the Sussex Police and Crime Commissioner, who said earlier this year

I am keen to find out if the naming and shaming tactic that Sussex Police has adopted is actually working

But I also continue to question whether the practice is in accordance with police forces’ obligations under the DPA. Information relating to the commission or alleged commission by a person of an offence is that person’s sensitive personal data, and for processing to be fair and lawful a condition in each of Schedule Two and, particularly, Schedule Three must be met. And I struggle to see which Schedule Three condition applies – the closest is probably

The processing is necessary…for the administration of justice

But “necessary”, in the DPA, imports a proportionality test of the kind required by human rights jurisprudence. The High Court, in the MPs’ expenses case, cited the European Court of Human Rights in The Sunday Times v United Kingdom (1979) 2 EHRR 245 to the effect that

while the adjective “necessary”, within the meaning of article 10(2) [of the European Convention on Human Rights] is not synonymous with “indispensable”, neither has it the flexibility of such expressions as “admissible”, “ordinary”, “useful”, “reasonable” or “desirable” and that it implies the existence of a “pressing social need.”

and went on to hold, therefore, that “necessary” in the DPA

should reflect the meaning attributed to it by the European Court of Human Rights when justifying an interference with a recognised right, namely that there should be a pressing social need and that the interference was both proportionate as to means and fairly balanced as to ends

So is there a pressing social need to interfere with the rights of people charged with (and not convicted of) an offence, in circumstances where the media and others portray the charge as a source of shame? Is it proportionate and fairly balanced to do so? One consideration might be whether the same police forces name all people charged with an offence. If the intent is to promote open justice, then it is difficult to see why one charging decision should merit online naming, and others not.

But is the intent really to promote open justice? Or is it to dissuade others from drink-driving? Supt Richard Corrigan of Avon and Somerset police says

This is another tool in our campaign to stop people driving while under the influence of drink or drugs. If just one person is persuaded not to take to the road as a result, then it is worthwhile as far as we are concerned.

and Sussex police’s Chief Inspector Natalie Moloney says

I hope identifying all those who are to appear in court because of drink or drug driving will act as a deterrent and make Sussex safer for all road users

which firstly fails to use the word “alleged” before “drink or drug driving”, and secondly – like Supt Corrigan’s comment – suggests that the purpose of naming is not to promote open justice, but rather to deter drink drivers.

Deterring drink driving is certainly a worthy public aim (and I stress that I have no sympathy whatsoever with those convicted of such offences) but should the sensitive personal data of those who have not been convicted of any offence be used to their detriment in pursuance of that aim?

I worry that unless such naming practices are scrutinised, and challenged when they are unlawful and unfair, they will spread, and social “shame” will increasingly be visited on the innocent. I hope the Information Commissioner investigates.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


ICO confirm they are considering enforcement action over #samaritansradar app

FOI response from ICO refuses disclosure of correspondence with Samaritans because it could prejudice ongoing investigations

On 12 November I asked the Information Commissioner’s Office to disclose to me, under the Freedom of Information Act (FOIA), information relating to their assessment of the legality of the “Samaritans Radar” app (see blog posts passim).

The ICO have now responded to me, refusing to disclose because of the FOIA exemption for “law enforcement”. As the ICO say

The exemption at section 31(1)(g) of the FOIA refers to circumstances where the disclosure of information “would, or would be likely to, prejudice – … the exercise by any public authority of its functions for any of the purposes specified in subsection (2).”

The purposes referred to in sections 31(2)(a) and (c) are –

“(a) the purpose of ascertaining whether any person has failed to comply with the law” and

“(c) the purpose of ascertaining whether circumstances which would justify regulatory action in pursuance of any enactment exist or may arise…”

Clearly, these purposes apply when the Information Commissioner is considering whether or not an organisation has breached the Data Protection Act

But the exemption is subject to a public interest test, and the ICO acknowledge that there is public interest in the matter, particularly in how Samaritans have responded to their enquiries. Nonetheless, as the investigation is ongoing, and as no decision has apparently been made about whether enforcement action should be taken, the balance in the public interest test falls on the side of non-disclosure.

The question of potential enforcement action is an interesting one. As well as the power to serve monetary penalty notices (to a maximum of £500,000), the ICO can issue enforcement notices, requiring organisations (which are data controllers, as I maintain Samaritans were for the app) to cease, or not begin, processing personal data for specific purposes. They can also ask data controllers to sign undertakings to take, or not take, specific action. This is of interest because Samaritans have indicated that they might want to launch a reworked version of the app.

It is by no means certain that enforcement action will result – the ICO are likely to be reluctant to enforce against a generally admirable charity – but the fact that it is being considered is in itself of interest.

The ICO acknowledge that the public interest in maintaining this particular exemption wanes once the specific investigation has been completed. Consequently I have asked them, outwith FOIA, to commit to disclosing this information proactively once the investigation has finished. They have no obligation to do so, but it would be to the benefit of public transparency, which their office promotes, if they did.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


PARKLIFE! (and a £70k monetary penalty)

In August this year I reported that the Information Commissioner’s Office (ICO) had effectively conceded it had no current powers to issue monetary penalties on spam texters. This was after the Upper Tribunal had indicated that in most cases the sending of such texts was not likely to cause substantial damage or substantial distress (this being part of the statutory test for serving a monetary penalty notice (MPN) for a serious contravention of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR)).

What I’d forgotten were the reports of highly distasteful and in some cases highly distressing texts sent in May to festival-goers by the organisers of the Parklife festival in Manchester’s Heaton Park. The texts didn’t disclose that they were from the event organisers, but instead purported to come from “Mum” and were advertising extra events at the festival.

Regulation 23 of PECR outlaws the sending of direct marketing texts (and other direct marketing electronic communications) where the sender’s identity has been disguised or concealed.

As the Manchester Evening News reported at the time, receiving the texts in question left many recipients who had lost their mothers distressed and upset.

And so it came to pass that, as the same newspaper reveals today, the ICO investigated complaints about the marketing, and appears to have determined that the sending of the texts was a serious contravention of PECR regulation 23, and that it was of a kind likely to cause substantial distress. The paper reveals that an MPN of £70,000 has been served on the organisers; the ICO has confirmed this on its website, and the MPN itself lists a number of the complaints made by affected recipients.

So, I, and the ICO’s Steve Eckersley, were wrong – powers to serve MPNs for spam texts do still exist, although it must be said that this was an exceptional case: most spam texts are irritating, rather than as callous and potentially distressing as these. And this is why the Ministry of Justice is, as I have previously discussed, consulting on lowering, or dropping altogether, the “harm threshold” for serving MPNs for serious PECR contraventions.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Does Simon Hughes really want to receive FOI complaints?

At an event on the evening of 26 November, held to celebrate (slightly early) the tenth anniversary of the Freedom of Information Act 2000 (FOIA), the Minister of State for Justice and Civil Liberties, Simon Hughes, appeared to offer to take on part of the Information Commissioner’s regulatory role.

The event, hosted at the RSA by the Commissioner himself, brought together a panel of FOIA luminaries consisting of Deputy Information Commissioner Graham Smith, the BBC’s Martin Rosenbaum, Scottish Information Commissioner Rosemary Agnew and Hughes himself. In response to a question from the floor about the considerable delays and obstructiveness by certain public authorities in dealing with FOIA requests, Hughes invited people to send him examples, so that he could start to compile data on compliance (of the sort already being compiled by Agnew’s office).

Astute eyebrows at the event (and possibly on the panel) were raised: dealing with miscreant public authorities is a role clearly assigned to the Information Commissioner. For the Minister to invite complaints seems to risk usurping that role. One wonders if he knows what he’s let himself in for.


Watching the detective

The ICO might be mostly powerless to take action against the operators of the Russian web site streaming unsecured web cams, but the non-domestic users of the web cams could be vulnerable to enforcement action

The Information Commissioner’s Office (ICO) warned yesterday of the dangers of failing to secure web cams which are connected to the internet. This was on the back of stories about a Russian-based web site which aggregates feeds from thousands of compromised cameras worldwide.

This site was drawn to my attention a few weeks ago, and, although I tweeted obliquely about it, I thought it best not to identify it because of the harm it could potentially cause. However, although most news outlets didn’t identify the site, the cat is now, as they say, out of the bag. No doubt this is why the ICO chose to issue sensible guidance on network security in its blog post.

I also noticed that the Information Commissioner himself, Christopher Graham, rightly pointed to the difficulties in shutting down the site, and the fact that it is users’ responsibility to secure their web cams:

It is not within my jurisdiction, it is not within the European Union, it is Russia.

I will do what I can but don’t wait for me to have sorted this out.

This is, of course, true, and domestic users of web cams would do well to note the advice. Moreover, this is just the latest of these aggregator sites to appear. But news reports suggested that some of the 500-odd (or was it 2000-odd?) feeds on the site from the UK were from cameras of businesses or other non-domestic users (I saw a screenshot, for instance, of a feed from a pizza takeaway). Those users, if their web cams are capturing images of identifiable individuals, are processing personal data in the role of a data controller. And they can’t claim the exemption in the Data Protection Act 1998 (DPA) that applies to processing for purely domestic purposes. They must, therefore, comply with the seventh data protection principle, which requires them to take appropriate measures to safeguard against unauthorised or unlawful processing of personal data. Allowing one’s web cam to be compromised and its feed streamed on a Russian website is a pretty good indication that one is not complying with the seventh principle. Serious contraventions of the obligation to comply with the data protection principles can, of course, lead to ICO enforcement action, such as monetary penalty notices, to a maximum of £500,000.

The ICO is not, therefore, completely powerless here. Arguably it should be (maybe it is?) looking at the feeds on the site to determine which are from non-domestic premises, and looking to take appropriate enforcement action against the users responsible. So to that extent, one is rather watching Mr Graham, to see if he can sort this out.


The voluntary data controller

One last post on #samaritansradar. I hope.

I am given to understand that Samaritans, having pulled their benighted app, have begun responding to the various legal notices people served on them under the Data Protection Act 1998 (specifically, these were notices under section 7 (subject access), section 10 (right to prevent processing likely to cause damage or distress) and section 12 (rights in relation to automated decision-taking)). I tweeted my section 12 notice, but I doubt I’ll get a response to that, because they’ve never engaged with me once on twitter or elsewhere.

However, I have been shown a response to a section 7 request (which I have permission to blog about), and it continues to raise questions about Samaritans’ handling of this matter (and indeed, their legal advice – which hasn’t been disclosed, or even really hinted at). The response, in relevant part, says

We are writing to acknowledge the subject access request that you sent to Samaritans via DM on 6 November 2014.  Samaritans has taken advice on this matter and believe that we are not a data controller of information passing through the Samaritans Radar app. However, in response to concerns that have been raised, we have agreed to voluntarily take on the obligations of a data controller in an attempt to facilitate requests made as far as we can. To this end, whilst a Subject Access Request made under the Data Protection Act can attract a £10 fee, we do not intend to charge any amount to provide information on this occasion.

So, Samaritans continue to deny being a data controller for #samaritansradar, although they continue merely to offer assertions rather than any legal analysis. But, notwithstanding their belief that they are not a controller, they are taking on the obligations of a data controller.

I think they need to be careful. A person who knowingly discloses personal data without the consent of the data controller potentially commits a criminal offence under section 55 DPA. One can’t just step in, grab personal data and start processing it, without acting in breach of the law. Unless one is a data controller.

And, in seriousness, this purported adoption of the role of “voluntary data controller” just bolsters the view that Samaritans have been data controllers from the start, for reasons laid out repeatedly on this blog and others. They may have acted as joint data controller with users of the app, but I simply cannot understand how they can claim not to have been determining the purposes for which and the manner in which personal data were processed. And if they were, they were a data controller.

 
