Douglas Carswell MP is a data controller.
It says so on the Information Commissioner’s register.
(I hope he remembers to renew the registration when it expires next week: it’s a criminal offence to process personal data as a data controller without a registration, unless you have an exemption.)
But, more directly, he is a data controller because, as an MP, he is a person who determines the purposes for which and the manner in which the personal data of his constituents is processed. Sensible guidance for MPs is provided by Parliament itself:
A Member is the data controller for all personal data that is handled by their office and they have overall responsibility for ensuring that this is done in accordance with the DPA.
I have already written recently raising some concerns about Carswell’s alleged handling of constituents’ personal data. But this week he decided to leave the Conservative Party, resign his seat, and seek re-election as a UKIP candidate. James Forsyth, in the Daily Mail, talks about the constituency knowledge Carswell will bring to UKIP, and reports that “one senior Ukip figure purrs: ‘The quality of Douglas’s data is amazing’”.
The second data protection principle requires that
Personal data shall be obtained only for one or more specified and lawful purposes, and shall not be further processed in any manner incompatible with that purpose or those purposes
A person’s political opinions are “sensitive personal data”, afforded even greater protection under the DPA. It is not difficult to understand the historical basis for this, nor, indeed, the current basis for its still being so. Data protection law is in part an expression of and development of rights which were recognised by the drafters of the Universal Declaration of Human Rights and European Convention on Human Rights. Oppression of people on the basis of their politics was and remains distressingly common.
If I gave my data to help the Tories and found it was being used to help UKIP I’d be livid
UPDATE: Paul Bernal has written a superb piece on the broader ethical issues engaged here.
The outgoing Ontario Information and Privacy Commissioner Ann Cavoukian, whose time in office has been hugely, and globally, influential (see in particular Privacy by Design) has co-written (with Christopher Wolf) an article strongly criticising the judgment of the Court of Justice of the European Union (CJEU) in the Google Spain case.
For anyone who has been in the wilderness for the last few weeks: in Google Spain the CJEU ruled that Google Spain, as a subsidiary of Google Inc. operating on Spanish territory, was covered by the obligations of the European Data Protection Directive 95/46/EC; that it was operating as an entity that processed personal data in the capacity of a data controller; and that it was accordingly required to consider applications from data subjects for removal of search returns. Thus, what is loosely called a “right to be forgotten” is seen already to exist in the current data protection regime.
Many have written on this landmark CJEU ruling (I commend in particular Dr David Erdos’s take, on the UK Constitutional Law Blog) and I am not here going to go into any great detail, but what I did take issue with in the Cavoukian and Wolf piece was the figurative comparison of Google with a public library:
A man walks into a library. He asks to see the librarian. He tells the librarian there is a book on the shelves of the library that contains truthful, historical information about his past conduct, but he says he is a changed man now and the book is no longer relevant. He insists that any reference in the library’s card catalog and electronic indexing system associating him with the book be removed, or he will go to the authorities…
…The government agent threatens to fine or jail the librarian if he does not comply with the man’s request to remove the reference to the unflattering book in the library’s indexing system.
Is this a scenario out of George Orwell’s Nineteen Eighty-Four? No, this is the logical extension of a recent ruling from Europe’s highest court
(I pause briefly to say that if I never see another reference to Orwell in the context of privacy debate I will die a happy man).
I’m fond of analogies, but Cavoukian and Wolf’s (or maybe it’s a metaphor?) is facile. I think it could more accurately read:
A man walks into a library. He sees that, once again, the library has chosen, because of how it organises its profit-making activities, to give great prominence to a book which contains information about his past conduct, which is no longer relevant, and which it is unfair to highlight. He asks them to give less prominence to it.
Cavoukian and Wolf accept that there should be a right to remove “illegal defamatory” content if someone posts it online, but feel that the issue of links to “unflattering, but accurate” information should be explored using “other solutions”. (I pause again to note that “unflattering” is an odd and loaded word to use here: Mr Gonzalez, in the Google Spain case, was concerned about out-of-date information about bankruptcy, and other people who might want to exercise a right to removal of links might be concerned by much worse than “unflattering” information).
I don’t disagree that other solutions should be explored to the issue of the persistence or reemergence of old information which data subjects reasonably no longer wish to be known, but people are entitled to use the laws which exist to pursue their aims, and the application by the CJEU of data protection law to the issues pleaded was, to an extent, uncontroversial (is Google a data controller? if it is, what are its obligations to respect a request to desist from processing?)
Cavoukian and Wolf criticise the CJEU for failing to provide sufficient instruction on how “the right to be forgotten” should be applied, and for failing to consider whether “online actors other than search engines have a duty to ‘scrub’ the Internet of unflattering yet truthful facts”, but a court can only consider the issues pleaded before it, and these weren’t. Where I do agree with them is in their criticism of the apparent failure by the CJEU, when giving effect to the privacy rights in Article 8 of the European Convention on Human Rights, and Articles 7 and 8 of the Charter of Fundamental Rights of the European Union, to consider adequately, if at all, the countervailing rights to freedom of expression in Article 10 of the former and Article 11 of the latter. In this respect, the prior Opinion of the Advocate General was perhaps to be preferred.
The key word in my replacement library analogy above is “chosen”. Google is not a passive and inert indexing system. Rather, it is a dynamic and commercially driven system which uses complex algorithms to determine which results appear against which search terms. It already exercises editorial control over results, and will remove some which it is satisfied are clearly unlawful or which constitute civil wrongs such as breach of copyright. Is it so wrong that (if it gives appropriate weight to the (sometimes) competing considerations of privacy and freedom of expression) it should be required to consider a request to remove unfair and outdated private information?
Mr Justice Cranston has suggested that there is a public interest factor when considering whether disclosure of personal data would be “fair” processing. I’m not sure that is right.
The first data protection principle (DPP1) in Schedule 1 of the Data Protection Act 1998 (DPA) says that personal data must be processed “fairly” (and lawfully). But what does “fairly” mean?
In an interesting recent case (AB v A Chief Constable [2014] EWHC 1965 (QB)) the High Court determined that, on the very specific facts, it would not be fair, in terms of DPP1, and common law legitimate expectation, for a Chief Constable to send a second, non-standard, reference to the new employer of a senior police officer who was subject to disciplinary investigation. (The judgment merits close reading – this was by no means a statement of general principle about police references). The reason it would not be fair was because the officer in question had tendered his resignation upon the sending of the initial, anodyne, reference, and the force had terminated misconduct proceedings:
He was thus in the position that for the Force to send the second reference would most likely leave him without employment and without the opportunity to refute the gross misconduct allegations. In these special circumstances it would be a breach of the Data Protection Act 1998 and undermine his legitimate expectations for the second reference to be sent [¶94]
Something in particular struck me about the judge’s analysis of DPP1, although, given the outcome, it was not determinative. He rejected a submission from the claimant officer that the duty of fairness in the DPP1 and the European Data Protection Directive was a duty to be fair primarily to the data subject. Rather, correctly identifying that the privacy rights in the Directive and the DPA are grounded in article 8 of the European Convention on Human Rights and in general principles of EU law, he held that
The rights to private and family life in Article 8 are subject to the countervailing public interests set out in Article 8(2). So it is here: assessing fairness involves a balancing of the interests of the data subject in non-disclosure against the public interest in disclosure [¶75]
I am not sure this is right. Recital 28 of the Directive says
Whereas any processing of personal data must be lawful and fair to the individuals concerned [emphasis added]
and recital 38 suggests that whether processing is “fair” is in large part dependent on whether the data subject is made aware of the processing and the circumstances under which it takes place. These recitals give way to the descriptions in Articles 10 and 11 which both talk about “fair processing in respect of the data subject” (again, emphasis added). Similarly Part II of Schedule One to the DPA provides interpretation to DPP1, and says that in determining whether personal data are processed fairly
regard is to be had to the method by which they are obtained, including in particular whether any person from whom they are obtained is deceived or misled as to the purpose or purposes for which they are to be processed
Admittedly this introduces “any person”, which could be someone other than the data subject, but more general considerations of public interest are absent. It is also notable that the Information Commissioner’s position in guidance seems predicated solely on the belief that it is the data subject’s interests that are engaged in an analysis of “fairness”. The guidance does concede that processing might cause some detriment to the individual without being unfair, but I do not think this is the same as taking into account a public interest in disclosure.
To the extent that a public interest test does manifest itself in DPP1, it is normally held to be in the conditions in Schedules 2 and 3. DPP1 says that, in addition to the obligation to process personal data fairly and lawfully, a condition in Schedule 2 (and, for sensitive personal data, Schedule 3) must be met. Many of these conditions contain tests as to whether the processing is “necessary”, and that “necessity test” constitutes a proportionality test, as described by Latham LJ in Corporate Officer of the House of Commons v The Information Commissioner & Ors [2008] EWHC 1084 (Admin):
‘necessary’…should reflect the meaning attributed to it by the European Court of Human Rights when justifying an interference with a recognised right, namely that there should be a pressing social need and that the interference was both proportionate as to means and fairly balanced as to ends
To import a public interest test into the word “fairly” in DPP1 seems to me to be a potentially radical step, especially when disclosures of personal data under the Freedom of Information Act 2000 (FOIA) are being considered. As I say – I doubt that this is correct, but I would welcome any contrary (or concurring) opinions.
(By the way, I at first thought there was a more fundamental error in the judgment: the judge found that a rule of law was engaged which ordinarily would have required the Chief Constable to send the second reference:
the public law duty of honesty and integrity would ordinarily have demanded that the Chief Constable send the Regulatory Body something more than the anodyne reference about the claimant [¶93]
If a rule of law necessitates disclosure of personal data, then the exemption at section 35 DPA removes the requirement to process that data fairly and lawfully. However, I think the answer lies in the use of the word “ordinarily”: in this instance the doctrine of legitimate expectation (which the claimant could rely upon) meant that the public law duty to send the second reference didn’t apply. So section 35 DPA wasn’t engaged.)
Nominal damages give rise to distress compensation under the Data Protection Act – AB v Ministry of Justice
An award of nominal DPA damages in the High Court.
Whether, or in what circumstances, compensation may be awarded to a claimant who shows a contravention by a data controller of any of the requirements of the Data Protection Act 1998 (DPA), is a much-debated issue. It is also, occasionally, litigated. One key aspect is when compensation for distress might be awarded.
Section 13 of the DPA provides, so far as is relevant here, that
(1) An individual who suffers damage by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that damage.
(2) An individual who suffers distress by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that distress if—
(a) the individual also suffers damage by reason of the contravention
The general interpretation of this has been that compensation for distress, in the absence of pecuniary damage, is not available. The leading case on this is Johnson v The Medical Defence Union Ltd (2) [2006] EWHC 321 (Ch) and, on appeal, Johnson v Medical Defence Union [2007] EWCA Civ 262, with Buxton LJ saying in the latter:
section 13 distress damages are only available if damage in the sense of pecuniary loss has been suffered
However, in allowing an appeal in Murray v Big Pictures (UK) Ltd [2008] EWCA Civ 446, and directing that the case go to trial, the Court of Appeal was prepared to consider a different view:
It seems to us to be at least arguable that the judge [in the first instance] has construed ‘damage’ too narrowly, having regard to the fact that the purpose of the Act was to enact the provisions of the relevant Directive
But that case was ultimately settled before trial, and the issue left undecided.
Clearly, the decision in Johnson is potentially controversial, especially in cases (of which Johnson was not one) where the UK’s obligations under the European Data Protection Directive, and data subjects’ associated rights under the European Convention on Human Rights and the Charter of Fundamental Rights of the European Union, are taken into account. This much was recognised by Tugendhat J, in giving permission to the applicants in Vidal-Hall & Ors v Google Inc [2014] EWHC 13 (QB) to serve on Google Inc out of the jurisdiction. He noted (¶83-104) academic statements on the issue, as well as the European Commission’s view that the UK DPA wrongly restricts “[t]he right to compensation for moral damage when personal information is used inappropriately”, and said
This is a controversial question of law in a developing area, and it is desirable that the facts should be found. It would therefore be the better course in the present case that I should not decide this question on this application.
I shall therefore not decide it. However, in case it is of any assistance in the future, my preliminary view of the question is that Mr Tomlinson’s submissions are to be preferred, and so that damage in s.13 does include non-pecuniary damage
This is a fascinating point, and detailed judicial consideration of it would be welcome (it may also be at issue in the impending case of Steinmetz v Global Witness Ltd) but, in the meantime, a question exists as to whether nominal pecuniary damage opens the door to awards for distress. In Johnson, the cost of a £10.50 breakfast had opened the door, but this was actual (if minor) damage. Last year, the Court of Appeal avoided having to decide the issue when the defendant conceded the point in Halliday v Creation Consumer Finance Ltd (CCF) [2013] EWCA Civ 333 (about which I blogged last year). However, in a very recent judgment, AB v Ministry of Justice [2014] EWHC 1847 (QB), which takes some wading through, Mr Justice Baker does appear to have proceeded on the basis that nominal damages do give rise to distress compensation.
The case involves an (anonymous) partner in a firm of solicitors who, as a result of events involving the coroner following his wife’s tragic death, made a series of subject access requests (under the provisions of section 7 DPA). The Ministry of Justice (MoJ) did not, it seems, necessarily handle these well, nor in accordance with their obligations under the DPA, and when it came to remedying these contraventions (which consisted of delayed responses) the judge awarded nominal damages of £1.00, before moving on to award £2250 for distress caused by the delays.
What is not clear from the judgment is to what extent the judge considered the MoJ’s submission that compensation for distress was only available if an individual has also suffered damage. The answer may lie in the fact that, although he awarded nominal damages, the judge accepted that AB had suffered (actual) damage but had “not sought to quantify his time or expense”. Query, therefore, whether this is a case of purely nominal damage.
One hopes that Vidal-Hall and Global Witness provide the occasion to determine these matters. One notes, however, the vigour with which both cases are being litigated by the parties: it may be some time before the issue is settled once and for all.
Does data protection law prevent the disclosure under the FOI Act of the identities of prisoners who have absconded?
The Mail reported recently that the Ministry of Justice (MoJ) had refused to disclose, in response to a request made under the Freedom of Information Act 2000 (FOIA), a list of prisoners who have absconded from open prisons. The MoJ are reported to have claimed that
under Freedom of Information laws, there is a blanket ban on releasing the criminals’ identities because it is their own ‘personal data’
but the Justice Secretary Chris Grayling was reported to be
furious with the decision, which was taken without his knowledge. He is now intending to over-rule his own department and publish a list of all on-the-run criminals within days
and sure enough a few days later the Mail was able to report, in its usual style, the names of the majority of the prisoners after Grayling
intervened to end the ‘nonsense’ of their names being kept secret…[and stated] that data protection laws will not be used to protect them, arguing: “They are wanted men and should be treated as such. That’s why on my watch we will not hold back their names, unless the police ask us not to for operational reasons”
Regarding the initial article, and in fairness to the MoJ, the Mail publishes neither the FOI request nor the response itself, so it is difficult to know whether the latter was more nuanced than the article suggests (I suspect it was). But is it correct that disclosure of this information was prevented by data protection law?
More information was given in a follow-up piece on the Press Gazette website which cited a spokeswoman from the MoJ’s National Offender Management Service’s Security Group:
She said the department was “not obliged” to provide information that would contravene the Data Protection Act, adding, “for example, if disclosure is unfair”, which also meant that it did not have to consider “whether or not it would be in the public interest” to release the information
This is technically correct: FOIA provides an exemption from disclosure if the information requested constitutes personal data and disclosure would be in contravention of the Data Protection Act 1998 (DPA); there is no “public interest test” under this exemption, and whether disclosure is unfair is a key question. The reference to “fairness” relates to the first data protection principle in Schedule One to the DPA. This provides that
Personal data shall be processed fairly and lawfully and, in particular, shall not be processed unless—
(a) at least one of the conditions in Schedule 2 is met, and
(b) in the case of sensitive personal data, at least one of the conditions in Schedule 3 is also met
As the Information Commissioner’s Office says (page 13 of this guidance), “fairness can be a difficult concept to define”, and assessing it in a FOIA context will involve considering: whether the information is “sensitive personal data” (it is in this instance – section 2 of the DPA explains in terms that data about prison sentences is included in this category); what the possible consequences of disclosure are for the individual; what the individual’s reasonable expectations are; and the balance of the interests of the public against the rights of the individual (this last example shows that there is, in effect if not in actuality, a kind of public interest test for the FOIA personal data exemption).
With this in mind, would it really have been “unfair” to disclose the identities of on-the-run prisoners? The consequences of disclosure might be recapture (although I concede there might also be exposure to risk of attack by members of the public), but does an absconder really have a reasonable expectation that their identity will not be disclosed? I would argue they have quite the opposite – a reasonable expectation (even if they don’t desire it) that their identity will be disclosed. And the balance of public interest against the absconders’ rights surely tips in favour of the former – society has a compelling interest in recapturing absconders.
But this doesn’t quite take us to the point of permitting disclosure of this information under FOIA. If we look back to the wording of the first data protection principle we note that a condition in both Schedule Two and (this being sensitive personal data) Schedule Three must be met. And here we note that most of those conditions require that the processing (and FOIA disclosure would be a form of processing) must be “necessary”. The particular conditions which seem to me most to be engaged are the identically worded 5(a) in Schedule Two and 7(1)(a) in Schedule Three:
The processing is necessary for the administration of justice
What “necessary” means, in the context of a balance between FOIA access rights and the privacy rights of the individual, has been given much judicial analysis, notably in the MPs’ expenses case (Corporate Officer of the House of Commons v The Information Commissioner & Ors [2008] EWHC 1084 (Admin)), where it was said that “necessary”
should reflect the meaning attributed to it by the European Court of Human Rights when justifying an interference with a recognised right, namely that there should be a pressing social need and that the interference was both proportionate as to means and fairly balanced as to ends
In this way “necessary” in the DPA accords with the test in Article 8 of the European Convention on Human Rights, which provides that any interference with the right to respect for private and family life etc. must be
necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others [emphasis added]
Deciding whether there was a “pressing social need” to disclose, under FOIA, the absconders’ identities to the Mail was not straightforward, and no doubt the civil servants at MoJ erred on the side of caution. I can imagine them thinking that, if it were necessary in a democratic society to publish these names, they would already be published as routine, and that the fact that they hadn’t been meant that it would not be proportionate to disclose under FOIA (I happen to think that would be wrong, but that’s not strictly relevant). But this is an interesting case in which the subsequent intervention by the Justice Secretary created the justification which perhaps did not exist when the FOIA request was being handled: after all, if the Justice Secretary feels so strongly about publishing the names, then doing so must be necessary in the interests of public safety etc.
As it was, five of the names (out of eighteen) were not disclosed, no doubt for the police operational reasons that were alluded to by Grayling. And this, of course, points to the most likely, and the strongest, exemption from disclosure of this sort of information – that relating to likely prejudice to law enforcement (section 31 FOIA).
p.s. I am given to understand that the Information Commissioner’s Office may be contacting the MoJ to discuss this issue.
Should Chris Packham’s admirable attempts to expose the cruelties of hunting in Malta be restrained by data protection law? And who is protected by the data protection exemption for journalism?
I tend sometimes to lack conviction, but one thing I am pretty clear about is that I am not on the side of people who indiscriminately shoot millions of birds, and whose spokesman tries to attack someone by mocking their well-documented mental health problems. So, when I hear that the FKNK, the Maltese “Federation for Hunting and Conservation”, has
presented a judicial protest against the [Maltese] Commissioner of Police and the Commissioner for Data Protection, for allegedly not intervening in “contemplated” or possible breaches of privacy rules
with the claim being that they have failed to take action to prevent
BBC Springwatch presenter Chris Packham [from] violating hunters’ privacy by “planning to enter hunters’ private property” and by posting his video documentary on YouTube, which would involve filming them without their consent
my first thought is that this is an outrageous attempt to manipulate European privacy and data protection laws to try to prevent legitimate scrutiny of activities which sections of society find offensive and unacceptable. It’s my first thought, and my lasting one, but it does throw some interesting light on how such laws can potentially be used to advance or support causes which might not be morally or ethically attractive. (Thus it was that, in 2009, a former BNP member was prosecuted under section 55 of the UK Data Protection Act 1998 (DPA 1998) for publishing a list of party members on the internet. Those members, however reprehensible their views or actions, had had their sensitive personal data unlawfully processed, and attracted the protection of the DPA (although the derisory £200 fine the offender received barely served as a deterrent).)
I do not profess to being an expert in Maltese data protection law but, as a member state of the European Union, Malta was obliged to implement Directive 95/46/EC on the Protection of Individuals with regard to the Processing of Personal Data (which it did in its Data Protection Act of 2001). The Directive is the bedrock of all European data protection law, generally containing minimum standards which member states must implement in domestic law, but often allowing them to legislate beyond those minimum standards.
It may well be that the activities of Chris Packham et al do engage Maltese data protection law. In fact, if, for instance, film footage or other information which identifies individuals is recorded and broadcast in other countries in the European Union, it would be likely to constitute an act of “processing” under Article 2(b) of the Directive which would engage data protection law in whichever member state it was processed.
Data protection law at European level has a scope whose potential breadth has been described as “breath-taking”. “Personal data” is “any information relating to an identified or identifiable natural person” (that is “one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity”), and “processing” encompasses “any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction”.
However, the broad scope does not necessarily mean broad prohibitions on activities involving processing. Personal data must be processed “fairly and lawfully”, and can (broadly) be processed without the data subject’s consent in circumstances where there is a legal obligation to do so, or where it is necessary in the public interest, or where the legitimate interests pursued by the person processing it, or by a third party, outweigh the interests or fundamental rights and freedoms of the data subject. These legitimising conditions are implemented in the Maltese Data Protection Act 2001 (at section 9), so it can be seen that the FKNK’s claim that Packham requires the hunters’ consent to film might not have legs.
Moreover, Article 9 of the Directive, transposed in part at section 6 of the 2001 Maltese Act, provides for an exemption from most of the general data protection obligations where the processing is for journalistic purposes, which would almost certainly be engaged by Packham’s activities. Whether any other Maltese laws might apply is, I’m afraid, well outside my area of knowledge.
But what about activists who might not normally operate under the banner of “journalism”? What if Packham were, rather than a BBC journalist/presenter, “only” a naturalist? Would he be able to claim the journalistic data protection exemption?
Some of these sorts of issues are currently edging towards trial in litigation brought in the UK, under the DPA 1998, by a mining corporation (or, in its own words, a “diversified natural resources business”), BSG Resources, against Global Witness, an NGO one of whose stated goals is to “expose the corrupt exploitation of natural resources and international trade systems”. BSGR’s claims are several, but are all made under the DPA 1998, and derive from the fact they have sought to make subject access requests to Global Witness to know what personal data of the BSGR claimants is being processed, for what purposes and to whom it is being or may be disclosed. Notably, BSGR have chosen to upload their grounds of claim for all to see. For more background on this see the ever-excellent Panopticon blog, and this article in The Economist.
This strikes me as a potentially hugely significant case, firstly because it illustrates how data protection is increasingly being used to litigate matters more traditionally seen as being in the area of defamation law, or the tort of misuse of private information, but secondly because it goes to the heart of questions about what journalism is, who journalists are and what legal protection (and obligations) those who don’t fit the traditional model/definition of journalism have or can claim.
I plan to blog in more detail on this case in due course, but for the time being I want to make an observation. Those who know me will not have too much trouble guessing on whose side my sympathies would tend to fall in the BSGR/Global Witness litigation, but I am not so sure how I would feel about extending journalism privileges to, say, an extremist group who were researching the activities of their opponents with a view to publishing those opponents’ (sensitive) personal data on the internet. If society wishes to extend the scope of protection traditionally afforded to journalists to political activists, or citizen bloggers, or tweeters, it needs to be very careful that it understands the implications of doing so. Freedom of expression and privacy rights coexist in a complex relationship, which ideally should be an evenly balanced one. Restricting the scope of data protection law, by extending the scope of the exemption for journalistic activities, could upset that balance.
As I’m keen always to take a balanced view of important privacy issues, and not descend into the sort of paranoid raving which always defines, say, the state as the enemy, capable of almost anything, I sometimes think I end up being a bit naive, or at least having naive moments.
So, when outgoing Chair of Ofcom Dame Colette Bowe recently gave evidence to the House of Lords Select Committee on Communications, and said about consumers that
their smart TV may well have a camera and a microphone embedded in it there in their living room. What is that smart TV doing? Do people realise that this is a two-way street?
I thought for a moment “Oh come on, don’t be so scaremongering”. Sure, we saw the stories about Smart TVs and cookies, which is certainly an important privacy issue, but the idea that someone would use your TV to spy on you…?!
And then, of course, I quickly remembered – with a feeling of nausea – that that is exactly the sort of thing that GCHQ are alleged to have done, by jumping on the unencrypted web cam streams of Yahoo users, as part of the Optic Nerve program. And each time I remember this, it makes me want to scream “THEY WERE INDISCRIMINATELY SPYING ON PEOPLE…IN THEIR HOMES, IN THEIR BEDROOMS, FOR ****’S SAKE!”
And they were doing it just because they could. Because they’d noticed a way – a vulnerability – and taken advantage of it to slurp masses of intensely private data, just in case it might prove useful in the future.
The intrusion, the prurience, the violation do indeed make me feel like raving against the state and its agents who, either through direct approval, or tacit acceptance, or negligence, allowed this to happen. Although *balance alert* GCHQ do, of course, assure us that “GCHQ insists all of its activities are necessary, proportionate, and in accordance with UK law”. So that’s OK. And yes, they really did call it “proportionate”.
I know the web cam grabbing was by no means the only such intrusion, but for me it exemplifies the “something” which went wrong, at some point, which led to this. I don’t know what that something was, or even how to fix it, and I’ve never used a web cam, so have no direct interest, but I will closely watch the progress of Simon Davies’ request for the Attorney General to refer the matter to the police.
On 28 February the Information Commissioner’s Office (ICO) served a Monetary Penalty Notice (MPN), pursuant to powers under section 55A of the Data Protection Act 1998 (DPA), on the British Pregnancy Advisory Service, in the sum of £200,000 (which would be reduced to £160,000 if promptly paid). The ICO’s news release explains
An ICO investigation found the charity didn’t realise its own website was storing the names, address, date of birth and telephone number of people who asked for a call back for advice on pregnancy issues. The personal data wasn’t stored securely and a vulnerability in the website’s code allowed [a] hacker to access the system and locate the information.
The hacker threatened to publish the names of the individuals whose details he had accessed, though that was prevented after the information was recovered by the police following an injunction obtained by the BPAS
The back story to this is that the hacker in question was subsequently jailed for 32 months for offences under the Computer Misuse Act 1990 (no doubt the prosecutors recognised that the criminal sanctions under the DPA were too weedy to bother with).
The section 55A DPA powers are triggered where there has been a qualifying serious contravention by a data controller of its obligations under section 4(4) to comply with the data protection principles in Schedule One. The most pertinent of these in the instant case (and in the large majority of ICO MPNs) was the seventh
Appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data
which extends to the need to, when contracting with someone to process data on your behalf, require them to take equivalent security measures and evidence this contractual provision in writing. As the ICO’s MPN says
BPAS failed to take appropriate technical and organisational measures against the unauthorised processing of personal data stored on the BPAS website such as having a detailed specification about the parameters of the CMS to ensure that either the website did not store any personal data or alternatively, that effective and appropriate security measures were applied such as storing administrative passwords securely; ensuring stated standards of communication confidentiality were met; carrying out appropriate security testing on the website which would have alerted them to the vulnerabilities that were present or ensuring that the underlying software supporting the website was kept up to date
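One of the measures the MPN singles out – “storing administrative passwords securely” – maps onto a standard engineering practice: never keep the password itself, only a salted, iterated hash of it. The MPN does not describe BPAS’s CMS in this detail, so the following is purely an illustrative sketch of that general technique, in Python, using only the standard library:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest) using salted, iterated PBKDF2.
    The plaintext password is never stored."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time,
    to avoid leaking information via timing side channels."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong password", salt, digest)
```

A database compromised in the way the BPAS site was would then yield only salts and digests, not credentials usable to unlock further systems.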
(Interestingly, the MPN also makes clear that there was a contravention of the fifth principle – which provides that “personal data processed for any purpose or purposes shall not be kept for longer than is necessary for that purpose or those purposes”. This was because “the call back details were kept for five years longer than was necessary for [BPAS’s] purposes”).
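The fifth-principle contravention – call-back details kept five years longer than necessary – is, in engineering terms, simply the absence of a routine retention purge. As a hedged sketch (the field names and the retention period here are hypothetical, not taken from the MPN), the kind of scheduled job that would have avoided it might look like:

```python
from datetime import datetime, timedelta

# Hypothetical retention period; the actual necessary period would be
# determined by the data controller's stated purposes.
RETENTION = timedelta(days=2 * 365)

def purge_expired(records: list[dict], now: datetime) -> list[dict]:
    """Keep only call-back records still within the retention period;
    anything older should be deleted, per the fifth principle."""
    return [r for r in records if now - r["collected"] <= RETENTION]

records = [
    {"name": "A", "collected": datetime(2009, 1, 1)},   # long expired
    {"name": "B", "collected": datetime(2013, 12, 1)},  # still within period
]
kept = purge_expired(records, now=datetime(2014, 3, 1))
assert [r["name"] for r in kept] == ["B"]
```

Run periodically, such a job would also have narrowed the pool of data the hacker was able to take in the first place.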
The original crime was a particularly nasty one – the offender appears to have had an ideological, or at least personal, opposition to abortion in general, and the apparently very real threat to publish people’s details, given to BPAS in highly sensitive circumstances, is probably what elevated the BPAS contravention to a level which justifies such a high sum being served on a charity. However, BPAS have announced that they intend to appeal, and their press release about this is interesting. It suggests that the appeal will be not about the issuing of the MPN, but about its amount (section 55B(5) DPA permits appeals on either basis):
We accept that no hacker should have been able to steal our data but we are horrified by the scale of the fine
but it goes on to make the valid point that, by serving an MPN of this large amount, the ICO potentially gives the offender something that he wanted – to harm the charity:
It is appalling that a hacker who acted on the basis of his opposition to abortion should see his actions rewarded in this way
This, though, seems to be a matter of ethics, rather than law, but it will be interesting to note if the argument makes it in some form into the grounds of appeal. More likely, if the challenge is to be made solely on the amount (under section 55B(5)(b)), focus will fall on to the suggestion that
This fine seems out of proportion when compared with those levelled against other organisations who were not themselves the victims of a crime
Of course, by a circular argument, the “fine” would not have been served, if the data controller had not, by its omissions, permitted itself to be a victim of the crime.
An extra frisson is caused when one considers the compelling argument by the solicitor-advocates for Scottish Borders Council, who successfully helped the latter win an appeal of an MPN last year. Although their argument – that MPNs were more correctly to be considered criminal, as opposed to civil, penalties – did not fall to be decided by the First-tier Tribunal, it did observe that
One general question hovering over this appeal is whether proceedings in respect of monetary penalties are “criminal” in nature. There are certainly enough indications, not least in the title of the amending statute, [the Criminal Justice and Immigration Act 2008] to make an arguable case for them being so…We have concluded that there is no need for us to make any decision or pronouncement in the abstract; but there is a need for us to be vigilant to ensure that the proceedings are fair
If this line of argument continues to be developed – that recipients of MPNs are entitled to the equivalent right to a fair hearing, under Article 6 of the European Convention on Human Rights, afforded to those accused of crimes – then MPNs, and the circumstances and manner in which they are served, may be subject to a much greater level of scrutiny, and the cash-strapped ICO may find itself under even more pressure from legal challenges.
These issues may be aired, and possibly determined, in the forthcoming appeal to the Upper Tribunal concerning the MPN served on Christopher Niebel, which was subsequently overturned by the First-tier Tribunal.
Doyen of information rights bloggers, Tim Turner, has written in customary analytic detail on how the current NHS care.data leafleting campaign was not necessitated by data protection law, and on how, despite some indications to the contrary, GPs will not be in the Information Commissioner’s firing line if they fail adequately to inform patients about what will be happening to their medical data.
He’s right, of course: where a data controller is subject to a legal obligation to disclose personal data (other than under a contract) then it is not obliged, pace the otherwise very informative blogpost by the Information Commissioner’s Dawn Monaghan, to give data subjects a privacy, or fair processing notice.
(In passing, and in an attempt to outnerd the unoutnerdable, I would point out that Tim omits that, by virtue of The Data Protection (Conditions under Paragraph 3 of Part II of Schedule 1) Order 2000, if a data subject properly requests a privacy notice in circumstances where a data controller is subject to a legal obligation to disclose personal data (other than under a contract) and would, thus, otherwise not be required to issue one, the data controller must comply2.)
Tim says, though
The leaflet drop is no way to inform people about such a significant step, but I don’t think it is required
That appears to be true under data protection law, but, under broader obligations imposed on the relevant authorities by Article 8 of the European Convention on Human Rights (ECHR), as incorporated in domestic law by the Human Rights Act 1998, it might not be so. (And here, unlike with data protection law, we don’t have to consider the rigid controller/processor dichotomy in order to decide who the relevant, and liable, public authority is: I would suggest that NHS England (as the “owner of the care.data programme”, in Dawn Monaghan’s words) seems the obvious candidate, though GPs might also be caught.)
In 1997 the European Court of Human Rights addressed the very-long-standing concept of the confidentiality of doctor-patient relations, in the context of personal medical data, in Z v Finland (1997) 25 EHRR 371, and said
the Court will take into account that the protection of personal data, not least medical data, is of fundamental importance to a person’s enjoyment of his or her right to respect for private and family life as guaranteed by Article 8 of the Convention (art. 8). Respecting the confidentiality of health data is a vital principle in the legal systems of all the Contracting Parties to the Convention. It is crucial not only to respect the sense of privacy of a patient but also to preserve his or her confidence in the medical profession and in the health services in general…Without such protection, those in need of medical assistance may be deterred from revealing such information of a personal and intimate nature as may be necessary in order to receive appropriate treatment and, even, from seeking such assistance, thereby endangering their own health and, in the case of transmissible diseases, that of the community
This, I think, nicely encapsulates why so many good and deep-thinking people have fundamental concerns about care.data.
Now, I am not a lawyer, let alone a human rights lawyer, but it does occur to me that a failure to inform patients about what would be happening with their confidential medical records when GPs were required to upload them, and a failure to allow them to opt out, would have potentially infringed patients’ Article 8 rights. We should not forget that, initially, there was no intention to inform patients at all (there had been no attempt to inform patients about the similar upload of hospital medical data, which has been going on for over twenty years). It is, surely, possible therefore, that NHS England is not just “helping” GPs to inform patients without having any responsibility to do so (as Dawn Monaghan suggests), but that it recognises its potential vulnerability to an Article 8 challenge, and is trying to avoid or mitigate this. Whether the leaflets themselves, and the campaign to deliver them, are adequate to achieve this aim is another matter. As has been noted, the leaflet contains no opt-out form, and there seem to be numerous examples of people (often vulnerable people, for instance in care homes, or refuges) who will have little or no chance of receiving a copy.
At the launch of the tireless MedConfidential campaign last year, Shami Chakrabarti, of Liberty, spoke passionately about the potential human rights vulnerabilities of the care.data programme. Notifying patients of what is proposed might not have been necessary under data protection law, but it is quite possible that the ECHR aspect of doing so was one of the things on which the Health and Social Care Information Centre (HSCIC) has been legally advised. Someone made an FOI request for this advice last year, and it is notable that HSCIC seem never to have completed their response to the request.
1I make no apologies for linking to one of Larkin’s most beautiful, but typically bleak and dystopian, pieces of prose, but I would add that it finishes “…These have I tried to remind of the excitement of jazz, and tell where it may still be found.”
2Unless the data controller does not have sufficient information about the individual in order readily to determine whether he is processing personal data about that individual, in which case the data controller shall send to the individual a written notice stating that he cannot provide the requisite information because of his inability to make that determination, and explaining the reasons for that inability
ICO confirms hashtag campaign prior to conviction was unlikely to be compliant with the Data Protection Act. Other forces to be advised via ACPO of issues raised by the case
Over the Christmas period Staffordshire Police ran a social media campaign, in which drivers arrested and charged with drink-driving offences were named on twitter with the “hashtag” #drinkdriversnamedontwitter. It seemed to me, and others, that this practice arguably suggested guilt prior to any trial or conviction. As I said at the time
If someone has merely been charged with an offence, it is contrary to the ancient and fundamental presumption of innocence to shame them for that fact. Indeed, I struggle to understand how it doesn’t constitute contempt of court to do so, or to suggest that someone who has not been convicted of drink-driving is a drink driver
and I asked the Information Commissioner’s Office (ICO)
whether the practice is compliant with Staffordshire Police’s obligations under the first data protection principle (Schedule 1 of the Data Protection Act 1998 (DPA)) to process personal data fairly and lawfully
The ICO have now issued a statement. Their spokesman says
The ICO spoke to Staffordshire Police following its #DrinkDriversNamedOnTwitter campaign. Our concern was that naming people who have only been charged alongside the label ‘drink driver’ strongly implies a presumption of guilt for the offence, which we felt wouldn’t fit with the Data Protection Act’s fair and lawful processing principle.
We have received reassurances from Staffordshire Police that the hashtag will no longer be used in this way, and are happy with the procedures they have in place. As a result, we will be taking no further action. We’ve also spoken with ACPO about making other police forces aware of the issues raised by this case.
I think this is a very satisfactory result. The ICO have, as I said previously, shown that they are increasingly willing to investigate contraventions of the DPA not limited to security breaches. No one would defend drink driving (and it was not the naming itself that was objectionable, but the tweeting of the names in conjunction with the hashtag) but the police should not be free to indicate or imply guilt prior to conviction – that is quite simply contrary to the rule of law.
What I still think is disappointing though, is that after an initial prompt response from the Attorney General’s twitter account (which missed my point), there has been no word from them as to whether the practice was potentially prejudicial to any forthcoming trial. Maybe they’d like to rethink this, in light of the statement from the ICO?