Category Archives: human rights

FOIA’s not the only route

News emerges of a potential judicial review attempt to force disclosure of government Brexit papers – not under FOIA, but under common law and human rights routes to information

More than three years ago the Supreme Court handed down judgment in a long-running piece of litigation under the Freedom of Information Act 2000 (FOIA). Journalist Dominic Kennedy had attempted to get disclosure from the Charity Commission of information relating to inquiries into George Galloway’s “Mariam Appeal”. The Commission said, in effect, that the absolute exemption to disclosure at section 32(2) of FOIA was the end of the story, while Kennedy argued that Article 10 of the European Convention on Human Rights imposed a positive obligation of disclosure on public authorities, particularly when the requester was a “public watchdog” like the press, and that s32(2) should be read down accordingly to require disclosure in the circumstances (I paraphrase). In his leading opinion Lord Mance gave this stirring introduction:

Information is the key to sound decision-making, to accountability and development; it underpins democracy and assists in combatting poverty, oppression, corruption, prejudice and inefficiency. Administrators, judges, arbitrators, and persons conducting inquiries and investigations depend upon it; likewise the press, NGOs and individuals concerned to report on issues of public interest. Unwillingness to disclose information may arise through habits of secrecy or reasons of self-protection. But information can be genuinely private, confidential or sensitive, and these interests merit respect in their own right and, in the case of those who depend on information to fulfil their functions, because this may not otherwise be forthcoming. These competing considerations, and the balance between them, lie behind the issues on this appeal.

What was most interesting about the judgment in Kennedy, and, again, I disrespectfully heavily paraphrase, was that the Supreme Court basically said (as it has been wont to do in recent years) – “why harp on about your rights at European law, don’t you realise that our dear old domestic friend the common law gives you similar rights?”

the route by which [Mr Kennedy] may, after an appropriate balancing exercise, be entitled to disclosure, is not under or by virtue of some process of remodelling of section 32, but is under the Charities Act construed in the light of common law principles and/or in the light of article 10 of the Human Rights Convention, if and so far as that article may be engaged

This greatly excited those in the information rights field at the time, but since then little of prominence has advanced the proposition that FOIA rights are not the only route [Ed. there’s a great/awful pun in there somewhere]. The proposition did, though, get a positive airing in R (Privacy International) v HMRC [2014] EWHC 1475 (Admin) (on which see Panopticon post here).

Yesterday (12 October) barrister Jolyon Maugham announced that his Good Law Project was seeking donations towards a judicial review application, should the government refuse to publish information and reports comparing the predicted economic harm of Brexit with the predicted economic benefits of alternative free trade agreements. Keen followers of information rights litigation will note that Tim Pitt-Payne and Robin Hopkins are instructed: the potential respondents should quake in their boots.

Well worth watching this, and well worth – in my opinion – donating towards the cause.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Brexit, Freedom of Information, human rights, Open Justice

On some sandy beach

[EDITED 25.07.17 to include references to “sandpits” in the report of the Deepmind Health Independent Review Panel]

What lies behind the Information Commissioner’s recent reference to “sandbox regulation”?

The government minister with responsibility for data protection, Matt Hancock, recently spoke to the Leverhulme Centre. He touched on data protection:

a new Data Protection Bill in this Parliamentary Session…will bring the laws up to date for the modern age, introduce new safeguards for citizens, stronger penalties for infringement, and important new features like the right to be forgotten. It will bring the EU’s GDPR and Law Enforcement Directive into UK law, ensuring we are prepared for Brexit.

All pretty standard stuff (let’s ignore the point that the “right to be forgotten”, such as it is, already exists under current law – a big clue being that the landmark case was heard by the CJEU in 2014). But Hancock went on to cite with approval some recent words of the Information Commissioner, Elizabeth Denham:

I think the ICO’s proposal of a data regulatory “sandbox” approach is very impressive and forward looking. It works in financial regulation and I look forward to seeing it in action here.

This refers to Denham’s recent speech on “Promoting privacy with innovation within the law”, in which she said

We are…looking at how we might be able to engage more deeply with companies as they seek to implement privacy by design…How we can contribute to a “safe space” by building a sandbox where companies can test their ideas, services and business models. How we can better recognise the circular rather than linear nature of the design process.

I thought this was interesting. “Sandbox regulation” in the financial services sector involves an application to the Financial Conduct Authority (FCA) for the testing of “innovative” products that don’t necessarily fit into existing regulatory frameworks – the FCA will even, where necessary, waive rules and undertake not to take enforcement action.

That this model works for financial services does not, though, necessarily mean it would work when it comes to regulation of laws, such as data protection laws, which give effect to fundamental rights. When I made enquiries to the Information Commissioner’s Office (ICO) for further guidance on what Denham intends, I was told that they “don’t have anything to add to what [she’s] already said about engaging with companies to help implement privacy by design”.

The recent lack of enforcement action by the ICO against the Royal Free NHS Trust regarding its deal with Google Deepmind raised eyebrows in some circles: if the unlawful processing of 1.6 million health records (by their nature sensitive personal data) doesn’t merit formal enforcement, then does anything?

Was that a form of “sandbox regulation”? Presumably not, as it doesn’t appear that the ICO was aware of the arrangement before it took place. But if, as it seems to me, such regulation may involve a light-touch approach where innovation is involved, I really hope that the views and wishes of data subjects are not ignored. If organisations are going to play in the sand with our personal data, we should at the very least know about it.

**EDIT: I have had my attention drawn to references to “sandpits” in the Annual Report of the Deepmind Health Independent Review Panel:

We think it would be helpful if there was a space, similar to the ‘sandpits’ established by the Research Councils, which would allow regulators, the Department of Health and tech providers to discuss these issues at an early stage of product development. The protection of data during testing is an issue that should be discussed in a similar collaborative forum. We believe that there must be a mechanism that allows effective testing without compromising confidential patient information.

It would seem a bit of a coincidence that this report should be published at around the same time as Denham and Hancock were making their speeches – and I would argue that this only bolsters the case for more transparency from the ICO about how this type of collaborative regulation will take place.

And I notice that the Review Panel say nothing about involving data subjects in “product development”. Until “innovators” understand that data subjects are the key stakeholder in this, I don’t hold out much hope for the proper protection of rights.**

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

1 Comment

Filed under Data Protection, enforcement, human rights, Information Commissioner

Data Protection (and other) compensation awarded against Ombudsman

I’ve been helpfully referred to a rather remarkable judgment of the Leeds County Court, in a claim against the Local Government Ombudsman for, variously, declaratory relief and damages arising from discrimination under the Equality Act 2010 and breach of the Data Protection Act 1998 (DPA). The claim was resoundingly successful, and led to a total award of £12,500, £2,500 of which was aggravated damages because of the respondent’s conduct of the trial.

The judgment has been uploaded to Dropbox here.

I will leave readers to draw their own conclusions about the actions of the Ombudsman, but it’s worth noting, when one reads the trenchant criticism by District Judge Geddes, that one of the office’s strategic objectives is to

deliver effective redress through impartial, rigorous and proportionate investigations

One can only conclude that, in this case at least, this objective was very far from met.

Of particular relevance for this blog, though, was the award of £2,500 for distress arising from failure to prepare and keep an accurate case file recording the disability of the claimant and her daughter. This, held the District Judge, was a contravention of the Ombudsman’s obligations under the DPA. As is now relatively well known, the DPA’s original drafting precluded compensation for distress alone (in the absence of tangible – e.g. financial – damage), but the Court of Appeal, in Vidal Hall & ors v Google ([2015] EWCA Civ 311), held that this was contrary to the provisions of the Charter of Fundamental Rights of the European Union and that, accordingly, there was a right under the DPA to claim compensation for “pure” distress. The award in question here was of “Vidal Hall” compensation, with the judge saying there was

no doubt in my mind that the data breaches have caused distress to the claimant in their own rights as well as as a result of the consequences that flowed.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under 7th principle, accuracy, Data Protection, human rights, local government

Anti-EU campaign database – in contravention of data protection laws?

The politics.co.uk site reports that an anti-EU umbrella campaign called Leave.EU (or is it theknow.eu?) has been written to by the Information Commissioner’s Office (ICO) after the campaign allegedly sent unsolicited emails to people who appear to have been “signed up” by friends or family. The campaign’s bank-roller, UKIP donor Aaron Banks, reportedly said

We have 70,000 people registered and people have been asked to supply 10 emails of friends or family to build out (sic) database

Emails sent to those signed up in this way are highly likely to have been sent in breach of the campaign’s obligations under the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), and the ICO is reported to have written to the campaign to

inform them of their obligations under the PECR and to ask them to suppress [the recipient’s] email address from their databases

But is this really the main concern here? Or, rather, should we (and the ICO) be asking what on earth is a political campaign doing building a huge database of people, and identifying them as (potential) supporters without their knowledge? Such concerns go to the very heart of modern privacy and data protection law.

Data protection law’s genesis lies, in part, in the desire of post-war European nations to ensure “a foundation of justice and peace in the world”, as the preamble to the European Convention on Human Rights states. The first recital to the European Community Data Protection Directive of 1995 makes clear the importance of those fundamental rights to data protection law.

The Directive is, of course, given domestic effect by the Data Protection Act 1998 (DPA). Section 2 of the DPA provides that information as to someone’s political opinions is her sensitive personal data: I would submit that presence on a database purporting to show that someone supports the UK’s withdrawal from the European Union is also her personal data. Placing someone on that database, without her knowledge or ability to object, will be manifestly “unfair” when it comes to compliance with the first data protection principle. It may also be inaccurate, when it comes to compliance with the fourth principle.

I would urge the ICO to look much more closely at this – the compiling of (query inaccurate) secret databases of people’s political opinions has very scary antecedents.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under accuracy, Data Protection, Directive 95/46/EC, Europe, human rights, Information Commissioner

Naming and shaming the innocent

Around this time last year I wrote two blog posts about two separate police forces’ decisions to tweet the names of drivers charged with (but not – yet, at least – convicted of) drink driving offences. In the latter example Staffordshire police were actually using a hashtag #drinkdriversnamedontwitter, and I argued that

If someone has merely been charged with an offence, it is contrary to the ancient and fundamental presumption of innocence to shame them for that fact. Indeed, I struggle to understand how it doesn’t constitute contempt of court to do so, or to suggest that someone who has not been convicted of drink-driving is a drink driver. Being charged with an offence does not inevitably lead to conviction. I haven’t been able to find statistics relating to drink-driving acquittals, but in 2010 16% of all defendants dealt with by magistrates’ courts were either acquitted or not proceeded against

The Information Commissioner’s Office investigated whether there had been a breach of the first principle of Schedule One of the Data Protection Act 1998 (DPA), which requires that processing of personal data be “fair and lawful”, but decided to take no action after Staffs police agreed not to use the hashtag again, saying

Our concern was that naming people who have only been charged alongside the label ‘drink-driver’ strongly implies a presumption of guilt for the offence. We have received reassurances from Staffordshire Police the hashtag will no longer be used in this way and are happy with the procedures they have in place. As a result, we will be taking no further action.

But my first blog post had raised questions about whether the mere naming of those charged was in accordance with the same DPA principle. Newspaper articles talked of naming and “shaming”, but where is the shame in being charged with an offence? I wondered why Sussex police didn’t correct those newspapers who attributed the phrase to them.

And this year Sussex police, as well as neighbouring Surrey, and Avon and Somerset, are doing the same thing: naming drivers charged with drink driving offences on twitter or elsewhere online. The media happily describe this as a “naming and shaming” tactic, and I have not seen the police disabusing them, although Sussex police did at least enter into a dialogue with me and others on twitter, in which they assured us that their actions were in pursuit of open justice, and that they were not intending to shame people. However, this doesn’t appear to tally with the understanding of the Sussex Police and Crime Commissioner, who said earlier this year

I am keen to find out if the naming and shaming tactic that Sussex Police has adopted is actually working

But I also continue to question whether the practice is in accordance with police forces’ obligations under the DPA. Information relating to the commission or alleged commission by a person of an offence is that person’s sensitive personal data, and for processing to be fair and lawful a condition in both Schedule Two and, particularly, Schedule Three must be met. And I struggle to see which Schedule Three condition applies – the closest is probably

The processing is necessary…for the administration of justice

But “necessary”, in the DPA, imports a proportionality test of the kind required by human rights jurisprudence. The High Court, in the MPs’ expenses case, cited the European Court of Human Rights in The Sunday Times v United Kingdom (1979) 2 EHRR 245 to the effect that

while the adjective “necessary”, within the meaning of article 10(2) [of the European Convention on Human Rights] is not synonymous with “indispensable”, neither has it the flexibility of such expressions as “admissible”, “ordinary”, “useful”, “reasonable” or “desirable” and that it implies the existence of a “pressing social need.”

and went on to hold, therefore, that “necessary” in the DPA

should reflect the meaning attributed to it by the European Court of Human Rights when justifying an interference with a recognised right, namely that there should be a pressing social need and that the interference was both proportionate as to means and fairly balanced as to ends

So is there a pressing social need to interfere with the rights of people charged with (and not convicted of) an offence, in circumstances where the media and others portray the charge as a source of shame? Is it proportionate and fairly balanced to do so? One consideration might be whether the same police forces name all people charged with an offence. If the intent is to promote open justice, then it is difficult to see why one charging decision should merit online naming, and others not. But is the intent really to promote open justice? Or is it to dissuade others from drink-driving? Supt Richard Corrigan of Avon and Somerset police says

This is another tool in our campaign to stop people driving while under the influence of drink or drugs. If just one person is persuaded not to take to the road as a result, then it is worthwhile as far as we are concerned.

and Sussex police’s Chief Inspector Natalie Moloney says

I hope identifying all those who are to appear in court because of drink or drug driving will act as a deterrent and make Sussex safer for all road users

which firstly fails to use the word “alleged” before “drink or drug driving”, and secondly – as with Supt Corrigan’s comments – suggests that the purpose of naming is not to promote open justice, but rather to deter drink drivers.

Deterring drink driving is certainly a worthy public aim (and I stress that I have no sympathy whatsoever with those convicted of such offences), but should the sensitive personal data of those who have not been convicted of any offence be used to their detriment in pursuance of that aim?

I worry that unless such naming practices are scrutinised, and challenged when they are unlawful and unfair, the practice will spread, and social “shame” will be encouraged to be visited on the innocent. I hope the Information Commissioner investigates.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

3 Comments

Filed under Data Protection, human rights, Information Commissioner, Open Justice, police, social media

Monitoring of blogs and lawful/unlawful surveillance

Tim Turner wrote recently about the data protection implications of the monitoring of Sara Ryan’s blog by Southern Health NHS Trust. Tim’s piece is an exemplary analysis of how the processing of personal data which is in the public domain is still subject to compliance with the Data Protection Act 1998 (DPA):

there is nothing in the Data Protection Act that says that the public domain is off-limits. Whatever else, fairness still applies, and organisations have to accept that if they want to monitor what people are saying, they have to be open about it

But it is not just data protection law which is potentially engaged by the Trust’s actions. Monitoring of social media and networks by public authorities for the purposes of gathering intelligence might well constitute directed surveillance, bringing us explicitly into the area of human rights law. Sir Christopher Rose, the Chief Surveillance Commissioner, said in his most recent annual report

my commissioners remain of the view that the repeat viewing of individual “open source” sites for the purpose of intelligence gathering and data collation should be considered within the context of the protection that RIPA affords to such activity

“RIPA” there of course refers to the complex Regulation of Investigatory Powers Act 2000 (RIPA) (parts of which were reputedly “intentionally drafted for maximum obscurity”)1. What is not complex, however, is to note which public authorities are covered by RIPA when they engage in surveillance activities. A 2006 statutory instrument2 removed NHS Trusts from the list (at Schedule One of RIPA) of relevant public authorities whose surveillance was authorised by RIPA. Non-inclusion on the Schedule One lists doesn’t, as a matter of fact or law, mean that a public authority cannot undertake surveillance. This is because of the rather odd provision at section 80 of RIPA, which effectively explains that surveillance is lawful if carried out in accordance with RIPA, but surveillance not carried out in accordance with RIPA is not ipso facto unlawful. As the Investigatory Powers Tribunal put it, in C v The Police and the Home Secretary IPT/03/32/H

Although RIPA provides a framework for obtaining internal authorisations of directed surveillance (and other forms of surveillance), there is no general prohibition in RIPA against conducting directed surveillance without RIPA authorisation. RIPA does not require prior authorisation to be obtained by a public authority in order to carry out surveillance. Lack of authorisation under RIPA does not necessarily mean that the carrying out of directed surveillance is unlawful.

But it does mean that where surveillance is not specifically authorised by RIPA questions would arise about its legality under Article 8 of the European Convention on Human Rights, as incorporated into domestic law by the Human Rights Act 1998. The Tribunal in the above case went on to say

the consequences of not obtaining an authorisation under this Part may be, where there is an interference with Article 8 rights and there is no other source of authority, that the action is unlawful by virtue of section 6 of the 1998 Act.3

So, when the Trust was monitoring Sara Ryan’s blog, was it conducting directed surveillance (in a manner not authorised by RIPA)? RIPA describes directed surveillance as covert (and remember, as Tim Turner pointed out, no notification had been given to Sara) surveillance which is “undertaken for the purposes of a specific investigation or a specific operation and in such a manner as is likely to result in the obtaining of private information about a person (whether or not one specifically identified for the purposes of the investigation or operation)” (there is a further third limb which is not relevant here). One’s immediate thought might be that no private information was obtained or intended to be obtained about Sara, but one must bear in mind that, by section 26(10) of RIPA, “‘private information’, in relation to a person, includes any information relating to his private or family life” (emphasis added). This interpretation of “private information” is of course to be read alongside the protection afforded to respect for one’s private and family life under Article 8. The monitoring of Sara’s blog, and the matching of entries in it against incidents on the ward on which her late son, LB, was placed, unavoidably resulted in the obtaining of information about her and LB’s family life. This, of course, is the sort of thing that Sir Christopher Rose warned about in his most recent report, in which he went on to say

In cash-strapped public authorities, it might be tempting to conduct on line investigations from a desktop, as this saves time and money, and often provides far more detail about someone’s personal lifestyle, employment, associates, etc. But just because one can, does not mean one should.

And one must remember that he was talking about cash-strapped public authorities whose surveillance could be authorised under RIPA. When one remembers that this NHS Trust was not authorised to conduct directed surveillance under RIPA, one struggles to avoid the conclusion that monitoring was potentially in breach of Sara’s and LB’s human rights.

1See footnote to Caspar Bowden’s submission to the Intelligence and Security Committee
2The Regulation of Investigatory Powers (Directed Surveillance and Covert Human Intelligence Sources) (Amendment) Order 2006
3This passage was apparently lifted directly from the explanatory notes to RIPA

3 Comments

Filed under Data Protection, human rights, NHS, Privacy, RIPA, social media, surveillance, surveillance commissioner

RIPA errors…but also serious data protection breaches?

A circular from the Interception of Communications Commissioner’s Office raises concerns about some public authorities’ data protection compliance

The benighted (although often misrepresented) Regulation of Investigatory Powers Act 2000 (RIPA) had at least the ostensibly worthy aim of ensuring that, when public authorities conducted investigations which were intrusive on people’s private lives, those investigations took place in accordance with the law. Thus, under Chapter II of Part 1 of RIPA, authorisations may be granted within an organisation to acquire, or an application made to require a postal or telecommunications operator to disclose, communications data (“communications data”, in the words of the statutory Code of Practice, “embraces the ‘who’, ‘when’ and ‘where’ of a communication but not the content, not what was said or written”). If the acquisition is done in accordance with RIPA and the Code of Practice, it will in general terms be done lawfully.

The acquisition and disclosure of communications data under RIPA is overseen by the Interception of Communications Commissioner, who is appointed pursuant to section 57 RIPA. It is the Commissioner’s role to review the exercise and performance of relevant persons’ functions under the Act. From time to time his office (IOCCO) will also issue circulars, and one such landed on the desks of Senior Responsible Officers of relevant public authorities earlier this month. Laudably, IOCCO has also uploaded it to its website, and its contents are worrying not just because they indicate errors in complying with RIPA authorisations and applications, but also because they raise questions about the data protection compliance of the authorities involved. The circular, from the Head of IOCCO, Jo Cavan, states that

in the first six month period of the reporting year (January to June 2014) there have been 195 applicant errors – of which 153 (78%) were, according to the reports submitted to IOCCO, caused by the applicant submitting the wrong communications address. [emphasis in original]

As I say, the provisions of RIPA at least implicitly acknowledge that acquisition and disclosure of communications data will be highly intrusive actions. But failure to ensure that the data acquired is accurate means that such intrusion has taken place into the private communications of people totally uninvolved in the investigations being undertaken, as the circular highlights

In all cases the applicant error led to communications data being acquired relating to members of the public who had no connection to the investigation or operation being undertaken

but most chillingly

one of these errors led to executive action being taken against a member of the public who had no connection to the investigation being undertaken

No indication is given, though, of what the deceptively bland phrase “executive action” actually consisted of.

The fourth principle in Schedule One of the Data Protection Act 1998 (DPA) requires in terms that data controllers take reasonable steps to ensure the accuracy of personal data they process. Failure to comply with that obligation potentially gives rise to civil claims by data subjects and, in qualifying serious cases, civil enforcement action by the Information Commissioner’s Office, which can serve monetary penalty notices to a maximum of £500,000. Moreover, the seventh principle in Schedule One of the DPA requires data controllers to take appropriate technical and organisational measures to safeguard against unauthorised or unlawful processing of personal data. IOCCO’s circular notes that

It is unsatisfactory to note that the telephone numbers / email addresses / Internet Protocol (IP) addresses were, in the vast majority of cases, derived from records available to the applicant in electronic form and as such could have been electronically copied into the application to ensure accuracy. SROs must develop, implement and robustly enforce measures to require applicants to electronically copy communications addresses into applications when the source is in electronic form (for example forensic reports relating to mobile phones, call data records etc). Communications addresses acquired from other sources must be properly checked to reduce the scope for error. It is not acceptable for public authorities to simply state that applicants have been reminded to double check communications addresses to prevent recurrence

This points to possible failure by the authorities in question to take appropriate DPA principle 7 measures.

IOCCO’s enforcement powers in this regard are limited, although the circular notes that the Commissioner shall, where appropriate, notify affected individuals of the existence and role of the Investigatory Powers Tribunal (IPT). However, complainants would not be restricted simply to complaining to the IPT – the Surveillance Roadmap (“a shared approach to the regulation of surveillance in the United Kingdom”) agreed between the UK’s surfeit of privacy commissioners allows for the possibility of someone aggrieved by intrusive obtaining of communications data making a complaint to the Information Commissioner’s Office (ICO) as well as to the IPT. It does state that “the ICO does not have the necessary [sic] powers to investigate breaches of RIPA and will only make a decision as to whether it is likely or unlikely that an organisation has complied with the DPA”, but it strikes me that a complaint to the ICO is a lot easier to make than an application to the IPT. Alternatively, a civil claim through the courts (under section 13 DPA), on the basis that the public authority in question contravened its obligations, opens up the possibility of a damages award. This might be a more attractive option for a complainant because, although damages are a remedy available in the IPT (under s67(7) RIPA), there is notably no right of appeal from an IPT decision (s67(8)).

One last point – the Surveillance Roadmap tries to draw lines separating the functions of the various commissioners. This is sensible, and aims to avoid overlap and duplication of functions, but one wonders if the ICO might be interested in looking at the DPA compliance of the authorities who erred so notably in the cases seen by IOCCO.


Leave a comment

Filed under Data Protection, human rights, Information Commissioner, RIPA

Data protection implications of MPs crossing the floor

Douglas Carswell MP is a data controller.

It says so on the Information Commissioner’s register:

[screenshot: Douglas Carswell’s entry on the ICO register of data controllers]

(I hope he remembers to renew the registration when it expires next week: it’s a criminal offence to process personal data as a data controller without a registration, unless you have an exemption).

But, more directly, he is a data controller because as an MP he is a person who determines the purposes for which and the manner in which the personal data of his constituents is processed. Sensible guidance for MPs is provided by Parliament itself:

A Member is the data controller for all personal data that is handled by their office and they have overall responsibility for ensuring that this is done in accordance with the DPA.

I have already written recently raising some concerns about Carswell’s alleged handling of constituents’ personal data. But this week he decided to leave the Conservative Party, resign his seat, and seek re-election as a member of the UKIP party. James Forsyth, in the Daily Mail, talks about the constituency knowledge Carswell will bring to UKIP, and reports that “one senior Ukip figure purrs: ‘The quality of Douglas’s data is amazing'”.

As a data controller an MP must process constituents’ personal data in accordance with the eight data protection principles of the Data Protection Act 1998 (DPA). Failure to do so is a contravention of the data controller’s obligation under section 4(4). Data subjects can bring legal claims for compensation for contravention of that obligation, and for serious contraventions the ICO can take enforcement action, including the serving of monetary penalty notices to a maximum of £500,000.

The second data protection principle requires that

Personal data shall be obtained only for one or more specified and lawful purposes, and shall not be further processed in any manner incompatible with that purpose or those purposes

A person’s political opinions are “sensitive personal data”, afforded even greater protection under the DPA. It is not difficult to understand the historical basis for this, nor why it remains so today. Data protection law is in part an expression and development of rights recognised by the drafters of the Universal Declaration of Human Rights and the European Convention on Human Rights. Oppression of people on the basis of their politics was, and remains, distressingly common.

If constituents have given Carswell their details on the basis that they would be processed as part of his constituency work as a Conservative MP, they might rightly be aggrieved if that personal data were then used by him in pursuit of his campaign as a UKIP candidate. As Paul Bernal tweeted:

If I gave my data to help the Tories and found it was being used to help UKIP I’d be livid

Such use would also potentially be in breach of the first data protection principle, which requires that personal data be processed fairly and lawfully. It would not be fair to share data with a political party, or to use it to further that party’s aims, in circumstances where the data subject was not aware of this and might very reasonably object. And it would not be lawful if the data were, for instance, disclosed to UKIP in breach of confidence.

An interesting twitter discussion took place this morning about whether this apparent use of constituents’ data might even engage the criminal law provisions of the DPA. As well as Carswell, there may be other data controllers involved: if some of the data he was in possession of was for instance, being processed by him on behalf of, say, the Conservative Party itself, then the latter would be data controller. Section 55 of the DPA creates, in terms, an offence of unlawfully disclosing personal data without the consent of the data controller. However, as was agreed on twitter, this would be a complex knot to unpick, and it is unlikely, to say the least, that either the ICO or the CPS would want to pursue the matter.

Notwithstanding this, there are serious questions to be asked about the DPA implications of any MP crossing the floor. The use of personal data is likely to be a key battleground in the forthcoming general election, and will throw an even sharper focus on European data protection reform. I would argue that this is a subject the ICO needs to get a grip on, and quickly.

 

UPDATE: Paul Bernal has written a superb piece on the broader ethical issues engaged here.

Filed under Confidentiality, Data Protection, human rights, Information Commissioner

Google is not a library, Dr Cavoukian

The outgoing Ontario Information and Privacy Commissioner Ann Cavoukian, whose time in office has been hugely, and globally, influential (see in particular Privacy by Design), has co-written (with Christopher Wolf) an article strongly criticising the judgment of the Court of Justice of the European Union (CJEU) in the Google Spain case.

For anyone who has been in the wilderness for the last few weeks: in Google Spain the CJEU ruled that Google Spain, as a subsidiary of Google Inc. operating on Spanish territory, was covered by the obligations of the European Data Protection Directive 95/46/EC; that it was an entity processing personal data in the capacity of a data controller; and that it was accordingly required to consider applications from data subjects for removal of search returns. Thus, what is loosely called a “right to be forgotten” is seen already to exist in the current data protection regime.

Many have written on this landmark CJEU ruling (I commend in particular Dr David Erdos’s take, on the UK Constitutional Law Blog) and I am not here going to go into any great detail, but what I did take issue with in the Cavoukian and Wolf piece was the figurative comparison of Google with a public library:

A man walks into a library. He asks to see the librarian. He tells the librarian there is a book on the shelves of the library that contains truthful, historical information about his past conduct, but he says he is a changed man now and the book is no longer relevant. He insists that any reference in the library’s card catalog and electronic indexing system associating him with the book be removed, or he will go to the authorities…

…The government agent threatens to fine or jail the librarian if he does not comply with the man’s request to remove the reference to the unflattering book in the library’s indexing system.

Is this a scenario out of George Orwell’s Nineteen Eighty-Four? No, this is the logical extension of a recent ruling from Europe’s highest court

(I pause briefly to say that if I never see another reference to Orwell in the context of privacy debate I will die a happy man).

I’m fond of analogies, but Cavoukian and Wolf’s (or maybe it’s a metaphor?) is facile. I think it could more accurately read:

A man walks into a library. He sees that, once again, the library has chosen, because of how it organises its profit-making activities, to give great prominence to a book which contains information about his past conduct, which is no longer relevant, and which it is unfair to highlight. He asks them to give less prominence to it.

Cavoukian and Wolf accept that there should be a right to remove “illegal defamatory” content if someone posts it online, but feel that the issue of links to “unflattering, but accurate” information should be explored using “other solutions”. (I pause again to note that “unflattering” is an odd and loaded word to use here: Mr Gonzalez, in the Google Spain case, was concerned about out-of-date information about bankruptcy, and other people who might want to exercise a right to removal of links might be concerned by much worse than “unflattering” information).

I don’t disagree that other solutions should be explored to the issue of the persistence or re-emergence of old information which data subjects reasonably no longer wish to be known, but people are entitled to use the laws which exist to pursue their aims, and the application by the CJEU of data protection law to the issues pleaded was, to an extent, uncontroversial (is Google a data controller? If it is, what are its obligations to respect a request to desist from processing?).

Cavoukian and Wolf criticise the CJEU for failing to provide sufficient instruction on how “the right to be forgotten” should be applied, and for failing to consider whether “online actors other than search engines have a duty to ‘scrub’ the Internet of unflattering yet truthful facts”, but a court can only consider the issues pleaded before it, and these weren’t. Where I do agree with them is in their criticism of the apparent failure by the CJEU, when giving effect to the privacy rights in Article 8 of the European Convention on Human Rights, and Articles 7 and 8 of the Charter of Fundamental Rights of the European Union, to consider adequately, if at all, the countervailing rights to freedom of expression in Article 10 of the former and Article 11 of the latter. In this respect, the prior Opinion of the Advocate General was perhaps to be preferred.

The key word in my replacement library analogy above is “chosen”. Google is not a passive and inert indexing system. Rather, it is a dynamic and commercially driven system which uses complex algorithms to determine which results appear against which search terms. It already exercises editorial control over results, and will remove some which it is satisfied are clearly unlawful or which constitute civil wrongs such as breach of copyright. Is it so wrong that (provided it gives appropriate weight to the (sometimes) competing considerations of privacy and freedom of expression) it should be required to consider a request to remove unfair and outdated private information?

 

 


Filed under Data Protection, Directive 95/46/EC, Europe, human rights, Privacy

A public interest test in the Data Protection Act?

Mr Justice Cranston has suggested that there is a public interest factor when considering whether disclosure of personal data would be “fair” processing. I’m not sure that is right.

The first data protection principle (DPP1) in Schedule 1 of the Data Protection Act 1998 (DPA) says that personal data must be processed “fairly” (and lawfully). But what does “fairly” mean?

In an interesting recent case (AB v A Chief Constable [2014] EWHC 1965 (QB)) the High Court determined that, on the very specific facts, it would not be fair, in terms of DPP1, and common law legitimate expectation, for a Chief Constable to send a second, non-standard, reference to the new employer of a senior police officer who was subject to disciplinary investigation. (The judgment merits close reading – this was by no means a statement of general principle about police references). The reason it would not be fair was because the officer in question had tendered his resignation upon the sending of the initial, anodyne, reference, and the force had terminated misconduct proceedings:

He was thus in the position that for the Force to send the second reference would most likely leave him without employment and without the opportunity to refute the gross misconduct allegations. In these special circumstances it would be a breach of the Data Protection Act 1998 and undermine his legitimate expectations for the second reference to be sent [¶94]

Something in particular struck me about the judge’s analysis of DPP1, although, given the outcome, it was not determinative. He rejected a submission from the claimant officer that the duty of fairness in the DPP1 and the European Data Protection Directive was a duty to be fair primarily to the data subject. Rather, correctly identifying that the privacy rights in the Directive and the DPA are grounded in article 8 of the European Convention on Human Rights and in general principles of EU law, he held that

The rights to private and family life in Article 8 are subject to the countervailing public interests set out in Article 8(2). So it is here: assessing fairness involves a balancing of the interests of the data subject in non-disclosure against the public interest in disclosure [¶75]

I am not sure this is right. Recital 28 of the Directive says

Whereas any processing of personal data must be lawful and fair to the individuals concerned [emphasis added]

and recital 38 suggests that whether processing is “fair” is in large part dependent on whether the data subject is made aware of the processing and the circumstances under which it takes place. These recitals find expression in Articles 10 and 11, which both talk about “fair processing in respect of the data subject” (again, emphasis added). Similarly, Part II of Schedule 1 to the DPA provides interpretation of DPP1, and says that in determining whether personal data are processed fairly

regard is to be had to the method by which they are obtained, including in particular whether any person from whom they are obtained is deceived or misled as to the purpose or purposes for which they are to be processed

Admittedly this introduces “any person”, which could be someone other than the data subject, but more general considerations of public interest are absent. It is also notable that the Information Commissioner’s position in guidance seems predicated solely on the belief that it is the data subject’s interests that are engaged in an analysis of “fairness”. The guidance does concede that processing might cause some detriment to the individual without being unfair, but I do not think this is the same as taking into account the public interest in disclosure.

To the extent that a public interest test does manifest itself in DPP1, it is normally held to be in the conditions in Schedules 2 and 3. DPP1 says that, in addition to the obligation to process personal data fairly and lawfully, a condition in Schedule 2 (and, for sensitive personal data, Schedule 3) must be met. Many of these conditions contain tests as to whether the processing is “necessary”, and that “necessity test” constitutes a proportionality test, as described by Latham LJ in Corporate Officer of the House of Commons v The Information Commissioner & Ors [2008] EWHC 1084 (Admin):

‘necessary’…should reflect the meaning attributed to it by the European Court of Human Rights when justifying an interference with a recognised right, namely that there should be a pressing social need and that the interference was both proportionate as to means and fairly balanced as to ends

To import a public interest test into the word “fairly” in DPP1 seems to me to be a potentially radical step, especially when disclosures of personal data under the Freedom of Information Act 2000 (FOIA) are being considered. As I say – I doubt that this is correct, but I would welcome any contrary (or concurring) opinions.

(By the way, I at first thought there was a more fundamental error in the judgment: the judge found that a rule of law was engaged which ordinarily would have required the Chief Constable to send the second reference:

the public law duty of honesty and integrity would ordinarily have demanded that the Chief Constable send the Regulatory Body something more than the anodyne reference about the claimant [¶93]

If a rule of law necessitates disclosure of personal data, then the exemption at section 35 DPA removes the requirement to process that data fairly and lawfully. However, I think the answer lies in the use of the word “ordinarily”: in this instance the doctrine of legitimate expectation (which the claimant could rely upon) meant that the public law duty to send the second reference didn’t apply. So section 35 DPA wasn’t engaged.)

 

 

 

 

 


Filed under Confidentiality, Data Protection, human rights, police