Category Archives: Data Protection

Police building register of domestic CCTV for crime investigation purposes?

This is a flyer apparently being distributed by Thames Valley Police (TVP).

[image: the TVP flyer]

It invites householders who have private CCTV systems to register with TVP, who want to use those systems “in order to assist us in future investigations”.

Surveillance camera footage can undoubtedly be of great use in the investigation and prosecution of crime. But there is a potential problem for householders who decide to register with TVP, and I'd be interested to know whether TVP have taken this into account.

The problem is this: CCTV cameras involve the processing of data, and where they capture images of identifiable individuals, it is personal data that they are processing. Purely domestic processing of personal data is exempt from all of the obligations under the Data Protection Act 1998, but when the processing is no longer purely for domestic purposes, then legal obligations potentially attach themselves to those doing the processing. The Information Commissioner’s Office (ICO) CCTV Code of Practice (both the current 2008 version and an updated version currently in draft) explains

The use of cameras for limited household purposes is exempt from the DPA. This applies where an individual uses CCTV to protect their home from burglary, even if the camera overlooks the street or other areas near their home

But the corollary of this is that if the camera's use is not purely for the "household purposes" of protecting one's home from burglary, then the exemption no longer applies. If householders are determining that the purpose for which they will process personal data is to assist TVP in criminal investigations, then they are data controllers in their own right, with the legal obligations that status brings.

This can’t simply be TVP wanting a register of CCTV-operating households to assist them if a crime happens on those specific premises, because that would be pointless: in those circumstances the householder would draw the footage to the police’s attention. No, this must be that TVP want to be able to access footage of relevant incidents outwith the individual household. 

I’ve asked TVP if they have any policy statement or guidelines on this initiative, and will update as and when they reply.

1 Comment

Filed under Data Protection, police, Privacy, surveillance, surveillance commissioner

Privacy issues with Labour Party website

Two days ago I wrote about a page on the Labour Party website which was getting considerable social media coverage. It encourages people to submit their date of birth to find out, approximately, what number they were of all the births under the NHS.

I was concerned that it was grabbing email addresses without an opt-out option. Since then, I've been making a nuisance of myself asking, via Twitter, various Labour politicians and activists for their comments. I know I'm an unimportant blogger, and it was the weekend, but only one chose to reply: Lewisham councillor Mike Harris who, as campaign director for DontSpyOnUs, I would expect to be concerned. Indeed, to his credit, he said "You make a fair point, there should be the ability to opt out". Mike suggested I email Labour's compliance team.

In the interim I'd noticed that elsewhere on the Labour website there were other examples of email addresses being collected in circumstances where people could not be sure what would be done with them. For instance: this "calculator", which purports to calculate how much less people would pay for energy bills under Labour, and which gives no privacy notice whatsoever. Or even this, on the home page, which similarly gives no information about what will happen with your data:

[image: the "get involved" form on the Labour Party home page]

Now, some might say that, if you're giving your details to "get involved", then you are consenting to further contact. This is probably true, but it doesn't mean the practice is properly compliant with data collection laws. And this is not unimportant: as well as potentially contributing to the global spam problem, poor privacy notices and a lack of opt-out facilities at the point of collection of email addresses contribute to the unnecessary amassing of private information, and when it is done by a political party, this can even be dangerous. It should not need pointing out that, historically, and elsewhere in the world, political party lists have often been used by opposition parties and repressive governments to target and oppress activists. Indeed, the presence of one's email address on a party marketing database might well constitute sensitive personal data – as it can be construed as information on one's political opinions (per section 2 of the Data Protection Act 1998).

So, these are not unimportant issues, and I decided to follow Mike Harris’s suggestion to email Labour’s compliance unit. However, the contact details I found on the overarching privacy policy merely gave a postal address. I did notice though that that page said

If you have any questions about our privacy policy, the information we have collected from you online, the practices of this site or your interaction with this website, please contact us by clicking here

But if I follow the "clicking here" link, it takes me to – wait for it – a contact form which gives no information whatsoever about what will happen if I submit it, other than the rather Stalinesque

The Labour Party may contact you using the information you supply

And returning to the overarching privacy policy didn’t assist here – none of the categories on that page fitted the circumstances of someone contacting the party to make a general enquiry.

I see that the mainstream media have been covering the NHS birth page which originally prompted me to look at this issue. Some, like the Metro, and unsurprisingly, the Mirror, are wholly uncritical. The Independent does note that it is a clever way of harvesting emails, but fails to note the questionable legality of the practice. Given that this means that more and more email addresses will be hoovered up, without people fully understanding why, and what will happen with them, I really think that senior party figures, and the Information Commissioner, should start looking at Labour’s online privacy activities.

(By the way, if anyone thinks this is a politically-motivated post by me, I would point out that, until 2010, when I voted tactically (never again), I had only ever voted for one party in my whole life, and that wasn’t the Conservatives or the Lib Dems.)

6 Comments

Filed under Data Protection, Information Commissioner, marketing, PECR, Privacy, privacy notice, social media, tracking

DVLA, disability and personal data

Is the DVLA’s online vehicle-checker risking the exposure of sensitive personal data of registered keepers of vehicles?

The concept of "personal data", in the Data Protection Act 1998 (DPA) (and, beyond, in the European Data Protection Directive 95/46/EC) can be a slippery one. In some cases, as the Court of Appeal recognised in Edem v The Information Commissioner & Anor [2014] EWCA Civ 92, where it had to untangle a mess that the First-tier Tribunal had unnecessarily got itself into, it is straightforward: someone's name is their personal data. In other cases, especially those which engage the second limb of the definition in section 1(1) of the DPA ("[can be identified] from those data and other information which is in the possession of, or is likely to come into the possession of, the data controller"), it can be profoundly complex (see the House of Lords in Common Services Agency v Scottish Information Commissioner (Scotland) [2008] UKHL 47, a judgment which, six years on, still makes data protection practitioners wake up in the night screaming).

When I first looked at the reports that the DVLA’s Vehicle Tax Check service enabled people to see whether the registered owner of a car was disabled, I thought this might fall into the complex category of data protection issues. On reflection, I think it’s relatively straightforward.

I adopt the excellent analysis by the benefitsandwork.co.uk site

A new vehicle check service on the DVLA website allows visitors to find out whether their neighbours are receiving the higher rate of the mobility component of disability living allowance (DLA) or either rate of the mobility component of personal independence payment (PIP)…The information that DVLA are making available is not about the vehicle itself. Instead they are publishing personal information about the benefits received by the individual who currently owns the car or for whom the car is solely used.

It’s difficult to argue against this, although it appears the DVLA are trying, because they responded to the initial post by saying

The Vehicle Enquiry Service does not include any personal data. It allows people to check online what information DVLA holds about a vehicle, including details of the vehicle’s tax class to make sure that local authorities and parking companies do not inadvertently issue parking penalties where parking concessions apply. There is no data breach – the information on a vehicle’s tax class that is displayed on the Vehicle Enquiry Service does not constitute personal data. It is merely a descriptive word for a tax class

but, as benefitsandwork say, that is only true insofar as the DVLA are publishing the tax band of the car. When they publish that the car falls within a tax-exempt class because of the owner's disability, they are publishing something about the registered keeper (or someone they care for, or regularly drive), and that is sensitive personal data.

What DVLA is doing is not publishing the car’s tax class – that remains the same whoever the owner is – they are publishing details of the exempt status of the individual who currently owns it. That is personal data about the individual, not data about the vehicle

As the Information Commissioner’s guidance (commended by Moses LJ in Edem) says

Is the data being processed, or could it easily be processed, to: learn; record; or decide something about an identifiable individual? Or, as an incidental consequence of the processing, either: could you learn or record something about an identifiable individual; or could the processing have an impact on, or affect, an identifiable individual?

Ultimately benefitsandwork’s example (where someone was identified from this information) unavoidably shows that the information can be personal data: if someone can search the registration number of a neighbour’s car, and find out that the registered keeper is exempt from paying the road fund licence for reasons of disability, that information will be the neighbour’s personal data, and it will have been disclosed to them unfairly, and in breach of the DPA (because no condition for the disclosure in Schedule 3 exists).
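The logic of that identification is simple enough that it can be sketched in a few lines of code. This is purely illustrative (the record shape, the "DISABLED" label and the idea of a programmatic lookup are my assumptions, not the DVLA's actual service), but it shows why "data about the vehicle" stops being only that the moment an enquirer brings one extra piece of knowledge to it:

```typescript
// Purely illustrative: the field names, the "DISABLED" label and the idea of a
// queryable record are assumptions, not the DVLA's actual service or API.
interface VehicleRecord {
  registration: string;
  make: string;
  taxClass: string; // e.g. "PETROL CAR", or "DISABLED" for an exempt vehicle
}

// The DVLA's position: this is information "about the vehicle".
// The problem: combine it with one extra fact the enquirer already holds
// (whose car it is) and it becomes information about a person.
function whatANeighbourLearns(record: VehicleRecord, knownKeeper?: string): string | null {
  if (record.taxClass.toUpperCase() === "DISABLED" && knownKeeper !== undefined) {
    // Arguably sensitive personal data under s.2 DPA 1998: it reveals a
    // disability-related exemption relating to the keeper (or someone they
    // care for, or regularly drive).
    return `${knownKeeper} (or someone they care for) has a disability-related tax exemption`;
  }
  return null; // nothing is learned about an identifiable individual
}

// Example: the enquirer already knows who keeps "AB12 CDE".
console.log(
  whatANeighbourLearns(
    { registration: "AB12 CDE", make: "FORD", taxClass: "DISABLED" },
    "the keeper of the car next door"
  )
);
```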

I hope the DVLA will rethink.

11 Comments

Filed under Confidentiality, Data Protection, Directive 95/46/EC, disability, Information Commissioner, Privacy

Labour Party website – unfair processing?

Earlier this year I wrote about a questionable survey on the Conservative Party website, which failed to comply with the legal requirements regarding capture of email addresses. It is perhaps unsurprising to see something similar now being done in the name of the Labour Party.

An innocuous-looking form on Labour's donation pages lies underneath a statement that almost 44 million babies have been delivered under NHS care since 1948. The form invites people to find out what number their birth was. There are of course lots of this type of thing on the internet: "What was number one when you were born?", "Find out which Banana Split you are", etc. But this one, as well as asking for people's date of birth, asks for their (first) name, email address and postcode. And, sure enough, underneath, in small print that I suspect they hope people won't read, it says

The Labour Party and its elected representatives may contact you about issues we think you may be interested in or with campaign updates. You may unsubscribe at any point

So, they’ll have your email address, your first name and a good idea of where you live (cue lots of “Hi Jon” emails, telling me about great initiatives in my area). All very predictable and dispiriting. And also almost certainly unlawful: regulation 22(2) of The Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR) says that

a person shall neither transmit, nor instigate the transmission of, unsolicited communications for the purposes of direct marketing by means of electronic mail unless the recipient of the electronic mail has previously notified the sender that he consents for the time being to such communications being sent by, or at the instigation of, the sender

This Labour web page impermissibly infers consent. The European Directive to which PECR give domestic effect makes clear in recital 40 that electronic marketing requires prior, explicit consent to be obtained. Furthermore, the Information Commissioner's Office (ICO) issues clear guidance on PECR and marketing, and this says

Organisations must give the customer the chance to opt out – both when they first collect the details, and in every email or text. Organisations should not assume that all customers will be happy to get marketing texts or emails in future…It must be simple to opt out. When first collecting a customer’s details, this should be part of the same process (eg online forms should include a prominent opt-out box…

The ICO’s guidance on political campaigning is (given the likelihood of abuse) disappointingly less clear, but it does say that “An organisation must have the individual’s consent to communicate with them [by email]”. I rather suspect the Labour Party would try to claim that the small print would suffice to meet this consent point, but a) it wouldn’t get them past the hurdle of giving the option to opt out at the point of collection of data, and b) in the circumstances it would crash them into the hurdle of “fairness”. The political campaigning guidance gives prominence to this concept

It is not just in an organisation’s interests to act lawfully, but it should also have respect for the privacy of the individuals it seeks to represent by treating them fairly. Treating individuals fairly includes using their information only in a way they would expect

I do not think the majority of people completing the Labour Party’s form, which on the face of it simply returns a number relating to when they were born, would expect their information to be used for future political campaigning. So it appears to be in breach of PECR, not fair, and also, of course (by reference to the first principle in Schedule One) in breach of the Data Protection Act 1998. Maybe the ICO will want to take a look.
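None of this is technically difficult to get right. By way of a hedged sketch (the form fields, function names and figures below are mine, invented for illustration, and bear no relation to Labour's actual code), a collection form that followed the ICO's guidance would capture an explicit, unticked marketing opt-in at the point of collection and act on it, rather than inferring consent from the mere fact of submission:

```typescript
// Hypothetical submission shape: all names and values are illustrative assumptions.
interface BirthNumberSubmission {
  firstName: string;
  dateOfBirth: string; // e.g. "1975-06-15"
  email: string;
  postcode: string;
  marketingOptIn: boolean; // a prominent, unticked checkbox at the point of collection
}

// Placeholder for the service the person actually asked for.
function estimateBirthNumber(dateOfBirth: string): number {
  return 44_000_000; // illustrative figure only
}

// Placeholder for adding someone to a campaign list, recording how consent was obtained.
function addToCampaignList(email: string, firstName: string, postcode: string): void {
  /* store the details together with a record of when and how consent was given */
}

function handleSubmission(sub: BirthNumberSubmission): void {
  // Deliver what was requested, regardless of any marketing choice.
  console.log(`You were approximately NHS birth number ${estimateBirthNumber(sub.dateOfBirth)}`);

  // Marketing by email only with prior consent (reg. 22(2) PECR) -
  // never inferred from small print beneath the form.
  if (sub.marketingOptIn) {
    addToCampaignList(sub.email, sub.firstName, sub.postcode);
  }
}
```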

UPDATE:

I see that this page is being pushed quite hard by the party. Iain McNicol, General Secretary and described as "promoter" of the page, has tweeted about it, as have shadow Health Secretary Andy Burnham and Ed Miliband himself. One wonders how many email addresses have been gathered in this unfair and potentially unlawful way.

3 Comments

Filed under consent, Data Protection, Information Commissioner, marketing, PECR

We’re looking into it

The news is awash with reports that the UK Information Commissioner’s Office (ICO) is “opening an investigation” into Facebook’s rather creepy research experiment, in conjunction with US universities, in which it apparently altered the users’ news feeds to elicit either positive or negative emotional responses. Thus, the BBC says “Facebook faces UK probe over emotion study”, SC Magazine says “ICO probes Facebook data privacy” and the Financial Times says “UK data regulator probes Facebook over psychological experiment”.

As well as prompting one to question some journalists’ obsession with probes, this also leads one to look at the basis for these stories. It appears to lie in a quote from an ICO spokesman, given I think originally to the online IT news outlet The Register

The Register asked the office of the UK’s Information Commissioner if it planned to probe Facebook following widespread criticism of its motives.

"We're aware of this issue, and will be speaking to Facebook, as well as liaising with the Irish data protection authority, to learn more about the circumstances," a spokesman told us.

So, the ICO is aware of the issue and will be speaking to Facebook and to the Irish Data Protection Commissioner's office. This doesn't quite match up to the rather hyperbolic news headlines. And there's a good reason for this – the ICO is highly unlikely to have any power to investigate, let alone take action. Facebook, along with many other tech/social media companies, has its non-US headquarters in Ireland. This is partly for taxation reasons and partly because of access to highly skilled, relatively low-cost labour. However, some companies – Facebook is one, LinkedIn another – have another reason, evidenced by the legal agreements that users enter into: because the agreement is with "Facebook Ireland", Ireland is deemed to be the relevant jurisdiction for data protection purposes. And, fairly or not, the Irish data protection regime is generally perceived to be relatively "friendly" towards business.
 
These jurisdictional issues are by no means clear-cut – in 2013 a German data protection authority tried to exercise powers to stop Facebook imposing a "real name only" policy.
 
Furthermore, as the Court of Justice of the European Union recognised in the recent Google Spain case, the issue of territorial responsibilities and jurisdiction can be highly complex. The Court held there that, as Google had
 
[set] up in a Member State a branch or subsidiary which is intended to promote and sell advertising space offered by that engine and which orientates its activity towards the inhabitants of that Member State
 
it was processing personal data in that Member State (Spain). Facebook does have a large UK corporate office with some responsibility for sales. It is just possible that this could give the ICO, as domestic data protection authority, some power to investigate. And if or when the draft European General Data Protection Regulation gets passed, fundamental shifts could take place, extending even, under Article 3(2), to bringing data controllers outside the EU within jurisdiction where they are offering goods or services to (or monitoring) data subjects in the EU.
 
But the question here is really whether the ICO will assert any purported power to investigate, when the Irish DPC is much more clearly placed to do so (albeit it with terribly limited resources). I think it’s highly unlikely, despite all the media reports. In fact, if the ICO does investigate, and it leads to any sort of enforcement action, I will eat my hat*.
 
*I reserve the right to specify what sort of hat

Leave a comment

Filed under Data Protection, Directive 95/46/EC, enforcement, facebook, journalism, social media, Uncategorized

A green light for publishing FOI requesters' names? I hope not

The Information Commissioner’s Office (ICO) today issued a statement about the data protection implications of public authorities publishing the names of people who have made requests under the Freedom of Information Act 2000 (FOIA). It was issued to journalist Jules Mattsson (it may have been issued to others) and I credit him for pursuing it. It arose out of concerns expressed on Twitter yesterday that a council had uploaded a disclosure log in which the names of requesters were unredacted*.

When the Justice Committee undertook its post-legislative scrutiny of FOIA in 2012 it made a recommendation (¶82) that names of requesters be published in disclosure logs

it can be argued that someone seeking to exercise freedom of information rights should be willing for the fact they have requested such information to be in the public domain; we therefore recommend that where the information released from FOI requests is published in a disclosure log, the name of the requestor should be published alongside it

But this was rejected by the government in its response to the report (¶25)

The Government does not share the view that publishing the names of requesters in disclosure logs would be beneficial in terms of burdens. Such a move would have implications for the data protection of requesters..

 Tim Turner blogged in his usual meticulous style on these data protection implications yesterday, and I am not going to rehearse the points he makes. Indeed, the ICO in its statement more or less agrees with Tim’s comments on fairness, and necessity, when it comes to the publication of requesters’ names

Individuals who make…requests must have their details handled fairly. Many people who have made a request would not expect to have their name linked to published details of the request they have made. If a public authority is considering publishing this information then they must consider why publishing the requester's name is necessary. While there is a need for authorities to be transparent about the [FOI] process, in most cases this would not extend to releasing people's name simply to deter requesters

There then follow some (correct) observations that journalists and politicians might have different expectations, before the statement says

At the very least people should be told that their details will be published and given the opportunity to explain to the council why their name should not be disclosed. If having raised it with the authority a person is not happy with the way their details have been handled then we may be able to help

So what the ICO appears to be doing is agreeing that there are data protection implications, but announcing that, as long as authorities give requesters a privacy notice, it is not going to do anything (unless people complain). It's not often I take issue with the excellent Matt Burgess, who runs FOI Directory, but he claims that "the ICO has criticised the Council". With respect, I don't see any targeted criticism in the ICO's statement, and I fear some public authorities will see it as a green light to publish names.

A source informs me that an ICO spokesman has said they are going to be in touch with the council in question to find out the full details. However, the statement seems to reflect the ICO's new, largely reactive (as opposed to proactive) approach to data protection concerns (described on my blog by Dr David Erdos as having worrying implications for the rule of law), and I fear it risks the exposure of the personal data of large numbers of people exercising their right to information under a statutory scheme which, at heart, is meant to be applicant-blind. As the ICO implies, this could have the effect of deterring some requesters, and this would be, in the words of the always perceptive Rich Greenhill, a type of reverse chilling effect for FOIA.

 *I’m not going to link to the information: I don’t think its publication is fair. 

UPDATE: 05.07.14

The Council appears to have taken the information down, with Jules Mattsson reporting on 3 July that they are reviewing the publication of requesters’ names.

6 Comments

Filed under Data Protection, Freedom of Information, Information Commissioner

I DON’T KNOW WHAT I’M DOING

As surprising as it always is to me, I’m occasionally reminded that I don’t know everything. But when I’m shown not to know how my own website works, it’s more humbling.

A commenter on one of my blog posts recently pointed out the number of tracking applications which were in operation. I had no idea. (I’ve disabled (most of) them now).

And someone has just pointed out (and some others have confirmed) that, when they visit my blog on their iPhone, it asks them whether they want to tell me their current location. I have no idea why. (I'm looking into it).

These two incidents illustrate a few things to me.

Firstly, for all my pontificating about data protection, and – sometimes – information security, I’m not particularly technically literate: this is a wordpress.com blog, which is the off-the-peg version, with lots of things embedded/enabled by default. Ideally, I would run and host my own site, but I do this entirely in my own time, with no funding at all.

Secondly, and following on from the first, I am one among billions of people who run web applications without knowing a great deal about the code they're based on. In a world of (possibly deliberately coded) back-door and zero-day vulnerabilities this isn't that surprising. If even experts can be duped, what hope for the rest of us?

Thirdly, and more prosaically, I had naively assumed that, in inviting people to read and interact with my blog, I was doing so in the capacity of a data controller: determining the purposes for which and the manner in which their personal data was to be processed. (I had even considered notifying the processing to the Information Commissioner, although I know that they would (wrongly) consider I was exempt under section 36 of the Data Protection Act 1998.) But if I don't even know what my site is doing, in what way can I be said to determine the data processing purposes and manner? And if I can't, then should I stop doing it? I don't like to be nominally responsible for activities I can't control.

Fourthly, and finally, can anyone tell me why my out-of-control blog is asking users to give me their location, and how I can turn the damned thing off?

UPDATE: 30.06.14

The consensus from lots and lots of helpful and much-appreciated comments seems to be a) that this location thingy is embedded in the wordpress software (maybe the theme software), and b) I should migrate to self-hosting.

The latter option sounds good, but I have to remind people that I DON’T KNOW WHAT I’M DOING.

UPDATE:05.07.14

The rather excellent Rich Greenhill seems to have identified the problem (I trust his judgement, but haven’t confirmed this). He says “WordPress inserts mobile-only getCurrentPosition from aka-cdn-nsDOTadtechusDOTcom/…DAC.js via adsDOTmopubDOTcom in WP ad script”…”Basically, WordPress inserts ads; but, for mobile devices only, the imported ad code also attempts to detect geo coordinates”.

So it does look like I, and other wordpress.com bloggers who can't afford the "no ads" option, are stuck with this unless or until we can migrate away.
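For anyone wondering how a stray line of ad code can make a phone ask this question at all: any script that ends up on the page – first-party or third-party – can call the browser's standard Geolocation API, and it is that call which triggers the permission prompt. A minimal sketch of the kind of call involved (this is the generic web API, written by me for illustration, not the actual WordPress/ad code Rich identified):

```typescript
// Any script running on the page - including an embedded ad script - can do this;
// the browser then asks the visitor whether to share their location.
function requestVisitorLocation(): void {
  if (!("geolocation" in navigator)) {
    return; // no Geolocation API: no prompt, nothing happens
  }
  navigator.geolocation.getCurrentPosition(
    (position) => {
      // If the visitor taps "Allow", the script receives coordinates it can pass
      // on (for example, to an ad network for geo-targeted advertising).
      console.log(position.coords.latitude, position.coords.longitude);
    },
    (error) => {
      // If they decline, the script just gets an error and no location.
      console.log("Location not shared:", error.message);
    }
  );
}

requestVisitorLocation();
```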

UPDATE: 11.07.14

We are informed that the code which asks (some) mobile users for their location when browsing this blog has now been corrected. Please let me know if it isn’t.

3 Comments

Filed under Data Protection, Information Commissioner, Personal, social media, tracking

Google is not a library, Dr Cavoukian

The outgoing Ontario Information and Privacy Commissioner Ann Cavoukian, whose time in office has been hugely, and globally, influential (see in particular Privacy by Design) has co-written (with Christopher Wolf) an article strongly criticising the judgment of the Court of Justice of the European Union (CJEU) in the Google Spain case.

For anyone who has been in the wilderness for the last few weeks: in Google Spain the CJEU ruled that Google Spain, as a subsidiary of Google Inc. operating on Spanish territory, was covered by the obligations of the European Data Protection Directive 95/46/EC, that it was operating as an entity that processed personal data in the capacity of a data controller, and that it was accordingly required to consider applications from data subjects for removal of search returns. Thus, what is loosely called a "right to be forgotten" is seen already to exist in the current data protection regime.

Many have written on this landmark CJEU ruling (I commend in particular Dr David Erdos’s take, on the UK Constitutional Law Blog) and I am not here going to go into any great detail, but what I did take issue with in the Cavoukian and Wolf piece was the figurative comparison of Google with a public library:

A man walks into a library. He asks to see the librarian. He tells the librarian there is a book on the shelves of the library that contains truthful, historical information about his past conduct, but he says he is a changed man now and the book is no longer relevant. He insists that any reference in the library’s card catalog and electronic indexing system associating him with the book be removed, or he will go to the authorities…

…The government agent threatens to fine or jail the librarian if he does not comply with the man’s request to remove the reference to the unflattering book in the library’s indexing system.

Is this a scenario out of George Orwell’s Nineteen Eighty-Four? No, this is the logical extension of a recent ruling from Europe’s highest court

(I pause briefly to say that if I never see another reference to Orwell in the context of privacy debate I will die a happy man).

I’m fond of analogies but Cavoukian’s and Wolf’s one (or maybe it’s a metaphor?) is facile. I think it could more accurately say

A man walks into a library. He sees that, once again, the library has chosen, because of how it organises its profit-making activities, to give great prominence to a book which contains information about his past conduct, which is no longer relevant, and which it is unfair to highlight. He asks them to give less prominence to it.

Cavoukian and Wolf accept that there should be a right to remove “illegal defamatory” content if someone posts it online, but feel that the issue of links to “unflattering, but accurate” information should be explored using “other solutions”. (I pause again to note that “unflattering” is an odd and loaded word to use here: Mr Gonzalez, in the Google Spain case, was concerned about out-of-date information about bankruptcy, and other people who might want to exercise a right to removal of links might be concerned by much worse than “unflattering” information).

I don’t disagree that other solutions should be explored to the issue of the persistence or reemergence of old information which data subjects reasonably no longer wish to be known, but people are entitled to use the laws which exist to pursue their aims, and the application by the CJEU of data protection law to the issues pleaded was, to an extent, uncontroversial (is Google a data controller? if it is, what are its obligations to respect a request to desist from processing?)

Cavoukian and Wolf criticise the CJEU for failing to provide sufficient instruction on how “the right to be forgotten” should be applied, and for failing to consider whether “online actors other than search engines have a duty to ‘scrub’ the Internet of unflattering yet truthful facts”, but a court can only consider the issues pleaded before it, and these weren’t. Where I do agree with them is in their criticism of the apparent failure by the CJEU, when giving effect to the privacy rights in Article 8 of the European Convention on Human Rights, and Articles 7 and 8 of the Charter of Fundamental Rights of the European Union, to consider adequately, if at all, the countervailing rights to freedom of expression in Article 10 of the former and Article 11 of the latter. In this respect, the prior Opinion of the Advocate General was perhaps to be preferred.

The key word in my replacement library analogy above is "chosen". Google is not a passive and inert indexing system. Rather, it is a dynamic and commercially-driven system which uses complex algorithms to determine which results appear against which search terms. It already exercises editorial control over results, and will remove some which it is satisfied are clearly unlawful or which constitute civil wrongs such as breach of copyright. Is it so wrong that (if it gives appropriate weight to the (sometimes) competing considerations of privacy and freedom of expression) it should be required to consider a request to remove unfair and outdated private information?

2 Comments

Filed under Data Protection, Directive 95/46/EC, Europe, human rights, Privacy

The Partridge Review reveals apparently huge data protection breaches

Does the Partridge Review of NHS transfers of hospital episode patient data point towards one of the biggest DPA breaches ever?

In February this year Tim Kelsey, NHS England's National Director for Patients and Information, and vocal cheerleader for the care.data initiative, assured the public, in an interview on the Radio 4 Today programme, that in the twenty-five years that Hospital Episode Statistics (HES) have been shared with other organisations

the management of the hospital episode database…there has never been a single example of that data being compromised, the privacy of patients being compromised…

When pressed by medConfidential‘s Phil Booth about this, and about risks of reidentification from the datasets, Tim repeated that no patient’s privacy had been compromised.

Some of us doubted this, as news of specific incidents of data loss emerged, and even more so as further news emerged suggesting that there had been transfers (a.k.a. sale) of huge amounts of potentially identifiable patient data to, for instance, the Institute and Faculty of Actuaries. The latter news led me to ask the Information Commissioner’s Office (ICO) to assess the lawfulness of this processing, an assessment which has not been completed four months later.

However, with the publication on 17 June of Sir Nick Partridge's Review of Data Releases by the NHS Information Centre, one questions the basis for Tim's assertions. Sir Nick commissioned PwC to analyse a total of 3,059 data releases between 2005 and 2013 (when the NHS Information Centre (NHSIC) ceased to exist, and was replaced by the Health and Social Care Information Centre (HSCIC)). The summary report to the Review says that

It disappoints me to report that the review has discovered lapses in the strict arrangements that were supposed to be in place to ensure that people’s personal data would never be used improperly

and it reveals a series of concerning and serious failures of data governance, including

  • lack of detailed records between 1 April 2005 and 31 March 2009
  • two cases of data that was apparently released without a proper record remaining of which organisation received the data
  • [no] evidence that Northgate [the NHSIC contractor responsible for releases] got permission from the NHS IC before making releases as it was supposed to do
  • PwC could not find records to confirm full compliance in about 10% of the sample

 Sir Nick observes that

 the system did not have the checks and balances needed to ensure that the appropriate authority was always in place before data was released. In many cases the decision making process was unclear and the records of decisions are incomplete.

and crucially

It also seems clear that the responsibilities of becoming a data controller, something that happens as soon as an organisation receives data under a data sharing agreement, were not always clear to those who received data. The importance of data controllers understanding their responsibilities remains vital to the protection of people’s confidentiality

(This resonates with my concern, in my request to the ICO to assess the transfer of data from HES to the actuarial society, about what the legal basis was for the latter’s processing).

Notably, Sir Nick dispenses with the idea that data such as HES was anonymised:

The data provided to these other organisations under data sharing agreements is not anonymised. Although names and addresses are normally removed, it is possible that the identity of individuals may be deduced if the data is linked to other data

 And if it was not anonymised, then the Data Protection Act 1998 (DPA) is engaged.

All of this indicates a failure to take appropriate technical and organisational measures against unauthorised or unlawful processing of personal data – which the perspicacious among you will identify as one of the key statutory obligations placed on data controllers by the seventh data protection principle in the DPA.

Sir Nick may say

 It is a matter of fact that no individual ever complained that their confidentiality had been breached as a result of data being shared or lost by the NHS IC

but simply because no complaint was made (at the time – complaints certainly have been made since concerns started to be raised) does not mean that the seventh principle was not contravened, in a serious way.  And a serious contravention of the DPA of a kind likely to cause substantial damage or substantial distress can potentially lead to the ICO serving a monetary penalty notice (MPN) to a maximum of £500,000 (at least for contraventions after April 2010, when the ICO’s powers commenced).

The NHSIC is no more (although, as Sir Nick says, HSCIC "inherited many of the NHS IC's staff and procedures"). But that has not stopped the ICO serving MPNs on successor organisations in circumstances where their predecessors committed the contravention. One waits with interest to see whether the ICO will take any enforcement action, but I think it's important that they consider doing so, because, even though Sir Nick makes nine very sensible recommendations to HSCIC, one could be forgiven – having been given clear assurances previously, by the likes of Tim Kelsey and others – for having reservations as to the future governance of our confidential medical data. I would suggest it is imperative that HSCIC know that their processing of personal data is now subject to close oversight by all relevant regulatory bodies.

2 Comments

Filed under care.data, Confidentiality, Data Protection, data sharing, Information Commissioner, monetary penalty notice, NHS, Privacy

A public interest test in the Data Protection Act?

Mr Justice Cranston has suggested that there is a public interest factor when considering whether disclosure of personal data would be “fair” processing. I’m not sure that is right.

The first data protection principle (DPP1) in Schedule 1 of the Data Protection Act 1998 (DPA) says that personal data must be processed “fairly” (and lawfully). But what does “fairly” mean?

In an interesting recent case (AB v A Chief Constable [2014] EWHC 1965 (QB)) the High Court determined that, on the very specific facts, it would not be fair, in terms of DPP1 and common law legitimate expectation, for a Chief Constable to send a second, non-standard, reference to the new employer of a senior police officer who was subject to disciplinary investigation. (The judgment merits close reading – this was by no means a statement of general principle about police references.) The reason it would not be fair was that the officer in question had tendered his resignation upon the sending of the initial, anodyne, reference, and the force had terminated misconduct proceedings:

He was thus in the position that for the Force to send the second reference would most likely leave him without employment and without the opportunity to refute the gross misconduct allegations. In these special circumstances it would be a breach of the Data Protection Act 1998 and undermine his legitimate expectations for the second reference to be sent [¶94]

Something in particular struck me about the judge’s analysis of DPP1, although, given the outcome, it was not determinative. He rejected a submission from the claimant officer that the duty of fairness in the DPP1 and the European Data Protection Directive was a duty to be fair primarily to the data subject. Rather, correctly identifying that the privacy rights in the Directive and the DPA are grounded in article 8 of the European Convention on Human Rights and in general principles of EU law, he held that

The rights to private and family life in Article 8 are subject to the countervailing public interests set out in Article 8(2). So it is here: assessing fairness involves a balancing of the interests of the data subject in non-disclosure against the public interest in disclosure [¶75]

I am not sure this is right. Recital 28 of the Directive says

Whereas any processing of personal data must be lawful and fair to the individuals concerned [emphasis added]

and recital 38 suggests that whether processing is “fair” is in large part dependent on whether the data subject is made aware of the processing and the circumstances under which it takes place. These recitals give way to the descriptions in Articles 10 and 11 which both talk about “fair processing in respect of the data subject” (again, emphasis added). Similarly Part II of Schedule One to the DPA provides interpretation to DPP1, and says that in determining whether personal data are processed fairly

regard is to be had to the method by which they are obtained, including in particular whether any person from whom they are obtained is deceived or misled as to the purpose or purposes for which they are to be processed

Admittedly this introduces "any person", which could be someone other than the data subject, but more general considerations of public interest are absent. It is also notable that the Information Commissioner's position in guidance seems predicated solely on the belief that it is the data subject's interests that are engaged in an analysis of "fairness". The guidance does concede that processing might cause some detriment to the individual without it being unfair, but I do not think this is the same as taking into account a public interest in disclosure.

To the extent that a public interest test does manifest itself in DPP1, it is normally held to be in the conditions in Schedules 2 and 3. DPP1 says that, in addition to the obligation to process personal data fairly and lawfully, a condition in Schedule 2 (and, for sensitive personal data, Schedule 3) must be met. Many of these conditions contain tests as to whether the processing is "necessary", and that "necessity test" constitutes a proportionality test, as described by Latham LJ in Corporate Officer of the House of Commons v The Information Commissioner & Ors [2008] EWHC 1084 (Admin)

‘necessary’…should reflect the meaning attributed to it by the European Court of Human Rights when justifying an interference with a recognised right, namely that there should be a pressing social need and that the interference was both proportionate as to means and fairly balanced as to ends

To import a public interest test into the word “fairly” in DPP1 seems to me to be a potentially radical step, especially when disclosures of personal data under the Freedom of Information Act 2000 (FOIA) are being considered. As I say – I doubt that this is correct, but I would welcome any contrary (or concurring) opinions.

(By the way, I at first thought there was a more fundamental error in the judgment: the judge found that a rule of law was engaged which ordinarily would have required the Chief Constable to send the second reference:

the public law duty of honesty and integrity would ordinarily have demanded that the Chief Constable send the Regulatory Body something more than the anodyne reference about the claimant [¶93]

If a rule of law necessitates disclosure of personal data, then the exemption at section 35 DPA removes the requirement to process that data fairly and lawfully. However, I think the answer lies in the use of the word “ordinarily”: in this instance the doctrine of legitimate expectation (which the claimant could rely upon) meant that the public law duty to send the second reference didn’t apply. So section 35 DPA wasn’t engaged.)

7 Comments

Filed under Confidentiality, Data Protection, human rights, police