Category Archives: Confidentiality

Praise where it’s due, but the senior people aren’t listening

A few months ago I had to attend a clinic at a large hospital (nothing embarrassing, nothing serious, but I’m not going to disclose my sensitive personal data). Said hospital is, as are so many these days, crumbling under a lack of resources. In the past I’ve been to other clinics at the same hospital and been concerned to note that they are often run from areas that are little better than corridors, with no real physical data security measures in place – files left out on tables, computer screens open to view by bystanders etc.

However, on this occasion as I approached the healthcare assistant – let’s call her “Anne” – who appeared to be running the clinic (sure enough effectively in a corridor), I noticed she kept the clinic list carefully shielded from my eyes, and when I gave my name she retrieved my file from a row of all the others hidden under a long strip of blue hospital paper (you know, the stuff on big rolls like kitchen towels).

I said how impressed I was at her simple but effective attempt to protect patient confidentiality under difficult circumstances, and said I was chairman of NADPO so knew a bit whereof I spoke. A little bit later Anne called me from my seat and I thought it was to take me to my appointment. However, she took me to her manager, and they explained that Anne had previously been criticised by one of the clinic consultants, who felt the blue paper was inconveniencing him, and who would at times remove it and throw it away.

So, I thought I’d write a letter – to the Chief Executive of the NHS Trust, copied to its Medical Records Manager, and Anne herself – praising her actions.

I completely forgot about it, but yesterday, out of nowhere, I received a card. It was from Anne saying that she’d received my copy letter, although she hadn’t heard from anyone else (not the Chief Executive nor the Medical Records Manager). She said that the letter was the nicest thing that had happened to her at work in 16 years.

I think this illustrates several things:

1) the NHS, and the public sector in general, are overstretched, and confidentiality is potentially compromised as a result;
2) even in times of austerity low-cost information security measures can be effectively implemented;
3) sometimes people lower down are frustrated by, or even undermined by, those above them;
4) compliments are enormously valuable, and too rarely offered.

But there’s one final point. Anne had said in her card to me “I hope [the Chief Executive] wrote and thanked you”. Well no, she didn’t. And nor did the Medical Records Manager nor anyone else in the Hospital Trust. Only Anne had the courtesy to do so, and she was not the one the message needed to get through to. I’d like to name (and slightly shame) the Trust, but I’d then identify “Anne”, and I don’t want to do that.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

2 Comments

Filed under Confidentiality, Data Protection, NHS

Brooks Newmark, the press, and “the other woman”

UPDATE: 30.09.14 Sunday Mirror editor Lloyd Embley is reported by the BBC and other media outlets to have apologised for the use of women’s photos (it transpires that two women’s images were appropriated), saying

We thought that pictures used by the investigation were posed by models, but we now know that some real pictures were used. At no point has the Sunday Mirror published any of these images, but we would like to apologise to the women involved for their use in the investigation

What I think is interesting here is the implicit admission that (consenting) models could have been used in the fake profiles. Does this mean, therefore, that the processing of the (non-consenting) women’s personal data was not done in the reasonable belief that it was in the public interest?

Finally, I think it’s pretty shoddy that former Culture Secretary Maria Miller resorts to victim-blaming, and misses the point, when she is reported to have said that the story “showed why people had to be very careful about the sorts of images they took of themselves and put on the internet”.

END UPDATE.

With most sex scandals involving politicians, there is “the other person”. For every Profumo, a Keeler;  for every Mellor, a de Sancha; for every Clinton, a Lewinsky. More often than not the rights and dignity of these others are trampled in the rush to revel in outrage at the politicians’ behaviour. But in the latest, rather tedious, such scandal, the person whose rights have been trampled was not even “the other person”, because there was no other person. Rather, it was a Swedish woman* whose image was appropriated by a journalist without her permission or even her knowledge. This raises the question of whether such use, by the journalist, and the Sunday Mirror, which ran the exposé, was in accordance with their obligations under data protection and other privacy laws.

The story run by the Sunday Mirror told of how a freelance journalist set up a fake social media profile, purportedly of a young PR girl called Sophie with a rather implausible interest in middle-aged Tory MPs. He apparently managed to snare the Minister for Civil Society and married father of five, Brooks Newmark, and to encourage him to send explicit photographs of himself. The result was that the newspaper got a lurid scoop, and the Minister subsequently resigned. Questions are being asked about the ethics of the journalism involved, and there are suggestions that this could be the first difficult test for IPSO, the new Independent Press Standards Organisation.

But for me much the most unpleasant part of this unpleasant story was that the journalist appears to have decided to attach to the fake twitter profile the image of a Swedish woman. It’s not clear where he got this from, but it is understood that the same image had apparently already appeared on several fake Facebook accounts (it is not suggested, I think, that the same journalist was responsible for those accounts). The woman is reported to be distressed at the appropriation:

It feels really unpleasant…I have received lot of emails, text messages and phone calls from various countries on this today. It feels unreal…I do not want to be exploited in this way and someone has used my image like this feels really awful, both for me and the others involved in this. [Google translation of original Swedish]

Under European and domestic law the image of an identifiable individual is their personal data. Anyone “processing” such data as a data controller (“the person who (either alone or jointly or in common with other persons) determines the purposes for which and the manner in which any personal data are, or are to be, processed”) has to do so in accordance with the law. Such processing as happened here, both by the freelance journalist, when setting up and operating the social media account(s), and by the Sunday Mirror, in publishing the story, is covered by the UK Data Protection Act 1998 (DPA). This will be the case even though the person whose image was appropriated is in Sweden. The DPA requires, among other things, that processing of personal data be “fair and lawful”. It affords aggrieved individuals the right to bring civil claims for compensation for damage and distress arising from contraventions of data controllers’ obligations under the DPA. It also affords them the right to ask the Information Commissioner’s Office (ICO) for an assessment of the likelihood (or not) that processing was in compliance with the DPA.

However, section 32 of the DPA also gives journalism a very broad exemption from almost all of the Act, if the processing is undertaken with a view to publication, and the data controller reasonably believes that publication would be in the public interest and that compliance with the DPA would be incompatible with the purposes of journalism. As the ICO says

The scope of the exemption is very broad. It can disapply almost all of the DPA’s provisions, and gives the media a significant leeway to decide for themselves what is in the public interest

The two data controllers here (the freelancer and the paper) would presumably have little problem satisfying a court, or the ICO, that when it came to processing of Brooks Newmark’s personal data, they acted in the reasonable belief that the public interest justified the processing. But one wonders to what extent they even considered the processing of (and associated intrusion into the private life of) the Swedish woman whose image was appropriated. Supposing they didn’t even consider this processing – could they reasonably say that they reasonably believed it to have been in the public interest?

These are complex questions, and the breadth and ambit of the section 32 exemption are likely to be tested in litigation between the mining and minerals company BSG and the campaigning group Global Witness (currently stalled/being considered at the ICO). But even if a claim or complaint under DPA would be a tricky one to make, there are other legal issues raised. Perhaps in part because of the breadth of the section 32 DPA exemption (and perhaps because of the low chance of significant damages under the DPA), claims of press intrusion into private lives are more commonly brought under the cause of action of “misuse of private information”, confirmed – it would seem – as a tort, in the ruling of Mr Justice Tugendhat in Vidal Hall and Ors v Google Inc [2014] EWHC 13 (QB), earlier this year. Damage awards for successful claims in misuse of private information have been known to be in the tens of thousands of pounds – most notably recently an award of £10,000 for Paul Weller’s children, after photographs taken covertly and without consent had been published in the Mail Online.

IPSO expects journalists to abide by the Editors’ Code, Clause 3 of which says

i) Everyone is entitled to respect for his or her private and family life, home, health and correspondence, including digital communications.

ii) Editors will be expected to justify intrusions into any individual’s private life without consent. Account will be taken of the complainant’s own public disclosures of information

and the ICO will take this Code into account when considering complaints about journalistic processing of personal data. One notes that “account will be taken of the complainant’s own public disclosures of information”, but one hopes that this would not be seen to justify the unfair and unethical appropriation of images found elsewhere on the internet.

*I’ve deliberately, although rather pointlessly – given their proliferation in other media – avoided naming the woman in question, or posting her photograph

4 Comments

Filed under Confidentiality, consent, Data Protection, Information Commissioner, journalism, Privacy, social media

Data protection implications of MPs crossing the floor

Douglas Carswell MP is a data controller.

It says so on the Information Commissioner’s register:

[screenshot: Douglas Carswell’s entry on the ICO’s register of data controllers]

(I hope he remembers to renew the registration when it expires next week; it’s a criminal offence to process personal data as a data controller without a registration, unless you have an exemption).

But, more directly, he is a data controller because as an MP he is a person who determines the purposes for which and the manner in which the personal data of his constituents is processed.  Sensible guidance for MPs is provided by Parliament itself

A Member is the data controller for all personal data that is handled by their office and they have overall responsibility for ensuring that this is done in accordance with the DPA.

I have already written recently raising some concerns about Carswell’s alleged handling of constituents’ personal data. But this week he decided to leave the Conservative Party, resign his seat, and seek re-election as a member of the UKIP party. James Forsyth, in the Daily Mail, talks about the constituency knowledge Carswell will bring to UKIP, and reports that “one senior Ukip figure purrs: ‘The quality of Douglas’s data is amazing'”.

As a data controller an MP must process constituents’ personal data in accordance with the eight data protection principles of the Data Protection Act 1998 (DPA). Failure to do so is a contravention of the data controller’s obligation under section 4(4). Data subjects can bring legal claims for compensation for contravention of that obligation, and for serious contraventions the ICO can take enforcement action, including the serving of monetary penalty notices to a maximum of £500,000.

The second data protection principle requires that

Personal data shall be obtained only for one or more specified and lawful purposes, and shall not be further processed in any manner incompatible with that purpose or those purposes

A person’s political opinions are “sensitive personal data”, afforded even greater protection under the DPA. It is not difficult to understand the historical basis for this, nor, indeed, the current basis for its still being so. Data protection law is in part an expression of and development of rights which were recognised by the drafters of the Universal Declaration of Human Rights and European Convention on Human Rights. Oppression of people on the basis of their politics was and remains distressingly common.

If constituents have given Carswell their details on the basis that they would be processed as part of his constituency work as a Conservative MP, they might rightly be aggrieved if that personal data were then used by him in pursuit of his campaign as a UKIP candidate. As Paul Bernal tweeted

If I gave my data to help the Tories and found it was being used to help UKIP I’d be livid

Such use would also potentially be in breach of the first data protection principle, which requires that personal data be processed fairly and lawfully. It would not be fair to share data with a political party or for the purposes of furthering its aim in circumstances where the data subject was not aware of this, and might very reasonably object. And it would not be lawful if the data were, for instance, disclosed to UKIP in breach of confidence.

An interesting twitter discussion took place this morning about whether this apparent use of constituents’ data might even engage the criminal law provisions of the DPA. As well as Carswell, there may be other data controllers involved: if some of the data he was in possession of was for instance, being processed by him on behalf of, say, the Conservative Party itself, then the latter would be data controller. Section 55 of the DPA creates, in terms, an offence of unlawfully disclosing personal data without the consent of the data controller. However, as was agreed on twitter, this would be a complex knot to unpick, and it is unlikely, to say the least, that either the ICO or the CPS would want to pursue the matter.

Notwithstanding this, there are serious questions to be asked about the DPA implications of any MP crossing the floor. The use of personal data is likely to be a key battleground in the forthcoming general election, and to throw an even sharper focus on European data protection reform. I would argue that this is a subject which the ICO needs to get a grip on, and quickly.

UPDATE: Paul Bernal has written a superb piece on the broader ethical issues engaged here.

4 Comments

Filed under Confidentiality, Data Protection, human rights, Information Commissioner

Big Political Data

I’ve written over the past few months about questionable compliance by the Conservative, Labour, Liberal Democrat and Scottish National Parties with their obligations under the Data Protection Act 1998 and the Privacy and Electronic Communications (EC Directive) Regulations 2003. And, as I sat down to write this post, I thought I’d check a couple of other parties’ sites, and, sure enough, similar issues are raised by the UKIP and Plaid Cymru sites:

[screenshots from the UKIP and Plaid Cymru websites]

No one except a few enthusiasts in this area of law/compliance seems particularly concerned, and I will, no doubt, eventually get fed up with the dead horse I am flogging. However, a fascinating article in The Telegraph by James Kirkup casts a light on just why political parties might be so keen to harvest personal data, and not be transparent about their uses of it.

Kirkup points out how parties have begun an

extraordinarily extensive – and expensive – programme of opinion polls and focus groups generating huge volumes of data about voters’ views and preferences…Traditional polls and focus groups have changed little in the past two decades. They help parties discover what voters think, what they want to hear, and how best to say it to them. That is the first stage of campaigning. The second is to identify precisely which voters you need to speak to. With finite time and resources, parties cannot afford to waste effort either preaching to the converted or trying to win over diehard opponents who will never change sides. The party that finds the waverers in the middle gains a crucial advantage.

It seems clear to me that the tricks, and opacity, which are used to get people to give up their personal information, are part of this drive to amass more and more data for political purposes. It’s unethical, it’s probably unlawful, but few seem to care, and no one, including the Information Commissioner’s Office (which has, in the past, taken robust action against dodgy marketing practices in party politics), has seemed prepared so far to do anything to prevent it. However, the ICO has good guidance for the parties on this, and in May this year, issued a warning to play by the marketing rules in the run-up to local and European elections. Let’s hope this warning, and the threat of enforcement action, extends to the bigger stage of the national elections next year.

2 Comments

Filed under Confidentiality, consent, Data Protection, Information Commissioner, marketing, PECR, Privacy

DVLA, disability and personal data

Is the DVLA’s online vehicle-checker risking the exposure of sensitive personal data of registered keepers of vehicles?

The concept of “personal data”, in the Data Protection Act 1998 (DPA) (and, beyond, in the European Data Protection Directive 95/46/EC) can be a slippery one. In some cases, as the Court of Appeal recognised in Edem v The Information Commissioner & Anor [2014] EWCA Civ 92 where it had to untangle a mess that the First-tier tribunal had unnecessarily got itself into, it is straightforward: someone’s name is their personal data. In other cases, especially those which engage the second limb of the definition in section 1(1) of the DPA (“[can be identified] from those data and other information which is in the possession of, or is likely to come into the possession of, the data controller”), it can be profoundly complex (see the House of Lords in Common Services Agency v Scottish Information Commissioner (Scotland) [2008] UKHL 47, a judgment which, six years on, still makes data protection practitioners wake up in the night screaming).

When I first looked at the reports that the DVLA’s Vehicle Tax Check service enabled people to see whether the registered owner of a car was disabled, I thought this might fall into the complex category of data protection issues. On reflection, I think it’s relatively straightforward.

I adopt the excellent analysis by the benefitsandwork.co.uk site

A new vehicle check service on the DVLA website allows visitors to find out whether their neighbours are receiving the higher rate of the mobility component of disability living allowance (DLA) or either rate of the mobility component of personal independence payment (PIP)…The information that DVLA are making available is not about the vehicle itself. Instead they are publishing personal information about the benefits received by the individual who currently owns the car or for whom the car is solely used.

It’s difficult to argue against this, although it appears the DVLA are trying, because they responded to the initial post by saying

The Vehicle Enquiry Service does not include any personal data. It allows people to check online what information DVLA holds about a vehicle, including details of the vehicle’s tax class to make sure that local authorities and parking companies do not inadvertently issue parking penalties where parking concessions apply. There is no data breach – the information on a vehicle’s tax class that is displayed on the Vehicle Enquiry Service does not constitute personal data. It is merely a descriptive word for a tax class

but, as benefitsandwork say, that is only true insofar as the DVLA are publishing the tax band of the car, but when they are publishing that the car belongs to a tax-exempt category for reasons of the owner’s disability, they are publishing something about the registered keeper (or someone they care for, or regularly drive), and that is sensitive personal data.

What DVLA is doing is not publishing the car’s tax class – that remains the same whoever the owner is – they are publishing details of the exempt status of the individual who currently owns it. That is personal data about the individual, not data about the vehicle

As the Information Commissioner’s guidance (commended by Moses LJ in Edem) says

Is the data being processed, or could it easily be processed, to: learn; record; or decide something about an identifiable individual, or; as an incidental consequence of the processing, either: could you learn or record something about an identifiable individual; or could the processing have an impact on, or affect, an identifiable individual

Ultimately benefitsandwork’s example (where someone was identified from this information) unavoidably shows that the information can be personal data: if someone can search the registration number of a neighbour’s car, and find out that the registered keeper is exempt from paying the road fund licence for reasons of disability, that information will be the neighbour’s personal data, and it will have been disclosed to them unfairly, and in breach of the DPA (because no condition for the disclosure in Schedule 3 exists).

I hope the DVLA will rethink.

11 Comments

Filed under Confidentiality, Data Protection, Directive 95/46/EC, disability, Information Commissioner, Privacy

The Partridge Review reveals apparently huge data protection breaches

Does the Partridge Review of NHS transfers of hospital episode patient data point towards one of the biggest DPA breaches ever?

In February this year Tim Kelsey, NHS England’s National Director for Patients and Information, and vocal cheerleader for the care.data initiative, assured the public, in an interview on the Radio 4 Today programme, that in the twenty five years that Hospital Episode Statistics (HES) have been shared with other organisations

the management of the hospital episode database…there has never been a single example of that data being compromised, the privacy of patients being compromised…

When pressed by medConfidential‘s Phil Booth about this, and about risks of reidentification from the datasets, Tim repeated that no patient’s privacy had been compromised.

Some of us doubted this, as news of specific incidents of data loss emerged, and even more so as further news emerged suggesting that there had been transfers (a.k.a. sale) of huge amounts of potentially identifiable patient data to, for instance, the Institute and Faculty of Actuaries. The latter news led me to ask the Information Commissioner’s Office (ICO) to assess the lawfulness of this processing, an assessment which has not been completed four months later.

However, with the publication on 17 June of Sir Nick Partridge’s Review of Data Releases by the NHS Information Centre, one questions the basis for Tim’s assertions. Sir Nick commissioned PwC to analyse a total of 3059 data releases between 2005 and 2013 (when the NHS Information Centre (NHSIC) ceased to exist, and was replaced by the Health and Social Care Information Centre (HSCIC)). The summary report to the Review says that

It disappoints me to report that the review has discovered lapses in the strict arrangements that were supposed to be in place to ensure that people’s personal data would never be used improperly

and it reveals a series of concerning and serious failures of data governance, including

  • lack of detailed records between 1 April 2005 and 31 March 2009
  • two cases of data that was apparently released without a proper record remaining of which organisation received the data
  • [no] evidence that Northgate [the NHSIC contractor responsible for releases] got permission from the NHS IC before making releases as it was supposed to do
  • PwC could not find records to confirm full compliance in about 10% of the sample

 Sir Nick observes that

 the system did not have the checks and balances needed to ensure that the appropriate authority was always in place before data was released. In many cases the decision making process was unclear and the records of decisions are incomplete.

and crucially

It also seems clear that the responsibilities of becoming a data controller, something that happens as soon as an organisation receives data under a data sharing agreement, were not always clear to those who received data. The importance of data controllers understanding their responsibilities remains vital to the protection of people’s confidentiality

(This resonates with my concern, in my request to the ICO to assess the transfer of data from HES to the actuarial society, about what the legal basis was for the latter’s processing).

Notably, Sir Nick dispenses with the idea that data such as HES was anonymised:

The data provided to these other organisations under data sharing agreements is not anonymised. Although names and addresses are normally removed, it is possible that the identity of individuals may be deduced if the data is linked to other data

 And if it was not anonymised, then the Data Protection Act 1998 (DPA) is engaged.

All of this indicates a failure to ensure that “appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data”, which the perspicacious among you will identify as one of the key statutory obligations placed on data controllers by the seventh data protection principle in the DPA.

Sir Nick may say

 It is a matter of fact that no individual ever complained that their confidentiality had been breached as a result of data being shared or lost by the NHS IC

but simply because no complaint was made (at the time – complaints certainly have been made since concerns started to be raised) does not mean that the seventh principle was not contravened, in a serious way.  And a serious contravention of the DPA of a kind likely to cause substantial damage or substantial distress can potentially lead to the ICO serving a monetary penalty notice (MPN) to a maximum of £500,000 (at least for contraventions after April 2010, when the ICO’s powers commenced).

The NHSIC is no more (although as Sir Nick says, HSCIC “inherited many of the NHS IC’s staff and procedures”). But that has not stopped the ICO serving MPNs on successor organisations in circumstances where their predecessors committed the contravention. One waits with interest to see whether the ICO will take any enforcement action, but I think it’s important that they consider doing so, because, even though Sir Nick makes nine very sensible recommendations to HSCIC, one could be forgiven – having been given clear assurances previously, by the likes of Tim Kelsey and others – for having reservations as to future governance of our confidential medical data. I would suggest it is imperative that HSCIC know that their processing of personal data is now subject to close oversight by all relevant regulatory bodies.

2 Comments

Filed under care.data, Confidentiality, Data Protection, data sharing, Information Commissioner, monetary penalty notice, NHS, Privacy

A public interest test in the Data Protection Act?

Mr Justice Cranston has suggested that there is a public interest factor when considering whether disclosure of personal data would be “fair” processing. I’m not sure that is right.

The first data protection principle (DPP1) in Schedule 1 of the Data Protection Act 1998 (DPA) says that personal data must be processed “fairly” (and lawfully). But what does “fairly” mean?

In an interesting recent case (AB v A Chief Constable [2014] EWHC 1965 (QB)) the High Court determined that, on the very specific facts, it would not be fair, in terms of DPP1, and common law legitimate expectation, for a Chief Constable to send a second, non-standard, reference to the new employer of a senior police officer who was subject to disciplinary investigation. (The judgment merits close reading – this was by no means a statement of general principle about police references). The reason it would not be fair was that the officer in question had tendered his resignation upon the sending of the initial, anodyne, reference, and the force had terminated misconduct proceedings:

He was thus in the position that for the Force to send the second reference would most likely leave him without employment and without the opportunity to refute the gross misconduct allegations. In these special circumstances it would be a breach of the Data Protection Act 1998 and undermine his legitimate expectations for the second reference to be sent [¶94]

Something in particular struck me about the judge’s analysis of DPP1, although, given the outcome, it was not determinative. He rejected a submission from the claimant officer that the duty of fairness in DPP1 and the European Data Protection Directive was a duty to be fair primarily to the data subject. Rather, correctly identifying that the privacy rights in the Directive and the DPA are grounded in article 8 of the European Convention on Human Rights and in general principles of EU law, he held that

The rights to private and family life in Article 8 are subject to the countervailing public interests set out in Article 8(2). So it is here: assessing fairness involves a balancing of the interests of the data subject in non-disclosure against the public interest in disclosure [¶75]

I am not sure this is right. Recital 28 of the Directive says

Whereas any processing of personal data must be lawful and fair to the individuals concerned [emphasis added]

and recital 38 suggests that whether processing is “fair” is in large part dependent on whether the data subject is made aware of the processing and the circumstances under which it takes place. These recitals give way to the descriptions in Articles 10 and 11 which both talk about “fair processing in respect of the data subject” (again, emphasis added). Similarly Part II of Schedule One to the DPA provides interpretation to DPP1, and says that in determining whether personal data are processed fairly

regard is to be had to the method by which they are obtained, including in particular whether any person from whom they are obtained is deceived or misled as to the purpose or purposes for which they are to be processed

Admittedly this introduces “any person”, which could be someone other than the data subject, but more general considerations of public interest are absent. It is also notable that the Information Commissioner’s position in guidance seems predicated solely on the belief that it is the data subject’s interests that are engaged in an analysis of “fairness”. The guidance does concede that processing might cause some detriment to the individual without it being unfair, but I do not think this is the same as taking into account a public interest in disclosure.

To the extent that a public interest test does manifest itself in DPP1, it is normally held to be in the conditions in Schedules 2 and 3. DPP1 says that, in addition to the obligation to process personal data fairly and lawfully, a condition in Schedule 2 (and, for sensitive personal data, Schedule 3) must be met. Many of these conditions contain tests as to whether the processing is “necessary”, and that “necessity test” constitutes a proportionality test, as described by Latham LJ in Corporate Officer of the House of Commons v The Information Commissioner & Ors [2008] EWHC 1084 (Admin)

‘necessary’…should reflect the meaning attributed to it by the European Court of Human Rights when justifying an interference with a recognised right, namely that there should be a pressing social need and that the interference was both proportionate as to means and fairly balanced as to ends

To import a public interest test into the word “fairly” in DPP1 seems to me to be a potentially radical step, especially when disclosures of personal data under the Freedom of Information Act 2000 (FOIA) are being considered. As I say – I doubt that this is correct, but I would welcome any contrary (or concurring) opinions.

(By the way, I at first thought there was a more fundamental error in the judgment: the judge found that a rule of law was engaged which ordinarily would have required the Chief Constable to send the second reference:

the public law duty of honesty and integrity would ordinarily have demanded that the Chief Constable send the Regulatory Body something more than the anodyne reference about the claimant [¶93]

If a rule of law necessitates disclosure of personal data, then the exemption at section 35 DPA removes the requirement to process that data fairly and lawfully. However, I think the answer lies in the use of the word “ordinarily”: in this instance the doctrine of legitimate expectation (which the claimant could rely upon) meant that the public law duty to send the second reference didn’t apply. So section 35 DPA wasn’t engaged.)

7 Comments

Filed under Confidentiality, Data Protection, human rights, police

Opting patients out of care.data – in breach of data protection law?

The ICO appear to think that GPs who opt patients out of care.data without informing them would be breaching the Data Protection Act.  They say it would be unfair processing

In February of this year GP Dr Gordon Gancz was threatened with termination of his contract, because he had indicated he would not allow his patients’ records to be uploaded to the national health database which was planned to be created under the care.data initiative. He was informed that if he didn’t remove information on his website, and if he went on to add “opt-out codes” to patients’ electronic records, he would be in breach of the NHS (GMS contract) Regulations 2004. Although this threatened action was later withdrawn, and care.data put on hold for six months, Dr Gancz might have been further concerned to hear that in the opinion of the Information Commissioner’s Office (ICO) he would also have been in breach of the Data Protection Act 1998 (DPA).

A few weeks ago fellow information rights blogger Tim Turner (who has given me permission to use the material) asked NHS England about the basis for Health Services Minister Dan Poulter’s statement in Parliament that

NHS England and the Health and Social Care Information Centre will work with the British Medical Association, the Royal College of General Practitioners, the Information Commissioner’s Office and with the Care Quality Commission to review and work with GP practices that have a high proportion of objections [to care.data] on a case-by-case basis

Tim wanted to know what role the ICO would play. NHS England replied saying, effectively, that they didn’t know, but they did disclose some minutes of a meeting held with the ICO in December 2013. Those minutes indicate that

The ICO had received a number of enquiries regarding bulk objections from practices. Their view was that adding objection codes would constitute processing of data in terms of the Data Protection Act.  If objection codes had been added without writing to inform their patients then the ICO’s view was that this would be unfair processing and technically a breach of the Act so action could be taken by the ICO

One must stress that this is not necessarily a complete or accurate representation of the ICO’s views. However, what appears to be being said here is that, if GPs took the decision to “opt out” their patients from care.data, without writing to inform them, this would be an act of “processing” according to the definition at section 1(1) of the DPA, and would not be compliant with the GPs’ obligations under the first DPA principle to process personal data fairly.

On a very strict reading of the DPA this may be technically correct – for processing of personal data to be fair data subjects must be informed of the purposes for which the data are being processed, and, strictly, adding a code which would prevent an upload (which would otherwise happen automatically) would be processing of personal data. And, of course, the “fairness” requirement is absent from the proposed care.data upload, because Parliament, in its wisdom, decided to give the NHS the legal power to override it. But “fairness” requires a broad brush, and the ICO’s interpretation here would have the distinctly odd effect of rendering unlawful a decision to maintain the status quo whereby patients’ GP data does not leave the confidential confines of their surgery. It also would have the effect of supporting NHS England’s apparent view that GPs who took such action would be liable to sanctions.

In fairness (geddit???!!) to the ICO, if a patient was opted out who wanted to be included in the care.data upload, then I agree that this would be in breach of the first principle, but it would be very easily rectified, because, as we know, it will be simple to opt-in to care.data from a previous position of “opt-out”, but the converse doesn’t apply – once your data is uploaded it is uploaded in perpetuity (see my last bullet point here).

A number of GPs (and of course, others) have expressed great concern at what care.data means for the confidential relationship between doctor and patient, which is fundamental for the delivery of health care. In light of those concerns, and in the absence of clarity about the secondary uses of patient data under care.data, would it really be “unfair” to patients if GPs didn’t allow the data to be collected? Is that (outwith DPA) fair to GPs?

Leave a comment

Filed under care.data, Confidentiality, Data Protection, data sharing, Information Commissioner, NHS

A balanced view on Optic Nerve

As I’m keen always to take a balanced view of important privacy issues, and not descend into the sort of paranoid raving which always defines, say, the state as the enemy, capable of almost anything, I sometimes think I end up being a bit naive, or at least having naive moments.

So, when outgoing Chair of Ofcom Dame Colette Bowe recently gave evidence to the House of Lords Select Committee on Communications, and said about consumers that

their smart TV may well have a camera and a microphone embedded in it there in their living room. What is that smart TV doing? Do people realise that this is a two-way street?

I thought for a moment “Oh come on, don’t be so scaremongering”. Sure, we saw the stories about Smart TVs and cookies, which is certainly an important privacy issue, but the idea that someone would use your TV to spy on you…?!

And then, of course, I quickly remembered – with a feeling of nausea – that that is exactly the sort of thing that GCHQ are alleged to have done, by jumping on the unencrypted web cam streams of Yahoo users, as part of the Optic Nerve program. And each time I remember this, it makes me want to scream “THEY WERE INDISCRIMINATELY SPYING ON PEOPLE…IN THEIR HOMES, IN THEIR BEDROOMS, FOR ****’S SAKE!”

And they were doing it just because they could. Because they’d noticed a way – a vulnerability – and taken advantage of it to slurp masses of intensely private data, just in case it might prove useful in the future.

The intrusion, the prurience, the violation do indeed make me feel like raving against the state and its agents who, either through direct approval, or tacit acceptance, or negligence, allowed this to happen. Although *balance alert* GCHQ do, of course, assure us that “GCHQ insists all of its activities are necessary, proportionate, and in accordance with UK law”. So that’s OK. And yes, they really did call it “proportionate”. 

I know the web cam grabbing was by no means the only such intrusion, but for me it exemplifies the “something” which went wrong, at some point, which led to this. I don’t know what that something was, or even how to fix it, and I’ve never used a web cam, so have no direct interest, but I will closely watch the progress of Simon Davies’ request for the Attorney General to refer the matter to the police.

Leave a comment

Filed under Confidentiality, Data Protection, human rights, interception, Privacy, RIPA, surveillance

Hospital records sold to insurance companies – in breach of the Data Protection Act?

I’ve asked the ICO to assess whether the sale of millions of health records to insurance companies so that they could “refine” their premiums was compliant with the law

I’m about to disclose some sensitive personal data: I have been to hospital a few times over recent years…along with 47 million other people, whose records from these visits, according to reports in the media, were sold to an actuarial society for insurance premium purposes. The Telegraph reports

a report by a major UK insurance society discloses that it was able to obtain 13 years of hospital data – covering 47 million patients – in order to help companies “refine” their premiums.

As a result they recommended an increase in the costs of policies for thousands of customers last year. The report by the Staple Inn Actuarial Society – a major organisation for UK insurers – details how it was able to use NHS data covering all hospital in-patient stays between 1997 and 2010 to track the medical histories of patients, identified by date of birth and postcode.
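
It is worth pausing on that phrase “identified by date of birth and postcode”: those two fields alone are often enough to link a supposedly de-identified record back to a named individual. The following is a minimal, purely illustrative sketch of such a linkage – every field name and value in it is invented, and none of it reflects the real HES extract or the actuarial society’s data:

```python
# Purely hypothetical illustration of re-identification by linkage.
# The datasets and field names are invented for this example; the point
# is only that removing names does not, by itself, anonymise a record.

# A "de-identified" hospital record: no name, but date of birth and
# postcode are retained.
hospital_records = [
    {"dob": "1957-03-14", "postcode": "EX1 1AA", "episode": "cardiac admission"},
    {"dob": "1980-11-02", "postcode": "SW1A 2BB", "episode": "fracture clinic"},
]

# Information a recipient may already hold about identifiable people,
# for example from the open electoral register or a customer database.
known_individuals = [
    {"name": "A. Neighbour", "dob": "1957-03-14", "postcode": "EX1 1AA"},
]

# Joining the two datasets on the shared fields re-identifies the record.
for record in hospital_records:
    for person in known_individuals:
        if (record["dob"], record["postcode"]) == (person["dob"], person["postcode"]):
            print(f'{person["name"]} appears in the hospital data: {record["episode"]}')
```

Very few people share both a full date of birth and a postcode, which is why data of this kind is better described as potentially identifiable than as anonymous.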

I don’t know if this use of my sensitive personal data (if it was indeed my personal data) was in compliance with the Data Protection Act 1998 (DPA), although sadly I suspect that it was, but section 42 of the DPA allows a data subject to request the Information Commissioner to make an assessment as to whether it is likely or unlikely that the processing has been or is being carried out in compliance with the provisions of the DPA. So that’s what I’ve done:

Hi

As a data subject with a number of hospital episodes over recent years I am disturbed to hear that the Hospital Episode Statistics (HES) of potentially 47 million patients were disclosed to Staple Inn Actuarial Society (SIAS), apparently for the purposes of helping insurance companies “refine” their premiums. I became aware of this through reports in the media (e.g. http://www.telegraph.co.uk/health/healthnews/10656893/Hospital-records-of-all-NHS-patients-sold-to-insurers.html). I am asking, pursuant to my right under section 42 of the Data Protection Act 1998, the ICO to assess whether various parts of this process were in compliance with the relevant data controllers’ obligations under the DPA:

1) I was not aware, until relatively recently, that HESs were provided to the HSCIC – was this disclosure by hospitals compliant with their DPA obligations?

2) Was the general processing (e.g. retention, manipulation, anonymisation, pseudonymisation) of this personal data compliant with HSCIC’s or, to the extent that HSCIC is a data processor to NHS England’s data controller, NHS England’s DPA obligations?

3) Was the disclosure of what appears to have been sensitive personal data (I note the broad definition of “personal data”, and your own guidance on anonymisation) to SIAS compliant with HSCIC’s (or NHS England’s) DPA obligations

4) Was SIAS’s subsequent processing of this sensitive personal data compliant with its DPA obligations?

You will appreciate that I do not have access to some information, so it may be that when I refer to HSCIC or NHS England or SIAS I should refer to predecessor organisations.

Please let me know if you need any further information to make this assessment.

with best wishes, Jon Baines

We’ve been told on a number of occasions recently that we shouldn’t be worried about our GP records being uploaded to HSCIC under the care.data initiative, because our hospital records have been used in this way for so long. Clare Gerada, former Chair of the Council of the Royal College of General Practitioners, wrote in the BMJ that

for 25 years, hospital data have been handled securely with a suite of legal safeguards to protect confidentiality—the exact same safeguards that will continue to be applied when primary care data are added

Well, it seems to me that those legal safeguards might have failed to prevent (indeed, might have actively permitted) a breach involving 47 million records. I’m very interested to know what the Information Commissioner’s assessment will be.

UPDATE: 24 February 2014

An ICO spokesperson later said:

“We’re aware of this story, and will be gathering more information – specifically around whether the information had been anonymised – before deciding what action to take.”

UPDATE: 25 February 2014

At the Health Select Committee hearing into the care.data initiative, HSCIC and NHS England representatives appeared not to know much about what data was disclosed, and in what circumstances, and effectively blamed NHSIC as a predecessor organisation. This echoed the statement from HSCIC the previous evening

The HSCIC believes greater scrutiny should have been applied by our predecessor body prior to an instance where data was shared with an actuarial society

UPDATE: 27 February 2014

GP and Clinical Lecturer Anne Marie Cunningham has an excellent post on what types of data were apparently disclosed by NHSIC (or HSCIC), and subsequently processed by, or on behalf of, SIAS. I would recommend reading the comments as well. It does seem to me that we may still be talking about pseudonymised personal data, which would mean that the relevant data controllers still had obligations under the DPA, and the ICO would have jurisdiction to investigate, and, if necessary, take regulatory action.

See also Tony Hirst’s blog posts on the subject. These are extremely complex issues, but, at a time when the future of the sharing and linking of health and other data is being hotly debated, and when the ICO is seeking feedback on its Anonymisation Code of Practice, they are profoundly important ones.

UPDATE: 14 March 2014

The ICO has kindly acknowledged receipt of my request for assessment, saying it has been passed to their health sector team for “further detailed consideration”.

UPDATE: 24 May 2014

Er, there is no real update. There was a slight hiccup, when the ICO told me it was not making an assessment because “[it] is already aware of this issue and is investigating them accordingly. Given that we do not necessarily require individual complaints to take consider taking further action your case is closed”. After I queried the legal basis for failing to make a section 42 assessment as requested, the position was “clarified”:

…we will make an assessment in relation to this case, however we are unable to do so at this present time…This is because the office is currently investigating whether, as alleged in the media, actual personal data has been shared by the HSCIC to various other organisations including Staple Inn, PA consulting and Google

I don’t criticise the ICO for taking its time to investigate: it involves a complicated assessment of whether the data disclosed was personal data. In a piece I wrote recently for the Society for Computers and Law I described the question of whether data is anonymous or not as a “profound debate”. And it is also highly complex. But what this delay, in assessing just one aspect of health data disclosure, does show, is that the arbitrary six-month delay to the implementation of care.data was never going to be sufficient to deal with all the issues, and to assure the public, and medical practitioners, sufficiently to enable it to proceed. A vote on 23 May by the BMA’s Local Medical Committees conference emphatically illustrates this.

13 Comments

Filed under care.data, Confidentiality, Data Protection, data sharing, Information Commissioner, NHS, Privacy