Category Archives: data sharing

Health data breaches – missing the point?

Breaches of the DPA are not always about data security. I’m not sure NHS England have grasped this. Worse, I’m not sure the ICO understands public concern about what is happening with confidential medical information. They both need to listen.

Proponents of the care.data initiative have been keen to reassure us of the safeguards in place for any GP records uploaded to the Health and Social Care Information Centre (HSCIC), by saying that similar data from hospitals (Hospital Episode Statistics, or HES) has been uploaded safely for about two decades. Thus, Tim Kelsey, National Director for Patients and Information in the National Health Service, said on Twitter recently that there had been

No data breach in SUS*/HES ever

I’ve been tempted to point out that this is a bit like a thief arguing that he’s been stealing from your pockets for twenty years, so why complain when you catch him stealing from your wallet? However, whether Tim’s claim is true or not partly depends on how you define a “breach”, and I suspect he is thinking of some sort of inadvertent serious loss of data, in breach of the seventh (data security) principle of the Data Protection Act 1998 (DPA). Whether there have been any of those is one issue, and, in the absence of transparency about how HES processing has been audited, I don’t know how he is so sure (an FOI request for audit information is currently stalled, while HSCIC consider whether commercial interests are, or are likely to be, prejudiced by disclosure). But data protection is not all about data security, and the DPA can be “breached” in other ways.

As I mentioned last week, I have asked the Information Commissioner’s Office to assess the lawfulness of the processing surrounding the apparent disclosure of a huge HES dataset to the Institute and Faculty of Actuaries, whose Society prepared a report based on it (with HSCIC’s logo on it, which rather tends to undermine their blaming the incident on their NHSIC predecessors). My feeling is that this has nothing, or very little, to do with data security – I am sure the systems used were robust and secure – but a lot to do with some of the other DPA principles: primarily the first (processing must be fair and lawful and have an appropriate Schedule 2 and Schedule 3 condition) and the second (“Personal data shall be obtained only for one or more specified and lawful purposes”).

Since the story about the actuarial report, at least three other possible “breaches” have come to light. They are listed in this Register article, but it is the first that has probably caused the most concern. It appears that the entire HES dataset, around one terabyte of pseudonymised (not, note, anonymised) data, was uploaded to Google storage and processed using BigQuery. An apparently rather unconcerned statement from HSCIC (maybe they’ll blame their predecessors again, if necessary) said

The NHS Information Centre (NHS IC) signed an agreement to share pseudonymised Hospital Episodes Statistics data with PA Consulting in November 2011…PA Consulting used a product called Google BigQuery to manipulate the datasets provided and the NHS IC was aware of this. The NHS IC had written confirmation from PA Consulting prior to the agreement being signed that no Google staff would be able to access the data; access continued to be restricted to the individuals named in the data sharing agreement

So that’s OK then? Well, not necessarily. Google’s servers (and, remember, “cloud” really means “someone else’s computer”) are dotted around the world, although mostly in the US, and when you upload data to the cloud, one of the problems (or benefits) is that you don’t have, or don’t tend to think you have, a real say in where it is hosted. By a certain argument, this even makes the cloud provider, in DPA terms, a data controller, because it is partly determining “the manner in which any personal data are, or are to be, processed”. If the hosting is outside the European Economic Area the eighth DPA principle comes into play:

Personal data shall not be transferred to a country or territory outside the European Economic Area unless that country or territory ensures an adequate level of protection for the rights and freedoms of data subjects in relation to the processing of personal data

The rather excellent Tim Gough, who is producing some incredibly helpful stuff on his site, has a specific page on DPA and the cloud, and I commend it to you. Now, it may be that, because Google has certified itself under the “Safe Harbor” scheme, the eighth principle is deemed to have been complied with, but I’m not sure it’s that straightforward because, in any case, Safe Harbor itself is currently of questionable status and offers questionable assurance.

I don’t know if PA Consulting’s upload of HES data to the cloud was in compliance with their and NHSIC’s/HSCIC’s DPA obligations but, then again, I’m not the regulator of the DPA. So, in addition to last week’s request for assessment, I’ve asked the ICO to assess this processing as well:

Hi again

I don’t yet have any reference number, but please note my previous email for reference. News has now emerged that the entire HES database may have been uploaded to some form of Google cloud storage. Would you also please assess this for compliance with the DPA? I am particularly concerned to know whether it was in compliance with the first, seventh and eighth data protection principles. This piece refers to the alleged upload to Google servers.

best wishes,

However, I’m now genuinely concerned by a statement from the ICO, in response to the news that they are to be given compulsory powers of audit over NHS bodies. They say (in the context of the GP data proposed to be uploaded under the care.data initiative)

The concerns around care.data come from this idea that the health service isn’t particularly good at looking after personal information

I’m not sure if they’re alluding to their own concerns, or the public’s, but I think the statement really misunderstands the public’s worries about care.data, and about the use of medical data in general. From many, many discussions with people, and from reading more about this subject than is healthy, it seems to me that people have a general worry about, and objection to, their confidential medical information being made available to commercial organisations, for the potential profit of the latter, and that this concern stems from the possibility that the processing will lead to them being identified, and adversely affected. If the ICO doesn’t understand this, then I really think they need to start listening. And that, of course, also goes for NHS England.

*“SUS” refers to the Secondary Uses Service of HSCIC and of its predecessor, NHSIC


Filed under: Data Protection, data sharing, Information Commissioner, NHS

Why no prison sentences for misuse of medical data?

So the government, roused from its torpor by the public outrage at the care.data proposals and the apparent sale of 47 million patient records to actuaries, is said to be proposing, as a form of reassurance, amendments to the Care Bill. The Telegraph reports that

Jeremy Hunt will unveil new laws to ensure that medical records can only be released when there is a “clear health benefit” rather than for “purely commercial” use by insurers and other companies.

Ministers will also bolster criminal sanctions for organisations which breach data protection laws by disclosing people’s personal data. Under a “one strike and you’re out” approach, they will be permanently banned from accessing NHS data

One needs to be aware that this is just a newspaper report, and as far as I know it hasn’t been confirmed by the minister or anyone else in the government, but if it is accurate, I fear it shows further contempt for public concerns about the risks to the confidentiality of their medical records.

The first of the reported amendments sounds like statutory backing for the current assurances that patient data will only be made available to third parties for purposes that will benefit the health and social care system (see FAQ 39 in the Guide for GP Practices). It also sounds like a very difficult piece of legislation to draft, and it will be very interesting to see what the proposed amendment actually says. Will it allow secondary use for commercial purposes, as long as the primary use is for a “clear health benefit”? And, crucially, how on earth will it be regulated and enforced? Will properly resourced regulators be allowed to audit third parties’ use of data? I certainly hope so.

The second amendment implies that the Data Protection Act 1998 (DPA) will also be amended. This also sounds like a difficult provision to draft: the Telegraph says

Those that have committed even one prior offence involving patient data will be barred from accessing NHS medical records indefinitely as part of a “one strike and you’re out” approach

But what do we mean by “offence”? The Telegraph falls into the common error of thinking that the Information Commissioner’s Office’s (ICO’s) powers to serve monetary penalty notices (MPNs) of up to £500,000 are criminal justice powers; they are not – MPNs are civil notices, and the money paid is not a “fine” but a penalty. The only relevant current criminal offence in the DPA is that of (in terms) deliberately or recklessly obtaining or disclosing personal data without the authority of the data controller. This is an either-way offence, which means it currently carries a maximum sanction of a £5,000 fine in a magistrates’ court, or an unlimited fine in the Crown Court (although it is very rare for cases to be tried in the latter). Prosecutions under this section (section 55) are generally brought against individuals, because the offence involves obtaining or disclosing the data without the authority of the data controller. It is unlikely that a company would commit a section 55 offence. More likely is that a company would seriously contravene the DPA in a manner which would lead to a (civil) MPN, or to more informal ICO enforcement action. More likely still is simply that the ICO would make a finding of “unlikely to have complied” with the DPA under section 42 – a finding which carries little weight. Are prior civil or informal action, or a section 42 “unlikely to have complied” assessment, going to count for the “one strike and you’re out” approach? And even if they are, what is to stop miscreant individuals or companies functioning through proxies or agents, or even simply lying to get access to the data?

Noteworthy by its absence in the Telegraph reports of the proposed amendments was any reference to the one change to data protection law which actually might have a deterrent effect on those who illegally obtain or disclose personal data – the possibility of being sent to prison. As I and others have written before, all that is needed to achieve this is for the government to commence section 77 of the Criminal Justice and Immigration Act 2008, which would create the power to alter the penalty (including a custodial sentence) for a section 55 DPA offence. However, the government has long been lobbied by certain sections of the press industry not to do so, because of apparent fears that it would give the state the power to imprison investigative journalists (despite the fact that section 78 of the same Act – also uncommenced – creates a new defence for journalistic, literary or artistic purposes). The Information Commissioner has repeatedly called for the law to be changed so that there is a real sanction for serious criminal data protection offences, but to no avail.

Chris Pounder has argued that the custodial sentence provisions (discussion of which was kicked into the long grass which grew up in the aftermath of the Leveson inquiry) might never be introduced. Despite the calls for such strong penalties for misuse of medical data, from influential voices such as Ben Goldacre, the proposals for change outlined by the Telegraph seem to support Dr Pounder’s view.

One of the main criticisms of the disastrous public relations and communications surrounding the care.data initiative is that people’s acute concerns about the security of their medical records have been dismissed with vague or misleading reassurances. With the announcement of these vague and probably ineffectual proposed legal sanctions, what a damned shame that that looks to be continuing.


Filed under: Data Protection, data sharing, Information Commissioner, Leveson, monetary penalty notice, NHS

Hospital records sold to insurance companies – in breach of the Data Protection Act?

I’ve asked the ICO to assess whether the sale of millions of health records to insurance companies, so that they could “refine” their premiums, was compliant with the law.

I’m about to disclose some sensitive personal data: I have been to hospital a few times over recent years…along with 47 million other people, whose records from these visits, according to reports in the media, were sold to an actuarial society for insurance premium purposes. The Telegraph reports

a report by a major UK insurance society discloses that it was able to obtain 13 years of hospital data – covering 47 million patients – in order to help companies “refine” their premiums.

As a result they recommended an increase in the costs of policies for thousands of customers last year. The report by the Staple Inn Actuarial Society – a major organisation for UK insurers – details how it was able to use NHS data covering all hospital in-patient stays between 1997 and 2010 to track the medical histories of patients, identified by date of birth and postcode.
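As a brief technical aside (with entirely invented data, and a hypothetical sketch of my own rather than anything SIAS is reported to have done): the reason date of birth plus postcode is enough to “track the medical histories of patients” is that the pair is a powerful quasi-identifier. Any auxiliary dataset holding the same two fields alongside names can complete the linkage:

```python
# Hypothetical sketch with invented data: linking "pseudonymised" hospital
# records back to named individuals via date of birth and postcode.

pseudonymised_episodes = [
    {"dob": "1975-03-14", "postcode": "SW1A 1AA", "diagnosis": "J45"},
    {"dob": "1982-11-02", "postcode": "M1 2AB", "diagnosis": "E11"},
]

# Any other dataset carrying names alongside the same quasi-identifiers
# (an electoral roll, a marketing list) is enough to re-identify.
auxiliary = [
    {"name": "A. Patient", "dob": "1975-03-14", "postcode": "SW1A 1AA"},
]

def link(episodes, aux):
    """Join the two datasets on the (dob, postcode) pair."""
    index = {(a["dob"], a["postcode"]): a["name"] for a in aux}
    return [
        {"name": index[(e["dob"], e["postcode"])], **e}
        for e in episodes
        if (e["dob"], e["postcode"]) in index
    ]

# One of the two "de-identified" episodes is now attached to a name.
print(link(pseudonymised_episodes, auxiliary))
```

The point is not that this is what actually happened, but that data keyed on such fields is not anonymous in any meaningful sense.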

I don’t know if this use of my sensitive personal data (if it was indeed my personal data) was in compliance with the Data Protection Act 1998 (DPA), although sadly I suspect that it was, but section 42 of the DPA allows a data subject to request the Information Commissioner to make an assessment as to whether it is likely or unlikely that the processing has been or is being carried out in compliance with the provisions of the DPA. So that’s what I’ve done:


As a data subject with a number of hospital episodes over recent years, I am disturbed to hear that the Hospital Episode Statistics (HES) of potentially 47 million patients were disclosed to the Staple Inn Actuarial Society (SIAS), apparently for the purposes of helping insurance companies “refine” their premiums. I became aware of this through reports in the media. I am asking the ICO, pursuant to my right under section 42 of the Data Protection Act 1998, to assess whether various parts of this process were in compliance with the relevant data controllers’ obligations under the DPA:

1) I was not aware, until relatively recently, that HESs were provided to the HSCIC – was this disclosure by hospitals compliant with their DPA obligations?

2) Was the general processing (e.g. retention, manipulation, anonymisation, pseudonymisation) of this personal data compliant with HSCIC’s or, to the extent that HSCIC is a data processor to NHS England’s data controller, NHS England’s DPA obligations?

3) Was the disclosure of what appears to have been sensitive personal data (I note the broad definition of “personal data”, and your own guidance on anonymisation) to SIAS compliant with HSCIC’s (or NHS England’s) DPA obligations?

4) Was SIAS’s subsequent processing of this sensitive personal data compliant with its DPA obligations?

You will appreciate that I do not have access to some information, so it may be that when I refer to HSCIC or NHS England or SIAS I should refer to predecessor organisations.

Please let me know if you need any further information to make this assessment.

with best wishes, Jon Baines

We’ve been told on a number of occasions recently that we shouldn’t be worried about our GP records being uploaded to HSCIC under the care.data initiative, because our hospital records have been used in this way for so long. Clare Gerada, former Chair of the Council of the Royal College of General Practitioners, wrote in the BMJ that

for 25 years, hospital data have been handled securely with a suite of legal safeguards to protect confidentiality—the exact same safeguards that will continue to be applied when primary care data are added

Well, it seems to me that those legal safeguards might have failed to prevent (indeed, might have actively permitted) a breach involving 47 million records. I’m very interested to know what the Information Commissioner’s assessment will be.

UPDATE: 24 February 2014

An ICO spokesperson later said:

“We’re aware of this story, and will be gathering more information – specifically around whether the information had been anonymised – before deciding what action to take.”

UPDATE: 25 February 2014

At the Health Select Committee hearing into the care.data initiative, HSCIC and NHS England representatives appeared not to know much about what data was disclosed, and in what circumstances, and effectively blamed their predecessor organisation, NHSIC. This echoed the statement from HSCIC the previous evening

The HSCIC believes greater scrutiny should have been applied by our predecessor body prior to an instance where data was shared with an actuarial society

UPDATE: 27 February 2014

GP and Clinical Lecturer Anne Marie Cunningham has an excellent post on what types of data were apparently disclosed by NHSIC (or HSCIC), and subsequently processed by, or on behalf of, SIAS. I would recommend reading the comments as well. It does seem to me that we may still be talking about pseudonymised personal data, which would mean that the relevant data controllers still had obligations under the DPA, and the ICO would have jurisdiction to investigate and, if necessary, take regulatory action.

See also Tony Hirst’s blog posts on the subject. These are extremely complex issues, but, at a time when the future of the sharing and linking of health and other data is being hotly debated, and when the ICO is seeking feedback on its Anonymisation Code of Practice, they are profoundly important ones.

UPDATE: 14 March 2014

The ICO has kindly acknowledged receipt of my request for assessment, saying it has been passed to their health sector team for “further detailed consideration”.

UPDATE: 24 May 2014

Er, there is no real update. There was a slight hiccup when the ICO told me it was not making an assessment because “[it] is already aware of this issue and is investigating them accordingly. Given that we do not necessarily require individual complaints to take consider taking further action your case is closed” [sic]. After I queried the legal basis for failing to make a section 42 assessment as requested, the position was “clarified”:

…we will make an assessment in relation to this case, however we are unable to do so at this present time…This is because the office is currently investigating whether, as alleged in the media, actual personal data has been shared by the HSCIC to various other organisations including Staple Inn, PA consulting and Google

I don’t criticise the ICO for taking its time to investigate: it involves a complicated assessment of whether the data disclosed was personal data. In a piece I wrote recently for the Society for Computers and Law I described the question of whether data is anonymous or not as a “profound debate”. And it is also highly complex. But what this delay in assessing just one aspect of health data disclosure does show is that the arbitrary six-month delay to the implementation of care.data was never going to be sufficient to deal with all the issues, and sufficiently assure the public, and medical practitioners, to enable it to proceed. A vote on 23 May by the BMA’s Local Medical Committees conference emphatically illustrates this.


Filed under: Confidentiality, Data Protection, data sharing, Information Commissioner, NHS, Privacy

Big Pharma and care.data

Patients’ identifiable medical data will end up in the hands of large pharmaceutical companies under the care.data initiative. With “Big Pharma” beholden to shareholders, and with its abysmal record on transparency, is this another reason to consider opting out?

We are often told by those publicly defending the care.data programme (I’m thinking particularly of NHS Chief Data Officer Geraint Lewis, and NHS National Director for Patients and Information Tim Kelsey, who at least are prepared to engage with critics – although the latter has a habit of resorting to personal attacks at times) that patients’ identifiable/amber/pseudonymised data will not be made available to commercial organisations to use for their own purposes. So, we are told, it cannot be used for the purposes of selling or administering any kind of insurance, or for marketing purposes. As the pdf of FAQs, to which we are often referred (by Geraint in particular), says

Potentially identifiable data – these data do not include identifiers but may be considered identifiable (e.g. due to a patient in an area having a rare disease or a rare combination of characteristics). There are strict controls around the limited release of such data. For example, there must be a contract in place, the data are only released to approved organisations, and restricted to a specific purposes that will benefit the health and social care system

Let’s ignore for now the awkward question of how these restrictions can effectively be enforced. Let’s also ignore the fact that this data will not simply be “released” – organisations will pay for it, and a commercial organisation, with fiduciary obligations to its owners or shareholders, is not going to pay for something unless there is potential financial benefit.

What I wanted to highlight is that purposes that will benefit the health and social care system will generally boil down to two things: commissioning of services, and research. Regarding the latter, as the NHS Health Research Authority says, this can take many forms and be undertaken by many different bodies, but it will be no big revelation if I point out that vast amounts of research are conducted by, or under the control of, huge pharmaceutical companies – Big Pharma. Doctor and journalist Ben Goldacre has been campaigning for a number of years, following the lead of others such as Iain Chalmers, to expose the fact that an enormous amount of data and results from research – specifically, admittedly, from clinical trials – is withheld by Big Pharma. This led to the setting-up of the AllTrials campaign. As Ben said, on the publication of a damning report by the Public Accounts Committee into the withholding of trial results for Tamiflu

[the] report is a complete vindication of AllTrials’ call for all the results, of all the trials, on all the uses of all currently prescribed treatments. None of the proposed new legislation or codes of conduct come anywhere close to this simple, vital ask. Industry has claimed it is on the verge of delivering transparency for over two decades. While obfuscating and delaying, ever more results have been withheld. Some in industry now claim that results from even a decade ago may be lost and inaccessible. This is both implausible and unacceptable…We cannot make informed decisions about which treatment is best when vitally important information is routinely and legally kept secret. Future generations will look back at this absurd situation in the same way that we look back on mediaeval bloodletting

This is the same industry which will be able to purchase patients’ identifiable medical data, uploaded from their GP records for research purposes. Will the NHS ever see the results of this research if, for instance, those results could have a potentially adverse effect on the companies’ share prices? Will there be any legal or contractual mechanisms in place to ensure that we don’t see similar obfuscating, delaying and withholding of results?

Is it really the insurance and marketing companies we need to worry about?


Filed under: Confidentiality, data sharing, NHS, Privacy

The care.data leaflet campaign – legally necessary?

Readers of this blog [sometimes I imagine them1] may well be fed up with posts about care.data (see here, here and here). But this is my blog and I’ll cry if I want to. So…

Doyen of information rights bloggers, Tim Turner, has written in customary analytic detail on how the current NHS leafleting campaign was not necessitated by data protection law, and on how, despite some indications to the contrary, GPs will not be in the Information Commissioner’s firing line if they fail adequately to inform patients about what will be happening to their medical data.

He’s right, of course: where a data controller is subject to a legal obligation to disclose personal data (other than under a contract) then it is not obliged, pace the otherwise very informative blogpost by the Information Commissioner’s Dawn Monaghan, to give data subjects a privacy, or fair processing notice.

(In passing, and in an attempt to outnerd the unoutnerdable, I would point out that Tim omits that, by virtue of The Data Protection (Conditions under Paragraph 3 of Part II of Schedule 1) Order 2000, if a data subject properly requests a privacy notice in circumstances where a data controller is subject to a legal obligation to disclose personal data (other than under a contract) and would, thus, otherwise not be required to issue one, the data controller must comply2.)

Tim says, though

The leaflet drop is no way to inform people about such a significant step, but I don’t think it is required

That appears to be true, under data protection law, but, under broader obligations imposed on the relevant authorities under Article 8 of the European Convention on Human Rights (ECHR), as incorporated in domestic law in the Human Rights Act 1998, it might not be so (and here, unlike with data protection law, we don’t have to consider the rigid controller/processor dichotomy in order to decide who the relevant, and liable, public authority is, and I would suggest that NHS England (as the “owner of the programme” in Dawn Monaghan’s words) seems the obvious candidate, but GPs might also be caught).

In 1997 the European Court of Human Rights addressed the very-long-standing concept of the confidentiality of doctor-patient relations, in the context of personal medical data, in Z v Finland (1997) 25 EHRR 371, and said

the Court will take into account that the protection of personal data, not least medical data, is of fundamental importance to a person’s enjoyment of his or her right to respect for private and family life as guaranteed by Article 8 of the Convention (art. 8). Respecting the confidentiality of health data is a vital principle in the legal systems of all the Contracting Parties to the Convention. It is crucial not only to respect the sense of privacy of a patient but also to preserve his or her confidence in the medical profession and in the health services in general…Without such protection, those in need of medical assistance may be deterred from revealing such information of a personal and intimate nature as may be necessary in order to receive appropriate treatment and, even, from seeking such assistance, thereby endangering their own health and, in the case of transmissible diseases, that of the community

This, I think, nicely encapsulates why so many good and deep-thinking people have fundamental concerns about care.data.

Now, I am not a lawyer, let alone a human rights lawyer, but it does occur to me that a failure to inform patients about what would be happening with their confidential medical records when GPs were required to upload them, and a failure to allow them to opt out, would have potentially infringed patients’ Article 8 rights. We should not forget that, initially, there was no intention to inform patients at all (there had been no attempt to inform patients about the similar upload of hospital medical data, which has been going on for over twenty years). It is, surely, possible therefore, that NHS England is not just “helping” GPs to inform patients without having any responsibility to do so (as Dawn Monaghan suggests), but that it recognises its potential vulnerability to an Article 8 challenge, and is trying to avoid or mitigate this. Whether the leaflets themselves, and the campaign to deliver them, are adequate to achieve this aim is another matter. As has been noted, the leaflet contains no opt-out form, and there seem to be numerous examples of people (often vulnerable people, for instance in care homes, or refuges) who will have little or no chance of receiving a copy.

At the launch of the tireless MedConfidential campaign last year, Shami Chakrabarti, of Liberty, spoke passionately about the potential human rights vulnerabilities of the programme. Notifying patients of what is proposed might not have been necessary under data protection law, but it is quite possible that the ECHR aspect of doing so was one of the things on which the Health and Social Care Information Centre (HSCIC) has been legally advised. Someone made an FOI request for this advice last year, and it is notable that HSCIC seem never to have completed their response to the request.

1I make no apologies for linking to one of Larkin’s most beautiful, but typically bleak and dystopian, pieces of prose, but I would add that it finishes “…These have I tried to remind of the excitement of jazz, and tell where it may still be found.”

2Unless the data controller does not have sufficient information about the individual in order readily to determine whether he is processing personal data about that individual, in which case the data controller shall send to the individual a written notice stating that he cannot provide the requisite information because of his inability to make that determination, and explaining the reasons for that inability


Filed under: Confidentiality, Data Protection, data sharing, Europe, human rights, Information Commissioner, NHS, Privacy

care.data – what am I worried will happen?

I was invited today on Twitter to say what I was worried will happen as a result of the care.data programme. I’ve written about this previously, and some of my concerns are laid out in those posts. But here’s a little list:

  • I am worried that even the most robust and secure data security measures can fail, or be overridden. Patients’ identifiable data could be compromised.
  • I am worried that there is a limit to how much users of the data could be restrained from making secondary, not-beneficial-to-patients, use of the data to which they are given access. (Geraint Lewis, NHS Chief Data Officer, was asked how, for instance, insurance companies would be prevented from doing this – he pointed to the Information Commissioner’s powers to impose Monetary Penalty Notices of up to £500,000 for suitably serious contraventions of the Data Protection Act 1998. But a penalty for misuse of data will only be a net penalty if it outstrips the profit from that use.)
  • I am worried that some people will avoid seeking medical treatment, particularly for sensitive or serious ailments, if they in turn worry about who might have access to their data.
  • I am, in more general terms, worried about the lack of transparency that has surrounded the programme, and the lack of clear information. I am worried by the question: if the risks are so low and the benefits so high, why were initial attempts made to sneak this under the public’s radar?
  • I am worried that the amassing of and use of personal data in itself carries risks.
  • I am worried that I am wrong about all this, and that I am attacking a programme which will potentially deliver personal and societal benefits.

But, ultimately, I am not sure it is for me to say specifically what I am worried will happen. I don’t know specifically what will happen with a lot of things I worry about.

Surely it is for the proponents of care.data to say why I should be reassured. And I’m not.


Filed under: Confidentiality, Data Protection, data sharing, Information Commissioner

Why I’ve opted-out of care.data

Last week, after months of (over)thinking about it, I sent my GP a letter, based on the excellent template by the tireless MedConfidential, refusing consent for identifiable data from my electronic medical records to be transferred to the Health and Social Care Information Centre (HSCIC).

I won’t rehearse the eloquent arguments against the current proposals that you can read on MedConfidential’s site, and elsewhere (for instance GP Neil Bhatia’s excellent site). Nor will I rehearse arguments in favour. I have written about the subject in the past, and I don’t want to add to the general clamour. What I do want to say is why I have opted-out:

  • I’ve been struck by the inaccuracy and disingenuousness of the information which is being given to us in support of care.data. We are told, for instance, that “Your date of birth, full postcode, NHS Number and gender rather than your name will be used to link your records in a secure system, managed by the HSCIC. Once this information has been linked, a new record will be created. This new record will not contain information that identifies you”. This is cleverly worded: it does not say (because it would not be true) that this data will be anonymised, but it certainly tries to give that impression.
  • I have, ever since I first became aware of this issue, noted that there has been a lack of openness on the part of proponents. This has manifested itself in many ways, and people should be aware that the current leafleting campaign (as flawed as it is – note that it is not personally addressed to individuals, but simply sent to households, and doesn’t contain a form enabling people to opt-out) would not have come about were it not for concerns raised about this lack of openness.
  • I’ve noted the emotive campaign launched by leading charities in support of the programme. But I’ve also noted the response by MedConfidential, which highlights that the charities’ campaign doesn’t draw attention to secondary usage of the information gathered – potentially by pharmaceutical and other commercial companies, universities and other academic organisations, information intermediaries and think-tanks. On a general level, I do not think that the amassing of personal data can ever be without potential risks and drawbacks – the risk of breaches of data security, the risk of people failing to seek medical advice because of privacy fears, commercial use – none of which are addressed in the charities’ campaign.
  • Finally, and, for me, crucially, if I fail to opt out now, I’ve lost my chance – my data, once uploaded, cannot be deleted. However, opting out now does not preclude opting in in future. So, should I subsequently become convinced that the societal and individual benefits of this amassing of electronic personal data outweigh my strong concerns about privacy and consent, I can change my mind in a way I couldn’t if I failed to opt out now.
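The first point above – that a record stripped of names but still carrying date of birth, full postcode and gender is not anonymous – can be sketched in a few lines of Python. Everything here (the names, dates, postcodes and the linkage dataset itself) is invented and purely illustrative:

```python
# Illustrative sketch (hypothetical data): a record with the name removed
# but date of birth, postcode and gender retained can often be re-linked
# to an identified dataset by joining on those "quasi-identifier" fields.

# A "pseudonymised" health record: no name, but DOB/postcode/gender kept.
health_records = [
    {"dob": "1969-03-14", "postcode": "LS1 4AP", "gender": "F",
     "condition": "type 2 diabetes"},
]

# A separate, identified dataset (e.g. an open electoral or marketing list).
identified_list = [
    {"name": "Jane Doe", "dob": "1969-03-14", "postcode": "LS1 4AP", "gender": "F"},
    {"name": "John Roe", "dob": "1982-11-02", "postcode": "M1 2AB", "gender": "M"},
]

def relink(health, identified):
    """Join the two datasets on the shared quasi-identifier fields."""
    keys = ("dob", "postcode", "gender")
    index = {tuple(person[k] for k in keys): person["name"]
             for person in identified}
    return [
        {**record, "name": index[tuple(record[k] for k in keys)]}
        for record in health
        if tuple(record[k] for k in keys) in index
    ]

for match in relink(health_records, identified_list):
    print(match["name"], "->", match["condition"])  # Jane Doe -> type 2 diabetes
```

The point is not that this exact linkage will happen, but that the combination of date of birth, full postcode and gender is unique for most of the population, which is why “pseudonymised” rather than “anonymised” is the accurate description of such a record.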


Filed under Confidentiality, Data Protection, data sharing, Privacy

Let’s Blame Data Protection – the Gove files

Thanks to Tim Turner for letting me blog about the FOI request he made, which gave rise to this piece

On the 12th September, the Education Secretary, Michael Gove, in an op-ed piece in the Telegraph sub-headed “No longer will the quality, policies and location of care homes be kept a secret”, said

A year ago, when the first shocking cases of sexual exploitation in Rochdale were prosecuted, we set up expert groups to help us understand what we might do better…Was cost a factor? Did we need to spend more? There was a lack of clarity about costs. And – most worrying of all – there was a lack of the most basic information about where these homes existed, who was responsible for them, and how good they were….To my astonishment, when I tried to find out more, I was met with a wall of silence

And he was in no doubt about where the blame lay (no guesses…)

The only responsible body with the information we needed was Ofsted, which registers children’s homes – yet Ofsted was prevented by “data protection” rules, “child protection” concerns and other bewildering regulations from sharing that data with us, or even with the police. Local authorities could only access information via a complex and time-consuming application process – and some simply did not bother…[so] we changed the absurd rules that prevented information being shared

This seemed a bit odd. Why on earth would “data protection” rules prevent disclosure of the location, ownership and standards of children’s homes? I could understand that there were potentially child protection concerns in too-broad sharing of information about locations (and I don’t find that “bewildering”), but data protection rules, as laid out in the Data Protection Act 1998 (DPA), only apply to information relating to identifiable individuals. Tim Turner took it upon himself to delve deeper, and made a freedom of information request to the Department for Education, asking

1) Which ‘absurd’ rules was Mr. Gove referring to in the first

2) What changes were made that Mr. Gove referred to in the second

3) Mr Gove referred to ‘Data Protection’ rules. As part of the process that he is describing, has any problem been identified with the Data Protection Act?

Fair play to the DfE – they responded within the statutory timescales, explaining

Regulation 7(5) of the Care Standards Act 2000 (Registration) (England) Regulations 2010 …prohibited Ofsted from disclosing parts of its register of children’s homes to any body other than to a local authority where a home is located. Whatever the original intention behind this limitation, it represented a barrier preventing Ofsted from providing information about homes’ locations to local police forces, which have explicit responsibilities for safeguarding all children in their area…we introduced an amendment to Regulation 7 with effect from April 2013

But their response also revealed what had been very obvious all along: this had nothing to do with data protection rules:

the reference to “data protection” rules in Mr Gove’s article involved the Regulations discussed above, made under section 36 of the Care Standards Act 2000. His comments were not intended as a reference to the Data Protection Act 1998

This is disingenuous: “data protection” has a very clear statutory context, and to stretch it to mean “information sharing” more broadly is misleading and pointless. One could perhaps understand it if Gove had said this in an oral interview, but his piece will have been checked carefully before publication, and personally I am in no doubt that blaming data protection has a political dimension. The government is determined, for some right reasons and some wrong ones, to make the sharing of public sector data easier, and data protection does sometimes – and rightly – present an obstacle to this, when the data in question is personal data and the sharing is potentially unfair or unlawful. Anything which associates “data protection” with a risk to child safety represents it as bureaucratic and dangerous, and serves the government agenda.

And the rather delicious irony of all this – as pointed out on twitter by Rich Greenhill – is that the “absurd rules” (the Care Standards Act 2000 (Registration) (England) Regulations 2010) criticised by Gove were made on 24 August 2010. And the Secretary of State who made these absurd rules was, of course, the Right Honourable Michael Gove MP.

How absurd.


Filed under Data Protection, data sharing, Freedom of Information, Let's Blame Data Protection, transparency

An unshared perspective

Paul Gibbons, FOI Man, has blogged about data-sharing, questioning whether an over-cautious approach to sharing of health data is damaging. Paul says

What I’m increasingly worried about is what appears to be a widely held and instinctive view that any sharing of personal data – and even data that has been anonymised – is necessarily a “bad thing”.

I’ve got to say, in all the time I’ve worked in the field of information rights I’ve never come across anyone who actually thinks that, let alone articulates it (in my experience the only people who say it are those who seek to misrepresent it). The Data Protection Act 1998 (DPA) and Directive 95/46/EC, to which it gives effect, do not act as a default bar to the sharing of data. There may be circumstances in which compliance with the law means that sharing of personal data cannot happen, but the converse is also true – there will be times when sharing is lawful, necessary and proportionate.

Paul’s prime example of what he sees as (to adopt the title of his piece) a “disproportionate fear of ‘Big Brother’ preventing us from seeing the big picture” is the “predictable outcry” about the care:data programme, whereby the Health and Social Care Information Centre will, through the exercise of certain provisions in the Health and Social Care Act 2012, extract enormous amounts of health and social care information from local systems to centralised ones. The first step in this is the GP Extraction Service (GPES), whereby information relating to medical conditions, treatments and diagnoses, together with each patient’s NHS number, date of birth, postcode, gender, ethnicity and other information, will be uploaded routinely. The information will then be made available to a range of organisations, sometimes including private companies, sometimes in ostensibly anonymised form, sometimes in identifiable form, for a variety of purposes. This will happen to your medical records unless you opt out (and if you think you’ve already done so, you probably haven’t – those who objected to the creation of a summary care record will have to go through another opt-out process). And this week we were informed that there will be no national campaign to alert patients to the GPES – the responsibility (and liability) will lie with GP practices themselves. (Anyone wanting to understand this complex and less-than-transparent process must read and follow the superb MedConfidential.)

I accept that, on one view, this amassing of health and social care data could be seen as a good thing: as Paul suggests, medical research, for instance, is a hugely important area. And the NHS Commissioning Board identifies the following desired outcomes from care:data

– support patients’ choice of service provider and treatment by making comparative data publicly available
– advance customer services, with confidence that services are planned around the patient
– promote greater transparency, for instance in support of local service planning
– improve outcomes, by monitoring against the Outcomes Frameworks
– increase accountability in the health service by making data more widely available
– drive economic growth through the effective use of linked data

But how realistic are these? And what are the attendant risks or detriments? Paul says

central medical records for all NHS patients…would mean that when you turned up at a hospital far from home, as I have done myself, doctors would have access to your medical records and history. Believe me, when you are in pain and desperate to be treated, the last thing that you want to do is to answer questions about your medical history

With great respect, the ideal of a centralised system whereby medics can provide emergency treatment to patients by accessing electronic records is never going to be more than a myth. Put another way – would Paul be happy trusting his life to the accuracy of an electronic record that might or might not say, for instance, whether he is allergic to aspirin? Treatment of patients is a matter of diagnosis, and emergency diagnoses will never be made solely, if at all, on the basis of records.

Security of information, and risks of identification of individuals, are other key concerns. Paul says Daniel Barth-Jones identifies “deficiencies in [reidentification] studies”, but I think what Barth-Jones is actually arguing is that the risks of reidentification are real, and must be accurately reported and balanced against the likelihood of their happening.

But ultimately I have two major conceptual concerns about care:data and what it implies. The first is that, yes, I am instinctively distrustful of the agglomeration of sensitive personal data in identifiable form in mass processing systems: history has taught us to be this way, so I don’t see this, as Paul appears to, as a “fashionable” mistrust (and, for instance, the Joseph Rowntree Foundation’s exemplary Database State report is now over six years old). The second is that patient–medic confidentiality exists, and has existed for a very long time, for a reason: if patients are not certain that their intimate medical details are confidential, they might be reluctant to speak candidly to their doctor. In fact, they might not visit their doctor at all.


Filed under Confidentiality, Data Protection, data sharing, human rights, Let's Blame Data Protection

It’s still not fine

Last week I blogged about enforcement notices served on three Midlands police forces by the Information Commissioner (IC). I was surprised that the circumstances hadn’t merited stronger sanctions, in the form of monetary penalty notices (MPNs), and I tweeted to ask why.

As you can perhaps see, the IC’s office has kindly replied to my tweet. I had asked

I would really like to know why the IC did not see fit to issue Monetary Penalty Notices. Can you advise?

and their reply says

enforcement notices best means of improving compliance. Considered details of the case inc limited involvement of each force

I have to say I think this is a questionable response (although I take the point that a 140-character limit is restrictive).

Firstly, enforcement activities are not mutually exclusive – it is not uncommon for an enforcement notice and an MPN to be served in tandem on a data controller. Thus, as recently as June this year, Glasgow City Council was served with an MPN of £150,000 by the IC following the loss of, er, unencrypted laptops, and at the same time was served with an enforcement notice requiring certain corrective actions to be undertaken.

Secondly, and I may be misinterpreting, but the reply seems to say that the “limited involvement of each force” was a determining factor in the decision not to serve an MPN. However, there were three data controllers involved. If each of them had a “limited” involvement, one is led to ask: wasn’t that the main problem? Derbyshire and Leicestershire both “did not carry out a risk assessment before they joined [the collaboration unit]…relying on the security measures taken by Nottinghamshire“, but those security measures were inadequate (lack of encryption, laptops not physically secured). Meanwhile, none of the forces properly monitored their officers while they were seconded.

It seems to me that the limited involvement of each of the forces might, far from excusing the breach, in fact have been the key reason it happened.

Principle seven of the first schedule to the Data Protection Act 1998 (DPA) requires that

Appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data

Many, many public (and private) sector data controllers are undertaking collaborative and partnership working, or are taking steps to do so. All responsible organisations are very aware that, where they continue, either jointly or in common with other organisations, to determine the purposes for which and the manner in which any personal data are, or are to be, processed, they remain data controllers, with the consequent responsibilities and liabilities. They are very aware of the IC’s Data Sharing Code of Practice.

And they are very aware that, if things go wrong with data-sharing, it will not normally be sufficient to point at a partner, and say “it was their fault”, or, even less, for all partners to shrug their shoulders and say, “that wasn’t our responsibility”.


Filed under Data Protection, data sharing, enforcement, Information Commissioner, monetary penalty notice, police, Uncategorized