Category Archives: NHS

This old world will never change

Complacency about data protection in the NHS won’t change unless ICO takes firm action

Back in September 2016 I spoke to Vice’s Motherboard, about reports that various NHS bodies were still running Windows XP, and I said

If hospitals are knowingly using insecure XP machines and devices to hold and otherwise process patient data they may well be in serious contravention of their [data protection] obligations

Subsequently, in May this year, the WannaCry exploit showed that those bodies were indeed vulnerable, with multiple NHS Trusts and GP practices subject to ransomware demands and major system disruption.

That this had enormous impact on patients is evidenced by a new report on the incident from the National Audit Office (NAO), which shows that

6,912 appointments had been cancelled, and [it is] estimated [that] over 19,000 appointments would have been cancelled in total. Neither the Department nor NHS England know how many GP appointments were cancelled, or how many ambulances and patients were diverted from the five accident and emergency departments that were unable to treat some patients

The NAO investigation found that the Department of Health and the Cabinet Office had written to Trusts

saying it was essential they had “robust plans” to migrate away from old software, such as Windows XP, by April 2015. [And in] March and April 2017, NHS Digital had issued critical alerts warning organisations to patch their systems to prevent WannaCry

Although the NAO report is critical of the government departments themselves for failing to do more, it correctly notes that individual healthcare organisations are themselves responsible for the protection of patient information. This is, of course, right: under the Data Protection Act 1998 (DPA) each organisation is a data controller, and responsible, among other things, for ensuring that appropriate technical and organisational measures are taken against unauthorised or unlawful processing of personal data.

Yet, despite these failings, and despite the clear evidence of huge disruption for patients and the unavoidable implication that delays in treatment across all NHS services occurred, the report was greeted by the following statement by Keith McNeil, Chief Clinical Information Officer for NHS England

As the NAO report makes clear, no harm was caused to patients and there were no incidents of patient data being compromised or stolen

In fairness to McNeil, he is citing the report itself, which says that “NHS organisations did not report any cases of harm to patients or of data being compromised or stolen” (although that is not quite the same thing). But the report continues

If the WannaCry ransomware attack had led to any patient harm or loss of data then NHS England told us that it would expect trusts to report cases through existing reporting channels, such as reporting data loss direct to the Information Commissioner’s Office (ICO) in line with existing policy and guidance on information governance

So it appears that the evidence of no harm arising is simply the absence of reports of “data loss” to the ICO. This emphasis on “data loss” is frustrating: firstly, because personal data does not have to be lost for harm to arise, and it is difficult to see how delays and emergency diversions would not have led to some harm; and secondly, because it is legally mistaken: the DPA makes clear that data security measures must guard against all forms of unauthorised processing, and removal or restriction of access is clearly covered by the definition of “processing”.

It is also illustrative of a level of complacency which is deleterious to patient health and safety, and a possible indicator of how the WannaCry incidents happened in the first place. Just because data could not be accessed as a result of the malware does not mean that this was not a very serious situation.

It’s not clear whether the ICO will be investigating further, or taking action, as a result of the NAO report (their response to my tweeted question – “We will be considering the contents of the report in more detail. We continue to liaise with the health sector on this issue” – was particularly unenlightening). I know countless dedicated, highly skilled professionals working in data protection and information governance in the NHS, and they have often told me of their frustrations with senior staff complacency. Unless the ICO does take action (which doesn’t necessarily have to be by way of fines), these professionals – but also, more importantly, patients – will continue to be let down, and the latter put at risk of harm.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under 7th principle, Data Protection, data security, enforcement, Information Commissioner, NHS

Praise where it’s due, but the senior people aren’t listening

A few months ago I had to attend a clinic at a large hospital (nothing embarrassing, nothing serious, but I’m not going to disclose my sensitive personal data). Said hospital is, as are so many these days, crumbling under a lack of resources. In the past I’ve been to other clinics at the same hospital and been concerned to note that they are often run from areas that are little better than corridors, with no real physical data security measures in place – files left out on tables, computer screens open to view by bystanders etc.

However, on this occasion, as I approached the healthcare assistant – let’s call her “Anne” – who appeared to be running the clinic (sure enough, effectively in a corridor), I noticed she kept the clinic list carefully shielded from my eyes, and when I gave my name she retrieved my file from a row of all the others hidden under a long strip of blue hospital paper (you know, the stuff on big rolls like kitchen towels).

I said how impressed I was at her simple but effective attempt to protect patient confidentiality under difficult circumstances, and said I was chairman of NADPO so knew a bit whereof I spoke. A little bit later Anne called me from my seat and I thought it was to take me to my appointment. However, she took me to her manager, and they explained that Anne had previously been criticised by one of the clinic consultants, who felt the blue paper was inconveniencing him, and who would at times remove it and throw it away.

So, I thought I’d write a letter – to the Chief Executive of the NHS Trust, copied to its Medical Records Manager, and Anne herself – praising her actions.

I completely forgot about it but yesterday out of nowhere received a card. It was from Anne saying that she’d received my copy letter, although she hadn’t heard from anyone else (not the Chief Executive nor the Medical Records Manager). She said that the letter was the nicest thing that had happened to her at work in 16 years.

I think this illustrates several things: 1) the NHS, and the public sector in general, are overstretched and confidentiality is potentially compromised as a result, 2) even in times of austerity low-cost information security measures can be effectively implemented, 3) sometimes people lower down are frustrated by, or even undermined by, those above them, 4) compliments are enormously valuable, and too rarely offered.

But there’s one final point. Anne had said in her card to me “I hope [the Chief Executive] wrote and thanked you”. Well no, she didn’t. Nor did the Medical Records Manager, nor anyone else in the Hospital Trust. Only Anne had the courtesy to do so, and she was not the one to whom the message needed to get through. I’d like to name (and slightly shame) the Trust, but that would identify “Anne”, and I don’t want to do that.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

2 Comments

Filed under Confidentiality, Data Protection, NHS

No data protection “fines” for audited NHS bodies

UPDATE: 03.02.15 GPOnline have commendably now amended their piece on this END UPDATE

GPOnline warns its readers today (02.02.15) that

GP practices face compulsory audits from this month by the information commissioner to check their compliance with data protection laws, and could be fined heavily if they are found to have breached rules.

While it’s good that it is on the ball regarding the legal change to the Information Commissioner’s Office (ICO) audit powers, it is, in one important sense, wrong: I can reassure GP practices that they are not risking “fines” (more correctly, monetary penalty notices, or MPNs) if breaches of the law are found during an ICO audit. In fact, the law specifically bars the ICO from serving an MPN on the basis of anything discovered in the process of an audit.

Under s41A of the Data Protection Act 1998 (DPA) the ICO can serve a data controller with a notice “for the purpose of enabling the Commissioner to determine whether the data controller has complied or is complying with the data protection principles”. Until yesterday, this compulsory audit power was restricted to audits of government departments. However, the Data Protection (Assessment Notices) (Designation of National Health Service Bodies) Order 2014, which commenced on 1 February 2015, now enables the ICO to perform mandatory data protection audits on NHS bodies specified in the schedule to the Order.  Information Commissioner Christopher Graham has said

We fine these organisations when they get it wrong, but this new power to force our way into the worst performing parts of the health sector will give us a chance to act before a breach happens

And I think he chose those words carefully (although he too used the legally inaccurate word “fine”). Section 55A of the DPA gives the ICO the power to serve a monetary penalty notice, to a maximum of £500,000, if he is “satisfied” that (a) there has been a serious contravention of the DPA by the data controller, (b) the contravention was of a kind likely to cause substantial damage or substantial distress, and (c) the data controller knew or ought to have known that this would happen. However, section 55A(3A) provides that the ICO may not be so “satisfied”

by virtue of any matter which comes to the Commissioner’s attention as a result of anything done in pursuance of…an assessment notice

The policy reason behind this provision is clearly to encourage audited data controllers to be open and transparent with the ICO, and not to punish them for such openness. GP practices will not receive an MPN for any contraventions of the DPA discovered during, or as a result of, a section 41A audit.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

1 Comment

Filed under Data Protection, Information Commissioner, monetary penalty notice, NHS

Hospital episode data – confidential data uploaded by mistake

Rather hidden away in the new IIGOP annual report is a worrying and revealing account of a serious data breach involving hospital episode data.

In February last year Tim Kelsey, NHS England’s National Director for Patients and Information, and vocal cheerleader for the care.data initiative, assured the public, in an interview on the Radio 4 Today programme, that in the twenty five years that Hospital Episode Statistics (HES) have been shared with other organisations

the management of the hospital episode database…there has never been a single example of that data being compromised, the privacy of patients being compromised…

However, as Sir Nick Partridge’s Review of Data Releases by the NHS Information Centre in June of last year revealed, there had been

lapses in the strict arrangements that were supposed to be in place to ensure that people’s personal data would never be used improperly

As I said at the time

One waits with interest to see whether the [Information Commissioner’s Office (ICO)] will take any enforcement action, but I think it’s important that they consider doing so, because, even though Sir Nick makes nine very sensible recommendations to HSCIC, one could be forgiven – having been given clear assurances previously, by the likes of Tim Kelsey and others – for having reservations as to future governance of our confidential medical data

Now, with the launch of the first annual report of the Independent Information Governance Oversight Panel (IIGOP), chaired by Dame Fiona Caldicott and established at the request of the Secretary of State to “advise, challenge and report on the state of information governance across the health and care system in England”, we see further evidence of HES data “being compromised, the privacy of patients being compromised”. The report informs us of an incident whereby

New inspection procedures introduced by the HSCIC had uncovered a number of organisations which were sending HES data and failing to follow data dictionary standards. This meant they were inadvertently enabling personal confidential data to enter the data base. Following an alert to the Information Commissioners’ Office this was understood as a large scale problem, although having a low level potential impact, as the affected data fields were unknown to either senders or receivers of HES data. The relevant organisations were contacted to gain their cooperation in closing the breach, without alerting any unfriendly observer to the location of the confidential details. This was important to preserve the general ignorance of the detail of the breach and continue to protect individuals’ privacy. Trusts and others were encouraged to provide named contacts who would then start cleaning up their data flows to the HSCIC. In order to manage any untoward reporting in the media, trade titles were informed and briefed about the importance of restricting their reporting to avoid any risk of leading people towards this confidential data.

Now this seems to me pretty serious: the failure to “follow data dictionary standards” by data controller organisations sending HES data sounds very likely to be a contravention of those controllers’ obligation, under section 4(4) of the Data Protection Act 1998 (DPA), to comply with the seventh data protection principle, which requires that they take

Appropriate technical and organisational measures…against unauthorised or unlawful processing of personal data

Serious contraventions, of a kind likely to cause substantial damage or substantial distress, can result in the ICO serving a monetary penalty notice, under section 55A of the DPA, to a maximum of £500,000.

So, what does one make of these incidents? It’s hard to avoid the conclusion that they would be held to be “serious”, and if the data in question had been misused there would have been the potential for substantial damage and substantial distress – public disclosure of hospital record data could have a multitude of pernicious effects – and this much is evidenced by the fact that (successful) attempts had to be made to prevent the errors coming to light, including asking journalists to avoid reporting. But were they contraventions likely to cause these things? IIGOP suggests that they had a “low level potential impact”, because the confidential data was hidden within large amounts of unobjectionable data, and I think it is probably the case that the incidents would not be held to have been likely to cause substantial damage or substantial distress. (In Niebel, the leading case on monetary penalty notices, Wikeley J in the Upper Tribunal accepted that “likely” in s55A DPA took the same meaning attributed to it by Munby J in R (Lord) v Secretary of State for the Home Department [2003] EWHC 2073 (Admin), namely that “‘likely’ meant something more than ‘a real risk’, i.e. a significant risk, ‘even if the risk falls short of being more probable than not'”.)

But a monetary penalty notice is not the only action open to the ICO. He has the power to serve enforcement notices, under s40 DPA, to require data controllers to do, or refrain from doing, specified actions, or to take informal action such as requiring the signing of undertakings (to similar effect). Given that we have heard about these incidents from IIGOP, and in an annual report, it seems unlikely that any ICO enforcement action will be forthcoming. Perhaps that’s correct as a matter of law and as a matter of the exercise of discretion, but in my view the ICO has not been vocal enough about the profound issues raised by the amalgamation and sharing of health data, and the concerns raised by incidents of potentially inappropriate or excessive processing. Care.data of course remains on the agenda, and the IIGOP report is both revealing and encouragingly critical of what has taken place so far, but one would not want a situation to emerge where the ICO took a back seat and allowed IIGOP (which lacks regulatory and enforcement powers) to deal with the issue.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under care.data, Data Protection, data sharing, Information Commissioner, NHS

The wrong test for anonymisation?

UPDATE: 23.01.15 The ICO has responded [.doc file] to my request for a review of their decision. I drew their attention to the arguments on this page but they don’t even mention them, let alone provide a counter-analysis, in dismissing my complaints (“Having reviewed the matter, I agree with the explanations provided”). I am invited by the ICO to consider taking my own legal action. I understand that the ICO and I might have differing views on a DPA matter, but what I find difficult to accept is the refusal even to enter into a discussion with me about the detailed arguments I’ve made. END UPDATE

In February this year I asked the Information Commissioner’s Office (ICO) to investigate reports that Hospital Episode Statistics (HES) data had apparently been sold to an actuarial society by the NHS Information Centre (NHSIC), the predecessor to the Health and Social Care Information Centre (HSCIC). Specifically I requested, as a data subject can under s42 of the Data Protection Act 1998 (DPA), that the ICO assess whether it was likely or not that the processing of my personal data by NHSIC and others had been in compliance with the DPA.

Nine months later, I was still awaiting the outcome. But a clue to how the assessment would turn out was contained in the text of Sir Nick Partridge’s six month review of various data releases by NHSIC (his original report in June seemed to me to point to multiple potential DPA contraventions). In the review document he says

Six investigations have been separately instigated by the HSCIC or Information Commissioner’s Office (ICO) and shared with both parties, as these focussed on whether individuals were at risk of being identified. In the cases it has investigated, the ICO has upheld the HSCIC approach and informed us that it has “seen no evidence to suggest that re-identification has occurred or is reasonably likely to occur.”

And sure enough, after chasing the ICO for the outcome of my nine-month wait, I received this (in oddly formatted text, which rather whiffed of a lot of cutting-and-pasting)

Following the recent issue regarding HSCIC, PA Consulting, and Google we investigated the issue of whether HES data could be considered personal data. This detailed work involved contacting HSCIC, PA Consulting, and Google and included the analysis of the processes for the extraction and disclosure of HES data both generally and in that case in particular. We concluded that we did not consider that the HES dataset constitutes personal data. Furthermore we also investigated whether this information had been linked to other data to produce “personal data” which was subject to the provisions of the Act. We have no evidence that there has been any re-identification either on the part of PA Consulting or Google. We also noted that HSCIC have stated that the HES dataset does not include individual level patient data even at a pseudonymised level. Our view is that the data extracted and provided to PA Consulting did not identify any individuals and there was no reasonable likelihood that re-identification would be possible.
I have added the emphasis to the words “reasonable likelihood” above. They appear in similar terms in the Partridge Review, and they struck me as rather odd. An awful lot of analysis has taken, and continues to take, place on the question of when personal data can be “rendered fully anonymous in the sense that it is information from which the data subject is no longer identifiable” (Lord Hope’s dicta in Common Services Agency v Scottish Information Commissioner [2008] UKHL 47). Some of that analysis has been academic; some takes the form of “soft law” guidance, for instance Opinion 05/2014 of the Article 29 Working Party, and the ICO Anonymisation Code of Practice. The former draws on the Data Protection Directive 95/46/EC, and notes that

Recital 26 signifies that to anonymise any data, the data must be stripped of sufficient elements such that the data subject can no longer be identified. More precisely, that data must be processed in such a way that it can no longer be used to identify a natural person by using “all the means likely reasonably to be used”

Anonymisation has also been subject to judicial analysis, notably in the Common Services Agency case, but, even more importantly, in the judgment of Mr Justice Cranston in Department of Health v Information Commissioner ([2011] EWHC 1430). The latter case, involving the question of disclosure of late-term abortion statistics, is by no means an easy judgment to parse (ironically so, given that it makes roughly the same observation about the Common Services Agency case). The judge held that the First-tier Tribunal had been wrong to say that the statistics in question were personal data, but that it had on the evidence been entitled to say that “the possibility of identification by a third party from these statistics was extremely remote”. The fact that the possibility of identification by a third party was extremely remote meant that “the requested statistics were fully anonymised” (¶55). I draw from this that for personal data to be anonymised in statistical format the possibility of identification of individuals by a third party must be extremely remote. The ICO’s Anonymisation Code, however, says of the case:

The High Court in the Department of Health case above stated that the risk of identification must be greater than remote and reasonably likely for information to be classed as personal data under the DPA [emphasis added]

But this seems to me to be an impermissible description of the case – the High Court did not state what the ICO says it stated: the phrases “greater than remote” and “reasonably likely” do not appear in the judgment. And the phrase “reasonably likely” is one that, as I say, makes its way into the Partridge Review, and into the ICO’s assessment of the lawfulness of the HES data “sale”.

I begin to wonder whether the ICO has taken the phrase from recital 26 of the Directive, which talks about the need to consider “all the means likely reasonably to be used” to identify an individual, and transformed it into a position from which, if identification is not reasonably likely, it will accept that data are anonymised. This cannot be right: there is a world of difference between a test which asks whether the possibility of identification is “extremely remote” and one which asks whether it is “reasonably likely”.

I do not have a specific right to a review of the section 42 assessment decision that the processing of my personal data was likely in compliance with NHSIC’s obligations under the DPA, but I have asked for one. I am aware, of course, that others complained (après moi, le déluge) – notably, in March, FIPR, MedConfidential and Big Brother Watch. I suspect they will also be pursuing this.

In October this year I attended an event at which the ICO’s Iain Bourne spoke. Iain was a key figure in the drawing up of the ICO’s Anonymisation Code, and I took the rather cheeky opportunity to ask about the HES investigations. He said that his initial view was that NHSIC had been performing good anonymisation practice. This reassured me at the time, but now, after considering this question of whether the Anonymisation Code (and the ICO) adopts the wrong test on the risks of identification, I am less reassured. Maybe “reasonably likely that an individual can be identified” is an appropriate test for determining when data is no longer anonymised, and becomes personal data, but it does not seem to me that the authorities support it.

Postscript Back in August of this year I alerted the ICO to the fact that a local authority had published open data sets which enabled individuals to be identified (for instance, social care and housing clients). More than four months later the data is still up (despite the ICO saying they would raise the issue with the council): is this perhaps because the council has argued that the risk of identification is not “reasonably likely”?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

3 Comments

Filed under anonymisation, care.data, Data Protection, Directive 95/46/EC, Information Commissioner, NHS

Do your research. Properly

Campaigning group Big Brother Watch have released a report entitled “NHS Data Breaches”. It purports to show the extent of such “breaches” within the NHS, but it fails properly to define its terms and uses very questionable methodology. Most worryingly, I think this sort of flawed research could lead to a reluctance on the part of public sector data controllers to monitor and record data security incidents.

As I checked my news alerts over a mug of contemplative coffee last Friday morning, the first thing I noticed was an odd story from a Bedfordshire news outlet:

Bedford Hospital gets clean bill of health in new data protection breach report, unlike neighbouring counties…From 2011 to 2014 the hospital did not breach the data protection act once, unlike neighbours Northampton where the mental health facility recorded 346 breaches, and Cambridge University Hospitals which registered 535 (the third worst in the country).

Elsewhere I saw that one NHS Trust had apparently breached data protection law 869 times in the same period, but many others, like Bedford Hospital, had not done so once. What was going on – are some NHS Trusts so much worse in terms of legal compliance than others? Are some staffed by people unaware of and unconcerned about patient confidentiality? No. What was going on was that campaigning group Big Brother Watch had released a report with flawed methodology, a misrepresentation of the law and flawed conclusions, which I fear could actually lead to poorer data protection compliance in the future.

I have written before about the need for clear terminology when discussing data protection compliance, and about the confusion which can be caused by sloppiness. The data protection world is very fond of the word “breach”, or “data breach”. It can be a useful term to describe a data security incident involving compromise, or potential compromise, of personal data, but confusion arises because it can also be used to describe, or be assumed to refer to, a breach of the law – a breach of the Data Protection Act 1998 (DPA). Yet a data security incident is not necessarily a breach of a legal obligation in the DPA: the seventh data protection principle in Schedule One requires that

Appropriate technical and organisational measures shall be taken [by a data controller] against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data

And section 4(4) of the DPA obliges a data controller to comply with the Schedule One data protection principles. This means that when appropriate technical and organisational measures are taken but unauthorised or unlawful processing, or accidental loss or destruction of, or damage to, personal data nonetheless occurs, the data controller is not in breach of its obligations (at least under the seventh principle). This distinction between a data security incident and a breach, or contravention, of legal obligations is one that the Information Commissioner’s Office (ICO) itself has sometimes failed to appreciate (as the First-tier Tribunal found in the Scottish Borders Council case, EA/2012/0212). Confusion only increases when one takes into account the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), which are closely related to the DPA and deal with data security in – broadly – the telecoms arena. PECR contains an actual legislative provision (regulation 2, as amended) which talks in terms of a “personal data breach”, namely

a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed in connection with the provision of a public electronic communications service

and regulation 5A obliges a relevant data controller to inform the ICO when there has been a “personal data breach”. It is important to note, however, that a “personal data breach” under PECR will not be a breach, or contravention, of the seventh DPA data protection principle, provided the data controller took appropriate technical and organisational measures to safeguard the data.

Things get even more complex when one bears in mind that the draft European General Data Protection Regulation proposes a similar approach as PECR, and defines a “personal data breach” in similar terms as above (simply removing the words “in connection with the provision of a public electronic communications service“).

Notwithstanding this, the Big Brother Watch report is entitled “NHS Data Breaches”, so one would hope that it would have been clear about its own terms. It has led to a lot of coverage, with media outlets picking up on headline-grabbing claims of “7225 breaches” in the NHS between 2011 and 2014, which is the equivalent to “6 breaches a day”. But when one looks at the methodology used, serious questions are raised about the research. It used Freedom of Information requests to all NHS Trusts and Bodies, and the actual request was in the following terms

1. The number of a) medical personnel and b) non-medical personnel that have been convicted for breaches of the Data Protection Act.

2. The number of a) medical personnel and b) non-medical personnel that have had their employment terminated for breaches of the Data Protection Act.

3. The number of a) medical personnel and b) non-medical personnel that have been disciplined internally but have not been prosecuted for breaches of the Data Protection Act.

4. The number of a) medical personnel and b) non-medical personnel that have resigned during disciplinary procedures.

5. The number of instances where a breach has not led to any disciplinary action.

The first thing to note is that, in broad terms, the only way that an individual NHS employee can “breach the Data Protection Act” is by committing a criminal offence under section 55 of unlawfully obtaining personal data without the consent of the (employer) data controller. All the other relevant legal obligations under the DPA are ones attaching to the NHS body itself, as data controller. Thus, by section 4(4) the NHS body has an obligation to comply with the data protection principles in Schedule One of the DPA, not individual employees. And so, except in the most serious of cases, where an employee acts without the consent of the employer to unlawfully obtain personal data, individual employees, whether medical or non-medical personnel, cannot as a matter of law “breach the Data Protection Act”.

One might argue that it is easy to infer what Big Brother Watch meant to ask for: information about the number of times the actions of individual employees meant that their employer NHS body had breached its obligations under the DPA. And, yes, that is probably what was meant, but the incorrect terms and lack of clarity vitiated the purported research from the start. This is because NHS bodies have to comply with the NHS/Department of Health Information Governance Toolkit, which actually requires NHS bodies to record serious data security incidents even where those incidents did not, in fact, constitute a breach of the body’s obligations under the DPA (i.e. incidents might be recorded which were “near misses”, or which did not constitute a failure to comply with the seventh, data security, principle).

The results Big Brother Watch got in response to their ambiguous and inaccurately termed FOI request show that some NHS bodies clearly interpreted it expansively, to encompass all data security incidents, while others – those with zero returns in any of the fields, for instance – clearly interpreted it restrictively. In fact, in at least one case an NHS Trust highlighted that its return included “near misses”, but these were still categorised by Big Brother Watch as “breaches”.

And this is not unimportant: data security and data protection are of immense importance in the NHS, which has to handle huge amounts of highly sensitive personal data, often under challenging circumstances. Awful contraventions of the DPA do occur, but so too do individual and unavoidable instances of human error. The best data controllers will record and act on the latter, even though they don’t give rise to liability under the DPA, and they should be applauded for doing so. Naming and shaming NHS bodies on the basis of such flawed research methodology might well achieve Big Brother Watch’s aim of publicising its call for greater sanctions for criminal offences, but I worry that it might lead to some data controllers being wary of recording incidents, for fear that they will be disclosed and misinterpreted in the pursuit of questionable research.

Filed under Data Protection, Freedom of Information, Information Commissioner, NHS

Monitoring of blogs and lawful/unlawful surveillance

Tim Turner wrote recently about the data protection implications of the monitoring of Sara Ryan’s blog by Southern Health NHS Trust. Tim’s piece is an exemplary analysis of how the processing of personal data which is in the public domain is still subject to compliance with the Data Protection Act 1998 (DPA):

there is nothing in the Data Protection Act that says that the public domain is off-limits. Whatever else, fairness still applies, and organisations have to accept that if they want to monitor what people are saying, they have to be open about it

But it is not just data protection law which is potentially engaged by the Trust’s actions. Monitoring of social media and networks by public authorities for the purposes of gathering intelligence might well constitute directed surveillance, bringing us explicitly into the area of human rights law. Sir Christopher Rose, the Chief Surveillance Commissioner said, in his most recent annual report

my commissioners remain of the view that the repeat viewing of individual “open source” sites for the purpose of intelligence gathering and data collation should be considered within the context of the protection that RIPA affords to such activity

“RIPA” there of course refers to the complex Regulation of Investigatory Powers Act 2000 (RIPA) (parts of which were reputedly “intentionally drafted for maximum obscurity”)1. What is not complex, however, is to note which public authorities are covered by RIPA when they engage in surveillance activities. A 2006 statutory instrument2 removed NHS Trusts from the list (at Schedule One of RIPA) of relevant public authorities whose surveillance was authorised by RIPA. Non-inclusion on the Schedule One lists doesn’t as a matter of fact or law mean that a public authority cannot undertake surveillance. This is because of the rather odd provision at section 80 of RIPA, which effectively explains that surveillance is lawful if carried out in accordance with RIPA, but surveillance not carried out in accordance with RIPA is not ipso facto unlawful. As the Investigatory Powers Tribunal put it, in C v The Police and the Home Secretary IPT/03/32/H

Although RIPA provides a framework for obtaining internal authorisations of directed surveillance (and other forms of surveillance), there is no general prohibition in RIPA against conducting directed surveillance without RIPA authorisation. RIPA does not require prior authorisation to be obtained by a public authority in order to carry out surveillance. Lack of authorisation under RIPA does not necessarily mean that the carrying out of directed surveillance is unlawful.

But it does mean that where surveillance is not specifically authorised by RIPA questions would arise about its legality under Article 8 of the European Convention on Human Rights, as incorporated into domestic law by the Human Rights Act 1998. The Tribunal in the above case went on to say

the consequences of not obtaining an authorisation under this Part may be, where there is an interference with Article 8 rights and there is no other source of authority, that the action is unlawful by virtue of section 6 of the 1998 Act.3

So, when the Trust was monitoring Sara Ryan’s blog, was it conducting directed surveillance (in a manner not authorised by RIPA)? RIPA describes directed surveillance as covert (and remember, as Tim Turner pointed out – no notification had been given to Sara) surveillance which is “undertaken for the purposes of a specific investigation or a specific operation and in such a manner as is likely to result in the obtaining of private information about a person (whether or not one specifically identified for the purposes of the investigation or operation)” (there is a third limb, which is not relevant here). One’s immediate thought might be that no private information was obtained, or intended to be obtained, about Sara, but one must bear in mind that, by section 26(10) of RIPA, “‘private information’, in relation to a person, includes any information relating to his private or family life” (emphasis added). This interpretation of “private information” is of course to be read alongside the protection afforded to respect for one’s private and family life under Article 8. The monitoring of Sara’s blog, and the matching of entries in it against incidents on the ward on which her late son, LB, was placed, unavoidably resulted in the obtaining of information about her and LB’s family life. This, of course, is the sort of thing that Sir Christopher Rose warned about in his most recent report, in which he went on to say

In cash-strapped public authorities, it might be tempting to conduct on line investigations from a desktop, as this saves time and money, and often provides far more detail about someone’s personal lifestyle, employment, associates, etc. But just because one can, does not mean one should.

And one must remember that he was talking about cash-strapped public authorities whose surveillance could be authorised under RIPA. When one remembers that this NHS Trust was not authorised to conduct directed surveillance under RIPA, one struggles to avoid the conclusion that monitoring was potentially in breach of Sara’s and LB’s human rights.

1See footnote to Caspar Bowden’s submission to the Intelligence and Security Committee
2The Regulation of Investigatory Powers (Directed Surveillance and Covert Human Intelligence Sources) (Amendment) Order 2006
3This passage was apparently lifted directly from the explanatory notes to RIPA

Filed under Data Protection, human rights, NHS, Privacy, RIPA, social media, surveillance, surveillance commissioner

The Partridge Review reveals apparently huge data protection breaches

Does the Partridge Review of NHS transfers of hospital episode patient data point towards one of the biggest DPA breaches ever?

In February this year Tim Kelsey, NHS England’s National Director for Patients and Information, and vocal cheerleader for the care.data initiative, assured the public, in an interview on the Radio 4 Today programme, that in the twenty-five years that Hospital Episode Statistics (HES) have been shared with other organisations

the management of the hospital episode database…there has never been a single example of that data being compromised, the privacy of patients being compromised…

When pressed by medConfidential‘s Phil Booth about this, and about risks of reidentification from the datasets, Tim repeated that no patient’s privacy had been compromised.

Some of us doubted this, as news of specific incidents of data loss emerged, and even more so as further news emerged suggesting that there had been transfers (a.k.a. sale) of huge amounts of potentially identifiable patient data to, for instance, the Institute and Faculty of Actuaries. The latter news led me to ask the Information Commissioner’s Office (ICO) to assess the lawfulness of this processing, an assessment which has not been completed four months later.

However, with the publication on 17 June of Sir Nick Partridge’s Review of Data Releases by the NHS Information Centre, one questions the basis for Tim’s assertions. Sir Nick commissioned PwC to analyse a total of 3059 data releases between 2005 and 2013 (when the NHS Information Centre (NHSIC) ceased to exist, and was replaced by the Health and Social Care Information Centre (HSCIC)). The summary report to the Review says that

It disappoints me to report that the review has discovered lapses in the strict arrangements that were supposed to be in place to ensure that people’s personal data would never be used improperly

and it reveals a series of concerning and serious failures of data governance, including

  • lack of detailed records between 1 April 2005 and 31 March 2009
  • two cases of data that was apparently released without a proper record remaining of which organisation received the data
  • [no] evidence that Northgate [the NHSIC contractor responsible for releases] got permission from the NHS IC before making releases as it was supposed to do
  • PwC could not find records to confirm full compliance in about 10% of the sample

Sir Nick observes that

the system did not have the checks and balances needed to ensure that the appropriate authority was always in place before data was released. In many cases the decision making process was unclear and the records of decisions are incomplete.

and crucially

It also seems clear that the responsibilities of becoming a data controller, something that happens as soon as an organisation receives data under a data sharing agreement, were not always clear to those who received data. The importance of data controllers understanding their responsibilities remains vital to the protection of people’s confidentiality

(This resonates with my concern, in my request to the ICO to assess the transfer of data from HES to the actuarial society, about what the legal basis was for the latter’s processing).

Notably, Sir Nick dispenses with the idea that data such as HES was anonymised:

The data provided to these other organisations under data sharing agreements is not anonymised. Although names and addresses are normally removed, it is possible that the identity of individuals may be deduced if the data is linked to other data

And if it was not anonymised, then the Data Protection Act 1998 (DPA) is engaged.

All of this indicates a failure to take appropriate technical and organisational measures against unauthorised or unlawful processing of personal data, which the perspicacious among you will identify as one of the key statutory obligations placed on data controllers by the seventh data protection principle in the DPA.

Sir Nick may say

It is a matter of fact that no individual ever complained that their confidentiality had been breached as a result of data being shared or lost by the NHS IC

but simply because no complaint was made (at the time – complaints certainly have been made since concerns started to be raised) does not mean that the seventh principle was not contravened, in a serious way. And a serious contravention of the DPA of a kind likely to cause substantial damage or substantial distress can potentially lead to the ICO serving a monetary penalty notice (MPN) to a maximum of £500,000 (at least for contraventions after April 2010, when the ICO’s powers commenced).

The NHSIC is no more (although, as Sir Nick says, HSCIC “inherited many of the NHS IC’s staff and procedures”). But that has not stopped the ICO serving MPNs on successor organisations in circumstances where their predecessors committed the contravention. One waits with interest to see whether the ICO will take any enforcement action, but I think it’s important that they consider doing so, because, even though Sir Nick makes nine very sensible recommendations to HSCIC, one could be forgiven – having been given clear assurances previously, by the likes of Tim Kelsey and others – for having reservations as to the future governance of our confidential medical data. I would suggest it is imperative that HSCIC know that their processing of personal data is now subject to close oversight by all relevant regulatory bodies.

2 Comments

Filed under care.data, Confidentiality, Data Protection, data sharing, Information Commissioner, monetary penalty notice, NHS, Privacy

Articles on care.data

I thought I was rather flogging the care.data horse on this blog, so, in the spirit of persistence, I thought why not go and do it somewhere else? The Society of Computers and Law kindly asked me to write a broadly “anti” piece, while asking Martin Hoskins to do a broadly “pro” one. They are here:

Care.data the Cons
Care.data the Pros

I am pleased to announce that Martin and I are still on speaking terms.

Filed under care.data, Data Protection, data sharing, NHS

Opting patients out of care.data – in breach of data protection law?

The ICO appear to think that GPs who opt patients out of care.data without informing them would be breaching the Data Protection Act.  They say it would be unfair processing

In February of this year GP Dr Gordon Gancz was threatened with termination of his contract, because he had indicated he would not allow his patients’ records to be uploaded to the national health database which was planned to be created under the care.data initiative. He was informed that if he didn’t remove information from his website, and if he went on to add “opt-out codes” to patients’ electronic records, he would be in breach of the NHS (GMS Contract) Regulations 2004. Although this threatened action was later withdrawn, and care.data put on hold for six months, Dr Gancz might have been further concerned to hear that, in the opinion of the Information Commissioner’s Office (ICO), he would also have been in breach of the Data Protection Act 1998 (DPA).

A few weeks ago fellow information rights blogger Tim Turner (who has given me permission to use the material) asked NHS England about the basis for Health Services Minister Dan Poulter’s statement in Parliament that

NHS England and the Health and Social Care Information Centre will work with the British Medical Association, the Royal College of General Practitioners, the Information Commissioner’s Office and with the Care Quality Commission to review and work with GP practices that have a high proportion of objections [to care.data] on a case-by-case basis

Tim wanted to know what role the ICO would play. NHS England replied saying, effectively, that they didn’t know, but they did disclose some minutes of a meeting held with the ICO in December 2013. Those minutes indicate that

The ICO had received a number of enquiries regarding bulk objections from practices. Their view was that adding objection codes would constitute processing of data in terms of the Data Protection Act.  If objection codes had been added without writing to inform their patients then the ICO’s view was that this would be unfair processing and technically a breach of the Act so action could be taken by the ICO

One must stress that this is not necessarily a complete or accurate representation of the ICO’s views. However, what appears to be being said here is that, if GPs took the decision to “opt out” their patients from care.data without writing to inform them, this would be an act of “processing” according to the definition at section 1(1) of the DPA, and would not be compliant with the GPs’ obligations under the first DPA principle to process personal data fairly.

On a very strict reading of the DPA this may be technically correct – for processing of personal data to be fair, data subjects must be informed of the purposes for which the data are being processed, and, strictly, adding a code which would prevent an upload (which would otherwise happen automatically) would be processing of personal data. And, of course, the “fairness” requirement is absent from the proposed care.data upload itself, because Parliament, in its wisdom, decided to give the NHS the legal power to override it. But “fairness” requires a broad brush, and the ICO’s interpretation here would have the distinctly odd effect of rendering unlawful a decision to maintain the status quo, whereby patients’ GP data does not leave the confidential confines of their surgery. It would also have the effect of supporting NHS England’s apparent view that GPs who took such action would be liable to sanctions.

In fairness (geddit???!!) to the ICO, if a patient was opted out who wanted to be included in the care.data upload, then I agree that this would be in breach of the first principle, but it would be very easily rectified, because, as we know, it will be simple to opt-in to care.data from a previous position of “opt-out”, but the converse doesn’t apply – once your data is uploaded it is uploaded in perpetuity (see my last bullet point here).

A number of GPs (and of course, others) have expressed great concern at what care.data means for the confidential relationship between doctor and patient, which is fundamental for the delivery of health care. In light of those concerns, and in the absence of clarity about the secondary uses of patient data under care.data, would it really be “unfair” to patients if GPs didn’t allow the data to be collected? Is that (outwith DPA) fair to GPs?

Filed under care.data, Confidentiality, Data Protection, data sharing, Information Commissioner, NHS