Category Archives: Data Protection

Contributing to society?

Why are proponents of care:data resorting to rudeness about those who are not as convinced as they are?

When I attended the launch of MedConfidential in April of this year I was largely ignorant of the proposals to amass patient data by the Health and Social Care Information Centre (HSCIC) under the banner of care:data. I was concerned by what I heard, and I remain so: details were unclear, and in many cases remain so, regarding what data will be gathered, and how, and for what purposes; what arrangements will be in place to allow third-party access to it; whether or to what extent it will be anonymised; and whether patients’ consent will be sought, or assumed, or ignored.

What I did see, and was greatly impressed by, was a large group of people, from various backgrounds and roles, coming together, mostly on a purely voluntary basis (for instance, I took a day’s leave to attend), to discuss the implications of this.

The centralising and use of patient confidential data raises questions of profound importance, which don’t have easy answers: such as to what extent should people waive an expectation of privacy in order – for instance – to further medical research? These are issues which led two of my favourite bloggers to come to (digital) blows recently.

Yet earlier today I read an otherwise sensible piece on the subject (I am not saying I agree with it) by the high-profile columnist Polly Toynbee, which talked about her receiving letters from people who ask her to

investigate the dark forces planting cameras and microphones in their walls: they think I’m part of the conspiracy when I suggest this is a usually curable delusion, and their doctor is probably not part of the plot

I fail to see the relevance of this reference to people with an apparent diagnosis of paranoid schizophrenia, unless it is to draw an analogy by insinuation with

those not clinically ill [among whom] there is a growing trend to fear Big Brother and the state

This is nasty stuff, and leads one to wonder why she feels the need to resort to such a rhetorical device.

Someone who liked Toynbee’s post was Tim Kelsey, NHS National Director for Patients and Information, and former government “czar” for Transparency and Open Data. He described it as “seminal” on twitter. I’m sure Tim finds the constant questioning of the care:data plans irritating: his tweets are often replied to by people who are not as convinced as he is that it is unequivocally a Good Thing. An example of this irritation was his response to an observation by Calderdale councillor James Baker. James tweeted, in response to Tim’s “seminal” tweet

I don’t think using people’s data for research purposes without informed consent is ‘good for science’

This is unexceptional, and a fair comment. Tim’s reply* was certainly not

you can object and your data will not be extracted and you can make no contribution to society

I think that to suggest that someone who might object (in the context of a worrying lack of, er, transparency, about the details of care:data) to the extraction of their highly sensitive medical data is making “no contribution to society” is extraordinarily unfair, and, as James pointed out in reply

It’s an offensive thing to say to an elected representative who contributes a lot to society…It’s also trying to use guilt and shame to persuade someone to partake in medical research. Unethical

I couldn’t agree more.

UPDATE:

*It appears the tweet has now been deleted. Tim did reply to James saying

offence not intended – I meant contribution to health improvement thru sharing non PID

but there’s been no explanation or apology for that original tweet



Filed under Data Protection, NHS, Privacy, transparency

Pivot tables and data breaches

About a year ago I first became aware of reports of disturbing inadvertent disclosures of personal data (often highly sensitive) by public authorities who had intended only to disclose anonymous and/or aggregate data. These incidents occurred both in the context of disclosures under the Freedom of Information Act 2000 (FOIA) and in the context of proactive disclosure of datasets. Mostly they arose when what had been disclosed was not just raw data, but the spreadsheet in which the data was presented. Spreadsheet software is often very powerful, and not all users necessarily understand its capabilities (I don’t think I do). By use of pivot tables data can be sorted, summarised and so on, but it can also, to the uninitiated or unwary, be hidden. If the person who created or maintained a spreadsheet containing a pivot table is not involved in the act of publicly disclosing it, it is possible that an apparently innocuous disclosure will contain hidden personal data.
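Tangentially, for anyone responsible for publication: the risk can at least be flagged without specialist tools. An .xlsx workbook is simply a zip archive, and a pivot table keeps a copy of its source rows in a “pivot cache” stored inside that archive, even when only the summary sheet is visible on screen. The following sketch (the function name is mine, and this is a rough pre-publication check, not a complete audit) flags workbooks that contain such cached data:

```python
import zipfile

def has_hidden_pivot_cache(workbook):
    """Return True if an .xlsx workbook contains pivot cache parts.

    An .xlsx file is a zip archive; pivot caches live under
    xl/pivotCache/ and hold a copy of the pivot table's source
    rows, which may include raw personal data even when only
    summary tables are visible in the spreadsheet application.
    `workbook` may be a file path or a file-like object.
    """
    with zipfile.ZipFile(workbook) as archive:
        return any(name.startswith("xl/pivotCache/")
                   for name in archive.namelist())
```

A positive result does not prove personal data is present, but it is a cheap signal that the file deserves manual inspection (for instance, in Excel, double-clicking a pivot total drills down to the underlying cached rows) before it goes anywhere near a disclosure log.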

Clearly such errors are likely to constitute breaches – sometimes very serious breaches – of the Data Protection Act 1998 (DPA). Those of us who were aware of a number of these inadvertent breaches were also aware that, if public authorities were not alerted to the risk, a) the practice would continue and b) potentially large numbers of “disclosive” datasets would remain out in the open (in disclosure logs, on WhatDoTheyKnow, in open datasets etc). But we were also aware that, if the situation was not managed well and quietly, with authorities given the opportunity to correct/withdraw errors, inquisitive or even malicious sorts might go trawling open datasets for disclosures which could potentially be very damaging and distressing to data subjects.

It was with some relief, therefore, that, following an earlier announcement by WhatDoTheyKnow, the Information Commissioner’s Office (ICO) finally gave a warning, and good guidance, on 28 June (although this relief was tempered by finding out, via Tim Turner, that the ICO had known about, and apparently done nothing about, the problem for three years). At the same time the ICO announced that it was “actively considering a number of enforcement cases on this issue”.

It appears that, according to an announcement on its own website, Islington Council is the first recipient of this enforcement. The Council says it has

accepted a £70,000 fine from the Information Commissioner’s Office (ICO) after a mistake led to personal data being released

after it

responded to a Freedom of Information (FOI) request asking for information including the ethnicity and gender of people the council had rehoused. The response, in the form of Excel spreadsheet tables, included personal information concealed behind the summary tables

Fair play to Islington for acknowledging this and agreeing immediately to pay the monetary penalty notice. And if some of the other reported breaches I heard about were as bad as they sounded £70,000 will be at the lower end of the scale.

(thanks to @owenboswarva on twitter for flagging this up)

UPDATE:

The ICO has now posted details of the MPN, and this clarifies that the disclosure was made on WhatDoTheyKnow and was only identified when one of the site’s administrators noticed it.


Filed under Breach Notification, Data Protection, Freedom of Information, Information Commissioner, monetary penalty notice, transparency

Monetary penalties – focus on the breach, not the incident

The Information Tribunal’s judgment in the successful appeal by Scottish Borders Council shows that the ICO needs to focus on the contravention itself, not an incident which might arise from it

looking at the facts of the case, what did happen was in our view a surprising outcome, not a likely one

Sections 55A-E of the Data Protection Act 1998 (DPA), inserted by the Criminal Justice and Immigration Act 2008, provide for the Information Commissioner (IC) to serve a data controller with a monetary penalty notice (MPN) of up to £500,000 if

  • he is satisfied that there has been a serious contravention of the controller’s obligations to comply with the data protection principles in Schedule One of the DPA, and
  • the contravention was of a kind likely to cause substantial damage or substantial distress, and
  • the contravention was either deliberate, or the controller knew or ought to have known that there was a risk that the contravention would occur and that it would be of a kind likely to cause substantial damage or substantial distress, but failed to take reasonable steps to prevent it.

In its judgment, handed down today, on what is effectively* a successful appeal by Scottish Borders Council, the First-tier Tribunal (Information Rights) (“FTT”) has given guidance on what is required in order for the IC to be satisfied that a serious contravention was likely to cause substantial damage or substantial distress. In particular, the FTT has clarified that, where the DPA talks about a “serious contravention”, the IC must focus on that, and not on any incident which might follow.

The Monetary Penalty Notice

The events giving rise to the original MPN (still currently on the IC’s website) are laid out by the FTT in the first two paragraphs of the judgment

Outside Tesco in South Queensferry there are some bins for recycling waste paper. They are of the “post box” type. On 10 September 2011 a member of the public found that one of the bins was overflowing. The material at the top, easily accessible, consisted of files containing pension records kept by a local authority (“Scottish Borders”). It turned out that a data processing company had transferred the information from hard copy files to CDs at Scottish Borders’ request. The data processor had then disposed of about 1,600 manual files in the post box bins at Tesco and at another supermarket in the town.

The police took into their possession all those files which they could reach. They then secured the bins and, with the cooperation of Scottish Borders, it was ascertained that the files concerned had now either been pulped without manual intervention or were now back in the safe keeping of the council.

The IC imposed an MPN of £250,000, finding that there had been a serious contravention of the obligation to comply with the seventh data protection principle (DPP7) which states that

Appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data.

and that, where, as here, processing of personal data is carried out by a data processor on behalf of a data controller, the latter must choose as the former one who provides sufficient guarantees in respect of its data security measures, and ensure that such processing is carried out under a suitable written contract (I paraphrase).

The contravention here was the failure by the Council to ensure that it engaged an appropriate data processor (to dispose of the pensions records) in an appropriate way (by means of an adequate contract, properly monitored and adequately evidenced in writing).

The IC said that contravention was likely to cause substantial damage or substantial distress (query, which?) to those whose confidential data was seen by a member of the public and that

If the data has been disclosed to untrustworthy third parties then it is likely that the contravention would cause further distress and also substantial damage to the data subjects such as exposing them to identity fraud and possible financial loss

Arguments and findings

The FTT found that there was a contravention. The Council had a long-standing (some 25-30 years) agreement with the data processor, but it appears that the contractual arrangement was largely based on informal agreements and assurances. Although it was to an extent evidenced in writing, this was still inadequate. Accordingly

the arrangements made by Scottish Borders for processing pension records in July and August 2011 were in contravention of the DPA

Further, the FTT was satisfied that the contravention was serious

the duties in relation to data processing contracts in paras 11 and 12 of schedule 1 are at the heart of the system for protecting personal data under DPA. It is fundamental that the data controller cannot be allowed to contract out its responsibilities [and] the contravention was not an isolated human error. It was systemic

However, counsel for the IC, the redoubtable Robin Hopkins, reminded the FTT that they must focus on the contravention which gave rise to the MPN. In this case, this was distinguishable from the events described in the first two paragraphs of the judgment: the contravention was the breach of DPP7, not the discovery of the data. On this basis, the FTT did not accept that the contravention had been of a kind likely to cause substantial damage or substantial distress. Evidence was taken from David Smith, Deputy IC, and the IC developed an argument focusing on the risks of identity theft, but the FTT seems to have felt that the evidence was either unconvincing (regarding the likelihood of identity theft) or still focused wrongly on what it calls the “trigger point” (the disposal/finding of the files in the bin) rather than the contravention itself. As to the latter

it seems to us that the fact that the data processor was a specialist contractor with a history of 25-30 years of dealings with Scottish Borders carries weight. He was no fly by night. The council had good reason to trust the company.

And, therefore

Focussing on the contravention we have been unable to construct a likely chain of events which would lead to substantial damage or substantial distress. What did happen was of course startling enough. Again, though, looking at the facts of the case, what did happen was in our view a surprising outcome, not a likely one.

This illustrates a fundamental point, and one, it seems, of great significance. It will, no doubt, be seized upon eagerly by any data controller in receipt of a notice of intent to serve an MPN. (It was also, I should acknowledge, anticipated by observations by Tim Turner and Andrew Walsh, both former ICO employees). However, the FTT do stress that although this case did not involve a contravention of a kind likely to cause substantial damage or substantial distress

No doubt some breaches of the seventh DPP in respect of some data might be of such a kind

What now?

I said earlier this was “effectively a successful appeal”. It was in fact an appeal on a preliminary issue (on the liability of the Council to pay an MPN) and under the Data Protection (Monetary Penalties) Order 2010 the FTT may either allow the appeal or substitute such other notice or decision which could have been served or made by the IC. The FTT’s concerns about the Council’s procedures in relation to data processing contracts were “too serious” for them simply to allow the appeal, and they are – pending discussions between the IC and the Council – considering whether to issue an enforcement notice.

Notwithstanding the outcome of those discussions, this is an important judgment to be read alongside the unsuccessful MPN appeal by the Central London Community Healthcare NHS Trust. Until an MPN case gets appealed further we will not have binding authority, but the lines are perhaps becoming a bit clearer for data controllers, and, indeed for the ICO.

There were some interesting comments and observations by the FTT on “other issues canvassed in the course of [the] appeal but which it has not been necessary to resolve”. I hope to post a follow-up about these in due course.


Filed under Data Protection, enforcement, Information Commissioner, Information Tribunal, monetary penalty notice

Data Protection audits in the NHS

Do the results of an anonymous survey into data protection practices and attitudes of junior doctors provide justification for compulsory audits?



Filed under Data Protection, Information Commissioner, NHS

An unshared perspective

Paul Gibbons, FOI Man, has blogged about data-sharing, questioning whether an over-cautious approach to sharing of health data is damaging. Paul says

What I’m increasingly worried about is what appears to be a widely held and instinctive view that any sharing of personal data – and even data that has been anonymised – is necessarily a “bad thing”.

I’ve got to say, in all the time I’ve worked in the field of information rights I’ve never come across anyone who actually thinks that, let alone articulates it (in my experience the only people who say it are those who seek to misrepresent it). The Data Protection Act 1998 (DPA) and EC Directive 95/46/EC to which it gives effect do not act as a default bar to sharing of data. There may be circumstances under which compliance with the law means that sharing of personal data cannot happen, but the converse is true – there will be times when sharing is lawful, necessary and proportionate.

Paul’s prime example of what he sees as (to adopt the title of his piece) “a disproportionate fear of ‘Big Brother’ preventing us from seeing the big picture” is the “predictable outcry” about the care:data programme, whereby the Health and Social Care Information Centre will, through the exercise of certain provisions in the Health and Social Care Act 2012, extract enormous amounts of health and social care information from local systems to centralised ones. The first step in this is the GP Extraction Service (GPES), whereby information relating to medical conditions, treatments and diagnoses, with each patient’s NHS number, date of birth, postcode, gender, ethnicity and other information, will be uploaded routinely. The information will then be made available to a range of organisations, sometimes including private companies, sometimes in ostensibly anonymised, sometimes in identifiable, form, for a variety of purposes. This will happen to your medical records unless you opt out (and if you think you’ve already done so, you probably haven’t – those who objected to the creation of a summary care record will have to go through another opt-out process). And this week we were informed that there will be no national campaign to alert patients to the GPES – the responsibility (and liability) will lie with GP practices themselves. (Anyone wanting to understand this complex and less-than-transparent process must read and follow the superb MedConfidential).

I accept that, on one view, this amassing of health and social care data could be seen as a good thing: as Paul suggests, medical research, for instance is a hugely important area. And the NHS Commissioning Board identifies the following desired outcomes from care:data

– support patients’ choice of service provider and treatment by making comparative data publicly available
– advance customer services, with confidence that services are planned around the patient
– promote greater transparency, for instance in support of local service planning
– improve outcomes, by monitoring against the Outcomes Frameworks
– increase accountability in the health service by making data more widely available
– drive economic growth through the effective use of linked data

But how realistic are these? And what are the attendant risks or detriments? Paul says

central medical records for all NHS patients…would mean that when you turned up at a hospital far from home, as I have done myself, doctors would have access to your medical records and history. Believe me, when you are in pain and desperate to be treated, the last thing that you want to do is to answer questions about your medical history

With great respect, the ideal of a centralised system whereby medics can provide emergency treatment to patients by accessing electronic records is never going to be more than a myth. Put another way – would Paul be happy trusting his life to the accuracy of an electronic record that might or might not say, for instance, whether he is allergic to aspirin? Treatment of patients is a matter of diagnosis, and emergency diagnoses will never be made solely, if at all, on the basis of records.

Security of information, and risks of identification of individuals, are other key concerns. Paul says Daniel Barth-Jones identifies “deficiencies in [reidentification] studies”, but I think what Barth-Jones is actually arguing is that the risks of reidentification are real, yet must be accurately reported and balanced against the likelihood of their happening.

But ultimately I have two major conceptual concerns about care:data and what it implies. The first is that, yes, I am instinctively distrustful of the agglomeration of sensitive personal data in identifiable form in mass processing systems: history has taught us to be this way, so I don’t see this, as Paul appears to, as a “fashionable” mistrust (and, for instance, the Joseph Rowntree Foundation’s exemplary Database State report is now over six years old). The second is that patient-medic confidentiality exists, and has existed for a very long time, for a reason: if patients are not certain that their intimate medical details are confidential, they might be reluctant to speak candidly to their doctor. In fact, they might not even visit their doctor at all.


Filed under Confidentiality, Data Protection, data sharing, human rights, Let's Blame Data Protection

Data Protection Act: little-known, well-known

According to Lord Justice Leveson

The UK data protection regime suffers from an unenviable reputation, perhaps not wholly merited, but nevertheless important to understand at the outset. To say that it is little known or understood by the public, regarded as a regulatory inconvenience in the business world, and viewed as marginal and technical among legal practitioners (including by our higher courts), might be regarded as a little unfair by the more well-informed, but is perhaps not so far from the truth. [page 999 of the Report of An Inquiry into the Culture, Practices and Ethics of the Press]

But I’m not sure (thanks to Gary Slapper for pointing this out).

And in fairness to Brian, he does go on to say

And yet the subject-matter of the data protection regime, how personal information about individuals is acquired, used and traded for business purposes, could hardly be more fundamental to issues of personal integrity, particularly in a world of ever-accelerating information technology capability [ibid]

Perhaps this is why data protection and its practical application appeal so much to some odd people, and why it is our littlest-known-most-requested piece of legislation.


Filed under Data Protection, Leveson

Take the train(ing)

IG policies are essential, but not much use if you don’t comply with them

In NHS and Social Care settings a standard requirement is that all staff are trained in information governance (a large component of which is data protection): “Information Governance awareness and mandatory training procedures are in place and all staff are appropriately trained” (IG Toolkit v11) and “Ensure all staff are trained, updated and aware of their responsibilities” (Local Government Data Handling Guidelines). If an organisation suffers a serious breach of data security, and the Information Commissioner’s Office (ICO) investigates, one of the first things they will look at is whether staff were appropriately trained. If they weren’t, enforcement action, possibly in the form of a monetary penalty notice, is highly likely.

It is vital, therefore, that all organisations have a policy that all relevant staff are trained (and in some organisations – like the NHS and local authorities – that will normally mean all staff).

But, policies only work if they are implemented, enforced and monitored. The ICO has recently published an Undertaking (the “last chance saloon” before formal enforcement action) signed by the Northern Health and Social Care Trust. This arose following an incident which

involved confidential service user information being faxed from a ward in Antrim Hospital to a local business in error. The information was intended for the Trust’s Community Rehabilitation Team. The referral form contained sensitive clinical data

Although the Trust had a “fax policy” (good), it wasn’t complied with (bad). But also

The Commissioner’s investigation into the Trust revealed that despite the Trust having introduced what should have been mandatory Information Governance training for all staff, the majority of staff involved in these incidents had not received this training. This highlighted a potentially serious failing in respect of staff awareness of Information Governance policies. In particular, the failure to monitor and enforce staff completion of training was a concern.

This failure constituted a breach of the seventh data protection principle (“Appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data”). It is highly likely that, if training requirements had been complied with, no action would have been (or would have been able to be) taken, because there would have been no breach.

Put simply, if a data controller can show it has complied with the seventh data protection principle, and there is an accidental data security breach – however horrendous – then (providing there are no breaches of other principles) no sanctions will arise.

It’s in every data controller’s interests not only to require appropriate data protection training for staff, but also to ensure that it has been taken.


Filed under Data Protection, employment, Information Commissioner, monetary penalty notice

Poor judgement?

Public authorities need to be cautious when disclosing performance figures of their staff under Freedom of Information (FOI) laws. They need to be even more cautious when disclosing performance figures of third parties.

Imagine if your employer, or, worse, a third party, disclosed under FOI that, of all your peers, you made the most decisions in the exercise of your employment which were subsequently found to be wrong, and which had to be overturned. If in fact those figures turned out to be incorrect, you would probably rightly feel aggrieved, and perhaps question whether the failure of data quality was in fact a breach of your rights under the Data Protection Act 1998 (DPA) and of your employment rights.

That is what appears to have happened to certain judges in Scotland, according to a letter in The Scotsman today, from the Chief Executive of the Scottish Court Service. The letter points out that a previous (29 July) article in The Scotsman – “Meet the judge with the highest number of quashed convictions” (now no longer available, for obvious reasons) – was, although published in good faith, based on inaccurate information disclosed to the paper under FOI. The letter contains an apology to

Lord Carloway and Lord Hardie, who featured prominently in this article, for misrepresenting their position in relation to appeal decisions

because the erroneous disclosed statistics suggested they had had more judgments overturned on appeal than was actually the case.

Of course, the principle of judicial independence means that judges are, strictly, not employed. But as Carswell LCJ said

All judges, at whatever level, share certain common characteristics. They all must enjoy independence of decision without direction from any source, which the respondents quite rightly defended as an essential part of their work. They all need some organisation of their sittings, whether it be prescribed by the president of the industrial tribunals or the Court Service, or more loosely arranged in collegiate fashion between the judges of a particular court. They are all expected to work during defined times and periods, whether they be rigidly laid down or managed by the judges themselves with a greater degree of flexibility. They are not free agents to work as and when they choose, as are self-employed persons. Their office accordingly partakes of some of the characteristics of employment… [Perceval-Price v Department of Economic Development [2000] IRLR 380]

and the Supreme Court took this further in O’Brien v Ministry of Justice [2010] UKSC 34 by saying “Indeed judicial office partakes of most of the characteristics of employment” (emphasis added).

Whatever their employment status, judges’ performance figures are clearly an important matter to them, and the Scottish Court Service has a duty to maintain accurate figures (particularly when disclosing them publicly). As Wodehouse said, “it has never been difficult to distinguish between a Scotsman with a grievance and a ray of sunshine”. I imagine that the office of Mr McQueen, the day after the first article, was not filled with sunshine.


Filed under Data Protection, employment, FOISA, Freedom of Information, Uncategorized

On the tweet where you live

Do Home Office tweets of people arrested on suspicion of committing immigration offences engage data protection law?

The recent sordid campaign by the Home Office to publicise their “crackdown on illegal immigration” involved the tweeting of pictures of people apparently arrested in connection with immigration offences. I’m loath to post links because any further publicity risks undermining my point in this piece, but suffice to say that two pictures in particular were posted, one of a man being escorted (police officers at either side of him, holding his arms) from what look like retail premises, and one of a man being led by other officers into a cage in the back of a van. In both cases, the person’s face has been blurred by pixelation. There have been suggestions that the broader aspects of the campaign (disgracefully, vans have been deployed displaying advertisements saying “In the UK illegally? Go home or face arrest”) might be unlawful for breach of the Public Sector Equality Duty, and some have argued that to use the hashtag #immigrationoffenders to accompany pictures of people only suspected of crime might be to prejudge a trial, and could even constitute contempt of court. However, I would argue that the tweets also engage, and potentially breach, data protection law.

For the sake of this argument I will work on the presumption that, because the images of their faces have been obscured, no third party can recognise the individuals concerned (I think this is actually probably wrong – potential identifying features, such as location and clothing, are still displayed, and it is quite likely that friends, relatives or colleagues could identify them). However, this does not mean that the images are outwith the Data Protection Act 1998 (DPA) and the European Data Protection Directive 95/46/EC to which it gives effect. The former defines personal data as

data which relate to a living individual who can be identified—
(a) from those data, or
(b) from those data and other information which is in the possession of, or is likely to come into the possession of, the data controller [emphasis added]

In this instance the Home Office (or its agents) must itself know who the people in the images are (they will have had sufficient identifying information in order to effect an arrest) so, in their hands, the images constitute the personal data of the people in them. As the Information Commissioner’s Office (ICO) explains

It is important to remember that the same piece of data may be personal data in one party’s hands while it may not be personal data in another party’s hands…data may not be personal data in the hands of one data controller…but the same data may be personal data in the hands of another data controller…depending on the purpose of the processing and the potential impact of the processing on individuals

So the taking, retaining and publishing of images of people whose identities are obscured but who can be identified by the data controller will constitute the processing of personal data by that data controller. Consequently, the legal obligations for fair and lawful processing apply: section 4(4) of the DPA imposes a duty on a data controller to comply with the data protection principles in relation to all personal data with respect to which he is the data controller. Lord Hoffmann explained this in the leading FOI (and DPA) case on identification

As the definitions in section 1(1) DPA make clear, disclosure is only one of the ways in which information or data may be processed by the data controller. The duty in section 4(4) is all embracing. He must comply with the data protection principles in relation to all “personal data” with respect to which he is the data controller and to everything that falls within the scope of the word “processing”. The primary focus of the definition of that expression is on him and on everything that he does with the information. He cannot exclude personal data from the duty to comply with the data protection principles simply by editing the data so that, if the edited part were to be disclosed to a third party, the third party would not find it possible from that part alone without the assistance of other information to identify a living individual. Paragraph (b) of the definition of “personal data” prevents this. It requires account to be taken of other information which is in, or is likely to come into, the possession of the data controller. Common Services Agency v Scottish Information Commissioner (Scotland) [2008] UKHL 47

So the Home Office cannot merely edit the data (by pixelation) and thus exclude it from the duty to process it in accordance with the data protection principles: these images are personal data. Moreover, they will come under the subset known as sensitive personal data, because they consist of information as to the commission or alleged commission by the data subject of any offence (they might also fall into this subset because they show the racial or ethnic origin of the data subject, but this is less certain).

The first data protection principle requires that

Personal data shall be processed fairly and lawfully and, in particular, shall not be processed unless
(a) at least one of the conditions in Schedule 2 is met, and
(b) in the case of sensitive personal data, at least one of the conditions in Schedule 3 is also met.
As this is sensitive personal data, a Schedule 3 condition must be met in order for the processing to be fair and lawful. Try as I might, I cannot find one that is (I adopt the list as explicated by the ICO)

  • The individual who the sensitive personal data is about has given explicit consent to the processing.
  • The processing is necessary so that you can comply with employment law.
  • The processing is necessary to protect the vital interests of: the individual (in a case where the individual’s consent cannot be given or reasonably obtained), or another person (in a case where the individual’s consent has been unreasonably withheld).
  • The processing is carried out by a not-for-profit organisation and does not involve disclosing personal data to a third party, unless the individual consents. Extra limitations apply to this condition.
  • The individual has deliberately made the information public.
  • The processing is necessary in relation to legal proceedings; for obtaining legal advice; or otherwise for establishing, exercising or defending legal rights.
  • The processing is necessary for administering justice, or for exercising statutory or governmental functions.
  • The processing is necessary for medical purposes, and is undertaken by a health professional or by someone who is subject to an equivalent duty of confidentiality.
  • The processing is necessary for monitoring equality of opportunity, and is carried out with appropriate safeguards for the rights of individuals.

It will be noted that two of these conditions – those relating to legal proceedings and legal rights, and to administering justice or exercising statutory or governmental functions – might be thought to apply, but one notes the word “necessary”. In no way were these tweets “necessary” for the purposes to which those conditions relate. By contrast, when authorities publish photographs of wanted criminals, the necessity test will normally be made out. It is, I suppose, just possible that the data subjects gave their explicit consent to the tweets, but that’s vanishingly unlikely. (A question does arise as to what conditions permit the processing by the police of pixelated images of potential offenders in programmes such as “Police, Camera, Action” and “Motorway Cops”: it may be that this has never been challenged, but it may also be that the data controller is in fact the film company, which might be protected by the exemption from much of the DPA if the processing of data is for journalistic purposes).

(I would observe, in passing, that many customary practices to do with publication of information about crimes or suspicion of criminal behaviour are potentially in breach of these provisions of the DPA if they are construed strictly. Although there is the journalistic exemption mentioned above, those to whom that exemption arguably does not apply (bloggers, tweeters, police, other public authorities) are at risk of breach if they, for instance, publish identifying information about people who have criminal convictions or are suspected of having committed a crime. This area of the law, and its implications for open justice, have not, I think, been fully played out yet. For discussions about it see my post and others linked here.)

If no Schedule 3 condition can be met, the processing will not be in accordance with the first data protection principle, and the data controller will be in breach of section 4(4) of the DPA. What flows from this? Well, probably very little – the data subjects have a right to serve a notice (under section 10 of the DPA) requiring the cessation of processing which is causing or likely to cause substantial unwarranted damage or distress. Additionally, they have a right either to bring a civil claim for damages (very difficult to show) or to complain to the ICO. However, data subjects like this are not necessarily going to want to assert their rights in a strident way. The Commissioner himself could intervene – he has the power to take enforcement action if he is satisfied a data controller has contravened or is contravening the data protection principles (and, much to his credit, he has recently issued notices against a Council which was requiring taxi drivers to install CCTV/audio recording facilities in all cabs, and against a Police force which was operating a “ring of steel” ANPR network). It appears, though, that the Home Office Twitter account has gone quiet (it hasn’t tweeted in several days). Perhaps there have been second thoughts not just about the legality, but also the morality, of the campaign. I am always the optimist.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Home Office, human rights, Information Commissioner, journalism, police

Let’s blame Data Protection (a new series): Part One

Data Protection is to blame for many things (sleepless nights for Data Protection officers, hits to the public purse, a proportionate measure of respect and security for people’s sensitive private information, bulging wallets for lawyers) and many people like to criticise it. In this occasional series I want to come to its defence, by pointing out examples where data protection has been wrongly blamed for a failure elsewhere. The Information Commissioner used to do something similar but seems to have given up on that (and, after all, “data protection duck out” is a cringemaking phrase).

So here’s my first example: “Vague” Data Protection Act blights fraud detection, say insurers

The facts of the article itself are fine, as one would expect if the author is Pete Swabey, but it’s the message itself that grates. According to the Chartered Insurance Institute (CII), there is a problem with section 29 of the Data Protection Act 1998 (DPA), which permits the disclosure of personal data by a data controller: the general non-disclosure provisions are disapplied if applying them would be likely to prejudice any of the following purposes: the prevention or detection of crime, the apprehension or prosecution of offenders, or the assessment or collection of any tax or duty or of any imposition of a similar nature. Normally the question of whether to disclose will arise in response to a specific request from another person or body (normally one with crime detection or prosecution powers, or tax collection powers). This comes down to a matter of applying a balancing test to specific facts: if I don’t disclose this information, would it be likely to cause prejudice to those purposes?

This is often a difficult decision for a data controller (it’s about serious matters – why should it always be easy?). But the CII complain that

the vagueness of Section 29…has led to an extremely high volume of information requests, with little consistency or clarity. This, it says, is hindering investigations. 

“Certain companies, particularly the lawyers, are sending requests out without thinking about them,” [says] David Clements, motor investigations manager at Zurich

Bad Data Protection Act! Making people ask for disclosure of personal data without giving it much thought!

Also, the fact that requests and responses are made in a haphazard, non-standard fashion creates unnecessary work for fraud investigators.

Silly Data Protection Act! Making an industry incapable of standardizing procedures!

And, indeed, the article says that the industry is trying to sort itself out

The New Generation Claims Board is working on a voluntary code of best practice to help insurance providers both improve the efficacy of their fraud investigations and reduce their risk of non-compliance. 

“We’re going to provide the industry with a best practice protocol plus a template for sending and receiving requests,” Clements explains.

But the evil Data Protection Act is still lurking about causing trouble, because this is only a voluntary scheme

as insurance companies are not even obliged to respond to Section 29(3) requests

Come on Data Protection Act, sort yourself out!


Filed under Data Protection, Information Commissioner, Let's Blame Data Protection, Uncategorized