Category Archives: Data Protection

Don’t Panic about the Royal Charter. Panic Now!

Bloggers shouldn’t panic about the proposed Royal Charter, unless they’re already panicking about the current law.

Imagine that a local citizen blogger – let’s call her Mrs B, who is a member of a local church group – decides to let others know, by way of a website, some news and information about the group. She includes information for those about to be confirmed into the church as well as extraneous, light-hearted stuff about her fellow parishioners, including the fact that one of them has a broken leg. Now imagine that a complaint by one of the fellow parishioners that this website is intrusive is upheld and Mrs B is found to have breached domestic law.

The coercive power of the state being brought against a mere blogger would be, you might imagine, unacceptable. You might imagine that any such domestic law, in a country which is a signatory to the European Convention on Human Rights, would be held to be in breach of the free-expression rights under Article 10 of the same.

This sort of outcome, you might say, would surely be unimaginable even under the proposed regulatory scheme by Royal Charter agreed in principle by the main party leaders on 18 March.

But, as anyone who knows about data protection law will tell you, exactly this happened in 2003 in Sweden, when poor Mrs Bodil Lindqvist was prosecuted and convicted under national Swedish legislation on data protection and privacy. On a reference to the European Court of Justice her actions were held to have been the “processing” of “personal data” (and, in the case of the person with the injured leg, of the higher-category “sensitive personal data”), and thus to engage Article 3(1) of Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data. That Directive was given domestic effect in Sweden by the law under which she was convicted, and is, of course, given domestic effect in the UK by the Data Protection Act 1998 (DPA).

The response to the proposed Royal Charter was heated. Many people noticed that the interpretative provisions in Schedule 4 implied the regulation of web content in general (if said content was “news-related material”), thus potentially bringing the “blogosphere” and various social media activities within its jurisdiction, and this caused much protest. For instance, Cory Doctorow wrote

In a nutshell, then: if you press a button labelled “publish” or “submit” or “tweet” while in the UK, these rules as written will treat you as a newspaper proprietor, and make you vulnerable to an arbitration procedure where the complainer pays nothing, but you have to pay to defend yourself, and that will potentially have the power to fine you, force you to censor your posts, and force you to print “corrections” and “apologies” in a manner that the regulator will get to specify.

But the irony is that this is effectively exactly the position as it currently stands under data protection law. If you publish or submit or tweet in the UK information which relates to an identifiable individual you are “processing” “personal data”. The “data subject” can object if they feel the processing is in breach of the very broad obligations under the DPA. This right of objection is free (by means of a complaint to the Information Commissioner’s Office (ICO)). The ICO can impose a monetary penalty notice (a “fine”) of up to £500,000 for serious breaches of the DPA, and can issue enforcement notices requiring certain actions (such as removal of data, corrections, apologies etc), and a breach of an enforcement notice is potentially a criminal offence.

As it is, the ICO is highly unlikely even to accept jurisdiction over a complaint like this. He will say it is covered by the exemption for processing if it is “only for the purposes of that individual’s personal, family or household affairs (including recreational purposes)”. He will say this despite the fact that this position is legally and logically unsound, and was heavily criticised in the High Court, where, in response to a statement from the ICO that

The situation would clearly be impossible were the Information Commissioner to be expected to rule on what it is acceptable for one individual to say about…another individual. This is not what my office is established to do. This is particularly the case where other legal remedies are available – for example, the law of libel or incitement.

Mr Justice Tugendhat said

 I do not find it possible to reconcile the views on the law expressed in the Commissioner’s letter with authoritative statements of the law. The DPA does envisage that the Information Commissioner should consider what it is acceptable for one individual to say about another, because the First Data Protection Principle requires that data should be processed lawfully. The authoritative statements of the law are to be found not only in the cases cited in this judgment (including para 16 above), but also by the Court of Appeal in Campbell v MGN Ltd [2002] EWCA Civ 1373 [2003] QB 633 paras [72] to [138], and in other cases. As Patten J made clear in Murray, where the DPA applies, if processing is unlawful by reason of it breaching the general law of confidentiality (and thus any other general law) there will be a contravention of the First Data Protection Principle within the meaning of s.40(1), and a breach of s.4(4) of the DPA…The fact that a claimant may have claims under common law torts, or under HRA s.6, does not preclude there being a claim under, or other means of enforcement of, the DPA.

The ICO will decline jurisdiction because, in reality, he does not have the resources to regulate the internet in its broadest sense, and nor does he have the inclination to do so. And I strongly suspect that this would also be the position of any regulator established under the Royal Charter.

I’m not normally one for complacency, and I actually think that the fact that the coercive power of the state potentially applies in this manner to activities such as blogging and tweeting is problematic (not wrong per se, note, but problematic). But the fact is that, firstly, the same coercive power already applies to the extent that such activities engage, for instance, defamation law, contempt of court or incitement laws. Secondly, and despite the High Court criticism, no one seems to be particularly exercised by the fact that the current DPA regulator is able to ignore the activities of the blogosphere. So I doubt that the social and legal will exists to regulate these activities. I hope I’m not wrong.

3 Comments

Filed under Data Protection, human rights, Information Commissioner, monetary penalty notice, Privacy

Google Streetview and “Incidental” Processing

Someone I follow on twitter recently posted a link from Google Streetview of the interior of a pub, in which he could identify himself and a friend having a quiet pint. I must confess this addition of building interiors to the Streetview portfolio had passed me by. It appears that businesses can sign up to have “Google Trusted Photographers and Trusted Agencies” take photographs of their premises, which are uploaded to the web and linked to Streetview locations.

When it was launched Streetview caused some concern in privacy circles, and this was prior to, and separate from, the concerns caused by the discovery that huge quantities of wifi payload data had been gathered and retained during the process of capture of Streetview data. These more general concerns were partly due to the fact that, in the process of taking images of streets, the Google cameras were also capturing images of individuals. Data protection law is engaged when data are being processed which relate to a living individual who can be identified from the data. To mitigate the obvious potential privacy intrusions from Streetview, Google used blurring technology to obscure faces (and vehicle number plates). In its 2009 response to Privacy International’s complaint about the then new service the Information Commissioner’s Office said

blurring someone’s face is not guaranteed to take that image outside the definition of personal data. Even with a face completely removed, it will still be entirely likely that a person would recognise themselves or someone close to them. However, what the blurring does is greatly reduce the likelihood that lots of people would be able to identify individuals whose image has been captured. In light of this, our analysis of whether and to what extent Streetview caused data protection concerns placed a great deal of emphasis on the fact that at its core, this product is in effect a series of images of street scenes…the important data protection point is that an individual’s presence in a particular image is entirely incidental to the purpose for capturing the image as a whole. (emphasis added)

One might have problems with that approach (data protection law does not talk in terms of “incidental” processing of personal data) but as an exercise in pragmatism it makes sense. However, it seems to me that the “business interiors” function of Streetview takes things a step further. Firstly, these are not now just “images of street scenes”, and secondly, it is at least arguable that an individual’s presence in, for instance, an image of an interior of a pub, is not “entirely incidental” to the image’s purpose.

Google informs the business owner that “it would be your responsibility to notify your employees and customers that the photo shoot is taking place” but that “Google may use these images in other products and services in new ways that will make your business information more useful and accessible to users”. It seems likely to me therefore that, to the extent that personal data is being processed in the publishing of these images, Google and the business owner are potentially both data controllers (with consequent responsibilities and liabilities under European law).

It would be interesting to know if the Information Commissioner’s assessment of this processing would be different given that a factor he previously placed a “great deal of emphasis on” (the fact that Streetview was then “just images of street scenes”) no longer applies.

1 Comment

Filed under Data Protection, enforcement, Information Commissioner, Privacy

We still have judgment here

Mr Justice Tugendhat makes very interesting observations about reserved judgments and open justice, in a judgment on whether a defendant is in breach of prior undertakings relating to tawdry publications about the parents of Madeleine McCann:

The decision not to identify in a reserved judgment a fact or person that has been identified in open court is not a reporting restriction, nor any other derogation from open justice. The hearing of this committal application was in public in the usual way. The decision not to set out everything in a judgment is simply a decision as to how the judge chooses to frame the judgment (¶86)

I have previously written about discussions taking place about the privacy and data protection implications of electronic publication of lists from magistrates’ courts, and I also wrote a thesis (NEVER to see the light of day thank you very much) which attempted in part to deal with the difficulties of anonymisation in court documents. These seem to me to be very urgent, and tremendously difficult, considerations for the subject of open justice in the digital era (the title of the initiative, led by Judith Townend, to “make recommendations for the way judicial information and legal data are communicated in a digital era”).

The judgment continues with Tugendhat J observing that, in previous cases where he has referred to parties by initials in reserved judgments this has sometimes been misinterpreted as his having made an anonymity order. Not true: the proceedings themselves were in open court, but

what happens in court, if not reported at the time, may be ephemeral, and may soon be forgotten and become difficult to recover, whereas a reserved judgment may appear in law reports, or on the internet, indefinitely (¶87)

This is a crucial point. My concern has always been about the permanence of information published on the internet, and the potential for it to be used, and abused, in ways, and under jurisdictions, which would make a mockery of, for instance, the Rehabilitation of Offenders Act 1974 and the Data Protection Act 1998.

I haven’t noted the judge’s comments for any particular reason, other than I think they helpfully illustrate some important points, and might provoke some discussion.

1 Comment

Filed under Confidentiality, court lists, Data Protection, Open Justice, Privacy, Rehabilitation of offenders

Smeaton v Equifax overturned

The Court of Appeal has overturned what had seemed an important, if controversial, judgment on the legal duties owed by Credit Reference Agencies to those about whom they hold records and issue reports.

I blogged in May last year about a High Court claim for damages under section 13 of the Data Protection Act 1998 (DPA). The claimant, Mr Smeaton, successfully argued that, as a result of processing inaccurate data about his credit history, the Credit Reference Agency (CRA) Equifax was in breach of the fourth data protection principle, and that Equifax’s obligations under the DPA as a data controller meant that it owed a duty of care to Mr Smeaton in tort. Accordingly, damages were owed (to be assessed at a later date).

The case has now been comprehensively overturned in the Court of Appeal. Primarily, the appeal succeeded because the judge’s findings on causation (i.e. had the inaccuracy in Mr Smeaton’s credit record led to the detriment pleaded?) were not sustainable. Lord Justice Tomlinson, giving the lead judgment, was highly critical of the judge’s approach

the judge’s conclusion that the breaches of duty which he identified caused Mr Smeaton loss in that they prevented Ability Records from obtaining a loan in and after mid-2006 is in my view not just surprising but seriously aberrant. It is without any reliable foundation and completely unsupported, indeed contradicted, by the only evidence on which the judge could properly rely (¶11)

That effectively dispensed with the claim for damages, but Equifax, clearly concerned about the implications of the original findings regarding a breach of the DPA and consequent breach of a duty of care, asked the Appeal Court to consider these points as well.

Was there a DPA breach?

Tomlinson LJ held that the procedures which obtained at the time of the alleged DPA breach, regarding the annulment (and communication thereof) of bankruptcy orders, had never attracted any expression of concern from either the Information Commissioner or the Insolvency Service. At first instance the judge had observed that inaccurate personal data could be “particularly damaging”. Tomlinson LJ did not demur, but said that

it is necessary to put this important principle into context and to maintain a sense of proportion. In the context of lending, arrangements have been put in place to ensure that an applicant for credit should not suffer permanent damage as a result of inaccurate information appearing on his file (¶59)

Those arrangements are described in guidance published by, or approved by, the Information Commissioner, and include the fact that, in the event of a failed credit application

[the] lender must tell a failed applicant by reference to the data of which CRA an application was declined, if it was, and the failed applicant, like any consumer, has the right to obtain a copy of his file from a CRA on payment of £2.00

and mistakes can thus be corrected.

Moreover, CRAs must, by reference to the Guide to Credit Scoring 2000, not decline a repeat application “solely on the grounds of having made a previously declined or accepted application to that credit grantor”. This and other guidance provided inbuilt safeguards against the kind of detriment Mr Smeaton claimed to have suffered. Ultimately

Equifax did take steps to ensure that its bankruptcy data was accurate. It obtained the data from a reliable and authoritative source in the form of the [London] Gazette, it transferred the data accurately onto its data bases from that source and it amended its data immediately upon being made aware that it was inaccurate…the judge was wrong to conclude that Equifax had failed to take reasonable steps to ensure the accuracy of its data (¶81)

Was there a co-extensive duty of care in tort?

Here Tomlinson LJ considered the “traditional three-fold test of foreseeability, proximity and whether it is fair, just and reasonable to impose a duty” and held comprehensively that there was not. He agreed with the argument of counsel for Equifax that

(1) It is doubtful whether it was reasonably foreseeable that the recording of incorrect data on Mr Smeaton’s credit reference would cause him any loss…
(2) It would also not be fair, just or reasonable to impose a duty. In particular, imposing a duty owed to members of the public generally would potentially give rise to an indeterminate liability to an indeterminate class…
(3) It would also be otiose given that the DPA provides a detailed code for determining the civil liability of CRAs and other data controllers arising out of the improper processing of data
(4) Parliament has also enacted detailed legislation governing the licensing and operation of CRAs and the correction of inaccurate information contained in a credit file in the CCA 1974. This provides for the possibility of criminal sanctions, but does not create any right to civil damages. In such circumstances it would not be appropriate to extend the law of negligence to cover this territory (¶75)

The third of these seems to make it clear that the courts will be reluctant to allow for a notion of an actionable duty of care on a data controller to process personal data fairly and lawfully. (This is in contrast, interestingly, with the situation in Ireland, where a statutory provision (section 7 of the Data Protection Act 1988) states that such a duty of care is owed (at least to the extent that “the law does not so provide”)).

My post on the first instance case has been one of the most-read (it’s all relative, of course – there haven’t been that many readers) so I think it only correct to post this update following the Court of Appeal judgment.

2 Comments

Filed under Data Protection, Information Commissioner, Uncategorized

Courts, Contempt and Data Protection

Can it be possible for HM Courts and Tribunals Service – who have responsibility for publishing court lists – to publish those same lists in an unlawful way?

Richard Taylor, a blogger and mySociety volunteer, recently uploaded an intriguing blog post. Entitled Cambridge Magistrates Court Lists Obtained via Freedom of Information Request, it described Richard’s request to HM Courts and Tribunals Service (HMCTS) for

 …the information which would be expected to appear on the full copy of the court list in relation to appearances, hearings, trials etc. currently scheduled to be held in Cambridge Magistrate’s Court [five specified days]

HMCTS, commendably, in Richard’s words (amazingly, in mine), responded to him within six days. The disclosure was, by any standards, extraordinary. Richard had made the request using the whatdotheyknow.com portal. This service means that any disclosure made by a public authority is by default uploaded to the internet for anyone to see. What was uploaded by HMCTS included

 …the identity of victims of crimes people were being charged with, including a girl under 14 who was named in relation to an indecent assault charge

As Richard points out, the anonymity of victims of alleged sexual offences is protected by law. Section 1 of the Sexual Offences (Amendment) Act 1992 (SO(A)A) provides that

neither the name nor address, and no still or moving picture, of [a victim of an alleged sexual offence] shall during that person’s lifetime…be published in England and Wales in a written publication available to the public

These necessary derogations from the principles of open justice cannot extend to complete anonymity. For obvious reasons, the name of a victim of an alleged sexual offence will need to be before a court in the event of a trial. So the meaning of a “written publication available to the public” does not include (per s6 SO(A)A)

an indictment or other document prepared for use in particular legal proceedings

It appears that the lists disclosed to Richard would fall into this category. However, disclosure of such a document under FOIA, which is taken to be disclosure to the world at large (and, in the case of whatdotheyknow.com, effectively is), would extend its “use” so far beyond those particular legal proceedings that it would undermine the whole intention of section 1 of SO(A)A. It seems that HMCTS recognised this, because they subsequently contacted Richard and confirmed that the information was disclosed in error.

We believe the majority of the information in the Court Lists is exempt from disclosure under Section 32 (Court Records) and Section 40 (Personal Information) of the Freedom of Information Act. We also believe provision and publication of sensitive personal data may also breach The Data Protection Act.

Well, I hate to be a tell-tale, but this seems to be a tacit admission that the disclosure to Richard was an extremely serious breach of the Data Protection Act 1998 (DPA). It was also potentially in breach of SO(A)A, and potentially an act of contempt under the Magistrates’ Courts Act 1980 (MCA), section 8(4) of which permits publication, before a trial, only of certain information relating to committal proceedings, and the names of alleged victims certainly do not fall under that sub-section. But can a court (or at least, a court service) be in contempt of itself by digitally disclosing (publishing) to the world information which it is required otherwise to disclose publicly?

While a distinction should be drawn between a “full” list, such as was inadvertently disclosed to Richard, and the “noticeboard” lists habitually stuck up outside the court room, the points raised by this incident exemplify some crucial considerations for the development of the justice system in a digital era. It seems clear that, even if a court were permitted to disclose this or similar information, re-publication by others would infringe one or all of the SO(A)A, DPA and MCA. What this means for the advancement of open justice, the protection of privacy rights and indeed the rehabilitation of offenders is something I hope to try to grapple with in a future post (or posts).

3 Comments

Filed under Breach Notification, court lists, Data Protection, Open Justice, Rehabilitation of offenders

Opt Me Out! Please

Do some barriers to opting out of direct marketing risk a breach of the Data Protection Act?

I’m trying to open a credit card account: long interest-free periods are useful for those who are careful with their money. They’re also useful for people like me.

My application was going fine until the point at which I was asked to agree to their policy on the use of my information for marketing purposes. This says

[Generic Financial Services Company] may inform me of special offers, products and services, either by letter, telephone or e-mail. If I am a new GFSC customer and I do not wish to receive marketing material by letter, telephone or email, or any combination of these I can write to you at GFSC, Marketing opt-out, FREEPOST XXXX

Thanks GFSC, but I don’t have to send you snail mail to opt-out of marketing. Section 11 of the Data Protection Act 1998 (DPA) simply says I can serve a notice in writing requiring you to cease, or not to begin, processing my personal data for the purposes of direct marketing. “In writing” includes, by virtue of section 64 of the DPA, email.

So I agreed to the terms of their marketing statement (I didn’t have to do that by snail mail, of course – I just ticked a box) and then very cleverly emailed them serving a section 11 notice requiring them not to begin marketing, and asking them to confirm receipt of the notice.

However, I’ve now received a friendly email saying

Thank you for your message. The email service you have used is not 100% secure and we’re unable to reply to you using this service.  Emails can be intercepted which is why we provide secure messaging within our Online Banking facility.  I’m unable to access your account details and provide the information you require. I want to answer your query, but in a secure environment…

I didn’t “require” any specific information (other than an acknowledgement of receipt) and I was not wishing to discuss any matters which required secure email correspondence (I had freely provided my name and address). And I don’t have account details, because they haven’t accepted me as a customer yet.

So now I’m in limbo. I agreed to receive direct marketing, by ticking an online box, but immediately served a section 11 notice which they presumably won’t pay any attention to.

However, in strict terms the fact I got a reply to my email confirms that my notice was received. It may not mean I won’t get direct marketing, but it does probably mean that any such marketing would be sent to me unlawfully, in breach of section 11 of the DPA, as well as the first, second and sixth principle in Schedule One, and (therefore) section 4(4).

Having said all this, I’m not sure I should name this nation wide financial institution, because I still want the service, and my principles don’t quite extend to withdrawing my application under these circumstances. I’m left wondering: what should I do?

2 Comments

Filed under Data Protection

A Fairy Tale of Wilmslow

A clunkingly fatuous fairy tale for Christmas

Once upon a time, in a land far away, there were villages where the villagers were told by the king to look after some valuable possessions of other people, and though they tried hard to protect these items, they had limited money with which to do so.

Most villagers did everything they could to protect these precious items, but sometimes the village elders overlooked the risks, or decided to spend some of the villages’ meagre earnings on other important things. And sometimes some of the stupid villagers took risks, or other villagers, though they were not stupid, still took stupid risks. This all meant that, just sometimes, the valuable items got lost, or given to the wrong people, or maybe even stolen.

The Sheriff of the Land was a good and strong man, and he too was worried about these precious items. He encouraged village elders to tell him when something happened to the items. When he thought the villages had really been bad, or unwise, he would fine them, and so they had even less money. And the villages would try very hard to improve, and they would listen to all the Sheriff’s edicts, and try to do what was right.

Most people in the Land, and in the villages themselves, accepted this: they knew that it was important that the Sheriff showed everyone he was strong, and wouldn’t tolerate loss of or risk to the precious items.

However, in the towns, there were people who had also been asked by the king to look after others’ valuable possessions. Some of these people were very irresponsible, and they often lost the items, or had them stolen, and, what was worse, they wouldn’t confess this to the Sheriff. And even though the Sheriff knew about this, he mostly allowed the lawlessness to continue, because it was so rife, and because some of the townspeople were very powerful.

And so it was that the villagers found it hard to bear when the Sheriff issued public proclamations that said how badly they – even those in villages which had never done anything wrong – protected the precious items. They found it especially hard to bear because it was their own precious items which were being treated with so little care in the Outlaw Towns.

Information Commissioner Christopher Graham said yesterday:

“We are fast approaching two million pounds worth of monetary penalties issued to UK councils for breaching the Data Protection Act, with nineteen councils failing to have the most straightforward of procedures in place

“It would be far too easy to consider these breaches as simple human error. The reality is that they are caused by councils treating sensitive personal data in the same routine way they would deal with more general correspondence. Far too often in these cases, the councils do not appear to have acknowledged that the data they are handling is about real people, and often the more vulnerable members of society.

“The distress that these incidents would have caused to the people involved is obvious. The penalties we have issued will be of little solace to them, but we do hope it will stop other people having to endure similar distress by sending out a clear message that this type of approach to personal data will not be tolerated.

“There is clearly an underlying problem with data protection in local government and we will be meeting with stakeholders from across the sector to discuss how we can support them in addressing these problems.”

2 Comments

Filed under Data Protection, Information Commissioner, satire

MPs and Data Protection Offences, part etc etc

In which I bore again by banging on about the ICO’s apparent non-action against MPs who might be committing Data Protection offences

I’ve blogged on this before. To recap: MPs have the same obligations as any other data controller under section 17 of the Data Protection Act 1998 (DPA) to notify the Information Commissioner’s Office (ICO) of their processing of personal data. Most do so; some appear not to. Processing personal data without a notification or a suitable exemption constitutes a criminal offence under section 21 of the DPA.

In my previous posts I’ve questioned why the ICO appears to take a lenient approach to MPs’ legal obligations. Maybe I’ve made more of it than I should, and I’m pleased to see that the majority of those I named in my second post on the subject have now put things right.

However, two of the names in that previous list continue not to have an entry on the ICO register. There may be a reason for this (the list may not, for instance, have been updated) but it suggests that Jim Shannon MP has processed personal data without an appropriate registration since his last notification expired on 29 November 2010 and Pat Doherty MP has similarly processed personal data since 20 January 2011.

It’s not as though the ICO never prosecutes for this offence. He announced on twitter today that there had been a successful prosecution of two spamming scumbags, owners of a marketing company, for non-notification (both received £2000 fines). While reading this, I noticed that there had also been, on 28 November, a successful prosecution (she pleaded guilty) of a barrister for the same offence. For reasons of mitigating circumstances she received an absolute discharge. However, the ICO reports that

the magistrate warned that those whose profession is to prosecute people for failing to comply with the law must meet their legal obligations

If this magistrate can warn lawyers to observe their legal obligations, because they (act for those who) prosecute offences, where is the warning from the prosecutor to those who actually make the laws?

1 Comment

Filed under Data Protection, Information Commissioner

An Irresponsible Press Release?

What is the basis for the ICO saying the private sector is better at data protection than the public?

I defended the Information Commissioner’s Office (ICO) today, over a poor Register headline which suggested they were “red-faced” about imposing monetary penalty notices on NHS bodies (of course they’re not). To their great credit, the Register reworded the headline. Shortly afterwards, the ICO issued a headline of their own in a press release

Private Sector leads the way on data protection compliance but room for improvement elsewhere

Behind this headline are four reports on the ICO’s Data Protection Act 1998 (DPA) audit activities over the last two years. Each report relates to a “sector”, so we have:

Audit outcomes, central government (February 2010 – July 2012)

Audit outcomes, local authorities (February 2010 – July 2012)

Audit outcomes, NHS (February 2010 – July 2012)

Audit outcomes, private sector (February 2010 – July 2012)

Ignore for a moment the fact that the distinction between “private” and “public” sector is increasingly an artificial one – what I want to focus on is the evidential basis for the assertions made by the ICO, and why I think they are potentially damaging to the interests of data subjects. The press release goes on to say

[the reports have] highlighted the positive approaches many private sector companies are adopting to look after people’s data. However concerns remain about data protection compliance within the local government sector and the NHS…Within the private sector, the ICO had a high level of assurance that 11 out of the 16 companies audited had policies and procedures in place to comply with the Act…In the health service only one of the 15 organisations audited provided a high level of assurance to the ICO, with the local government sector showing a similar trend with only one out of 19 organisations achieving the highest mark. Central government departments fair little better with two out of 11 organisations achieving the highest level of assurance.

Let’s stop for a second to consider the nature of the audits we are looking at. The ICO does not have a general power to audit data controllers without their consent, although he does have that power over central government data controllers. So how does a data controller come to consent to an ICO audit? Very commonly it’s a result of a self-reported data breach, or following an ICO investigation giving rise to DPA concerns. The three arms of the public sector represented in these reports are required or expected to comply with specific data protection guidance: for central government it is the Cabinet Office Data Handling Procedures, for Local Government the LGA/SOCITM Data Handling Guidelines (derived from the Cabinet Office procedures), and for the NHS, the very robust Information Governance Toolkit. Each of these contains explicit directions that a serious DPA breach be reported to the ICO.

There is, of course, no such guidance for the “private sector” (although the ICO encourages data controllers, whether public or private sector, to self-report breaches).

Similarly, public sector organisations are subject to public law obligations and public-law-based corporate governance procedures, which create an expectation both that any breaches will be self-reported and that a suggestion by the ICO of a consensual audit will be accepted.

Private sector organisations, while they have corporate governance obligations, are quite different. Responsibility to shareholders or owners is not the same thing as a public obligation.

What this means is that there are huge questions about how representative the sample of audited organisations cited by the ICO is of each sector as a whole, and hence how far it can support the contention that the “private sector leads the way on data protection compliance”. Additionally, the numbers used to draw this conclusion are so small that, even if the sectors were fully comparable, I doubt whether the differences would tell us much of anything.
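To illustrate the small-numbers point (this is my own back-of-the-envelope sketch, not anything from the ICO’s reports): with samples of this size, the uncertainty around any sector’s “pass rate” is enormous. A standard Wilson score interval makes that concrete:

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    margin = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return (centre - margin, centre + margin)

# The ICO's figures: 11 of 16 private sector companies gave a high level
# of assurance, against 1 of 19 local authorities.
lo_priv, hi_priv = wilson_ci(11, 16)  # roughly 44% to 86%
lo_la, hi_la = wilson_ci(1, 19)       # roughly 1% to 25%
```

A “pass rate” that could plausibly sit anywhere between 44% and 86% is a fragile basis for a league table, quite apart from the selection-bias problems above.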

I’m not going to list the numerous examples of poor private sector compliance which arguably give the lie to the ICO’s contention. I’m not even going to moan much about the fact that we will see this headline unthinkingly regurgitated over the coming weeks.

But what I am going to say is that I think this was an irresponsible press release. It was irresponsible because I simply cannot accept the universal premise of a statement that “the private sector leads the way on data protection compliance”. And because I can imagine that, while a public sector data protection officer somewhere is shrugging his or her shoulders and going about the task with an extra dose of world-weariness, a private sector management board somewhere else is concluding that perhaps it doesn’t need to worry too much about data security, or regulation by the ICO.

UPDATE: 12.10.12

I’ve had an email from a nice spokesman from the ICO press office, who wanted to give some further context, and clarified one point. He said

Motivation for agreeing to audit is undoubtedly a relevant context to the results we published, particularly given that, as you highlight, the ICO doesn’t have the power to compel organisations to submit to an audit. It isn’t true, though, that public sector audits are often the result of self-reported data breaches. In fact, most of our audits come from the ICO writing to organisations and asking them to volunteer, not as a direct result of a breach being reported.

Fair point, and I’m happy to clarify that, most of the time, the ICO’s invitation to volunteer for an audit is not a direct result of a self-reported breach. Although I am pretty certain the ICO would not be sending that invitation if he hadn’t determined, either as a result of a self-reported breach or a complaint from a data subject, that there had been a breach of the DPA.

The spokesman went on to say

This is much the same as our approach to the private sector, though fewer private sector firms take up the opportunity, as we highlight in our report (perhaps due to the responsibility to shareholders versus public obligation argument you highlight in your blog).

I’m glad that there is an implicit admission there that audited public and private sector data controllers are not directly comparable. I rather wish the press release had said so.

But this next bit I’m not sure about

One of the purposes of this type of press release is to increase that take up and share best practice, by highlighting the availability of our audits.

Now, when training external (public sector) organisations, I’ve often suggested that, if they feel relatively confident about their data protection compliance, they should consider inviting the ICO to audit them, because its auditors are fair, thorough and experienced (those who are not confident about their compliance I advise to get a consultant in first…). However, I’m not sure I could so readily recommend an ICO audit now, given what I maintain are the unfair comparisons drawn in this press release. Indeed, two public sector officers have now told me on Twitter that it has actively dissuaded them from volunteering for an audit. That cannot be good.

8 Comments

Filed under Breach Notification, Data Protection, Information Commissioner

Data Security and Churnalism

On the lazy reporting of a silly story about increases in data breaches

Over the past couple of days the following have all published stories claiming that data breaches in the UK have “rocketed” or “spiked” by an “alarming” 1000% over the last five years.

Computer Business Review
Techweek Europe
The Nextweb
Public Service
Help Net Security
V3.co.uk
Computing.co.uk
SC Magazine
UKAuthority.com
The Register
Computer World UK
The BBC

These are mostly well-respected news sources, serving either the tech industries or the public sector. All of them report this story as though the news that self-reporting of serious data breaches to the Information Commissioner has increased were a bad thing. I’ve given the links to the stories not because I want to increase their clicks, but to show the remarkable similarity between them. This is not surprising, as they are all picking up on a press release by Imation (which, ironically, as a non-hack, I don’t have access to), issued following an FOI request to the Information Commissioner. The response to the request showed that, indeed, in 2007-08 the number of breaches reported to the ICO was 79, and in 2011-12 it was 828. But does that really mean that “Data breaches in the UK have increased tenfold in the past five years”, as the BBC put it?
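The arithmetic behind the headline figure is, for what it’s worth, easy enough to check (my own quick calculation):

```python
# Figures from the FOI response: breaches reported to the ICO
reported_2007_08 = 79
reported_2011_12 = 828

# Ratio of the two years, and the percentage increase
ratio = reported_2011_12 / reported_2007_08                                    # about 10.5x
pct_increase = (reported_2011_12 - reported_2007_08) / reported_2007_08 * 100  # about 948%
```

So “tenfold” and “1000%” are fair enough approximations of a roughly 948% rise. The problem is not the arithmetic; it is what the figure actually measures.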

The answer, certainly, is “no”.

The reporting of breaches has increased by that proportion. But that is not particularly surprising. As far as I recall, the first guidance issued by the ICO on reporting serious breaches only appeared in July 2010. Before that, while there may have been an inferable assumption that serious breaches should be reported, there was little in the way of clear direction or expectation. That expectation has become much more explicit since the ICO gained powers to issue civil monetary penalties for serious breaches. Now, all major data controllers know that a serious breach of data security needs to be reported to the ICO (and for telecoms providers there is a legal requirement to do so under the Privacy and Electronic Communications (EC Directive) Regulations 2003).

But is it a bad thing that the number of reported incidents has increased? Of course not. All breaches of data security are to be regretted, and lessons should be learned so that they don’t recur. But data controllers need to be encouraged to recognise breaches, and to put their hands up when they happen. The ICO even considers self-reporting a mitigating factor when assessing what action to take.

I doubt that many, if any, of the people writing for the websites I link to above really think that data security breaches (rather than reports of breaches) have increased 1000% over five years. I’m sure their writers and reporters are very busy, and an eye-catching press release makes for easy copy. But these websites (with the exception of the BBC) are important and specialist sources of information. For them to resort to “churnalism” (a form of journalism in which press release…are used to create articles…without undertaking further research or checking) at the expense of common sense, especially when it might lead to greater reluctance to self-report, is greatly to be regretted.

1 Comment

Filed under Breach Notification, Data Protection, Information Commissioner, PECR