Category Archives: Europe

Data protection and fake pornography

Wired’s Matt Burgess has written recently about the rise of fake pornography created using artificial intelligence software, something that I didn’t know existed (and now rather wish I hadn’t found out about):

A small community on Reddit has created and fine-tuned a desktop application that uses machine learning to morph non-sexual photos and transplant them seamlessly into pornographic videos.

The FacesApp, created by Reddit user DeepFakesApp, uses fairly rudimental machine learning technology to graft a face onto still frames of a video and string a whole clip together. To date, most creations are short videos of high-profile female actors.

The piece goes on to discuss the various potential legal restrictions or remedies which might be available to prevent or remove content created this way. Specifically within a UK context, Matt quotes lawyer Max Campbell:

“It may amount to harassment or a malicious communication,” he explains. “Equally, the civil courts recognise a concept of ‘false privacy’, that is to say, information which is false, but which is nevertheless private in nature.” There are also copyright issues for the re-use of images and video that wasn’t created by a person.

However, what I think this analysis misses is that the manipulation of digital images of identifiable individuals lands this sort of sordid practice squarely in the field of data protection. Data protection law relates to “personal data” –  information relating to an identifiable person – and “processing” thereof. “Processing” is (inter alia)

any operation…which is performed upon personal data, whether or not by automatic means, such as…adaptation or alteration…disclosure by transmission, dissemination or otherwise making available…

That pretty much seems to encapsulate the activities being undertaken here. The people making these videos would be considered data controllers (persons who determine the purposes and means of the processing), and subject to data protection law, with the caveat that, currently, European data protection law, as a matter of general principle, only applies to processing undertaken by controllers established in the European Union. (In passing, I would note that the exemption for processing done in the course of a purely personal or household activity would not apply to the extent that the videos are being distributed and otherwise made public).

Personal data must be processed “fairly”, and, as a matter of blinding obviousness, it is hard to see any way in which the processing here could conceivably be fair.

Whether victims of this odious sort of behaviour will find it easy to assert their rights, or bring claims, against the creators is another matter. But it does seem to me to be the case here, unlike in some other cases, that (within a European context/jurisdiction) data protection law potentially provides a primary initial means of confronting the behaviour.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Europe, fairness, Uncategorized

Anti-EU campaign database – in contravention of data protection laws?

The site reports that an anti-EU umbrella campaign called Leave.EU has been written to by the Information Commissioner’s Office (ICO) after allegedly sending unsolicited emails to people who appear to have been “signed up” by friends or family. The campaign’s bankroller, UKIP donor Arron Banks, reportedly said

We have 70,000 people registered and people have been asked to supply 10 emails of friends or family to build out (sic) database

Emails sent to those signed up in this way are highly likely to have been sent in breach of the campaign’s obligations under the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), and the ICO is reported to have written to the campaign to

inform them of their obligations under the PECR and to ask them to suppress [the recipient’s] email address from their databases

But is this really the main concern here? Or, rather, should we (and the ICO) be asking what on earth is a political campaign doing building a huge database of people, and identifying them as (potential) supporters without their knowledge? Such concerns go to the very heart of modern privacy and data protection law.

Data protection law’s genesis lies, in part, in the post-war desire of European nations to ensure “a foundation of justice and peace in the world”, as the preamble to the European Convention on Human Rights states. The first recital to the European Community Data Protection Directive of 1995 makes clear the importance of those fundamental rights to data protection law.

The Directive is, of course, given domestic effect by the Data Protection Act 1998 (DPA). Section 2 of that Act provides that information as to someone’s political opinions is her sensitive personal data: I would submit that presence on a database purporting to show that someone supports the UK’s withdrawal from the European Union is also her sensitive personal data. Placing someone on that database, without her knowledge or ability to object, will be manifestly “unfair” when it comes to compliance with the first data protection principle. It may also be inaccurate, when it comes to compliance with the fourth principle.

I would urge the ICO to look much more closely at this – the compiling of (query inaccurate) secret databases of people’s political opinions has very scary antecedents.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under accuracy, Data Protection, Directive 95/46/EC, Europe, human rights, Information Commissioner

Google’s Innuendo

If you search on Google for my name, Jon Baines, or the full version, Jonathan Baines, you see, at the foot of the page of search results

Some results may have been removed under data protection law in Europe. Learn more

Oh-ho! What have I been up to recently? Well, not much really, and certainly nothing that might have led to results being removed under data protection law. Nor, similarly, have John Keats, Eleanor Roosevelt and Nigel Molesworth (to pick a few names at random), a search on all of whose names brings up the same message. And, of course, if you click the hyperlink marked by the words “Learn more” you find out that in fact Google has simply set its algorithms to display the message in Europe

when a user searches for most names, not just pages that have been affected by a removal.

It is a political gesture – one that reflects Google’s continuing annoyance at the 2014 decision – now forever known as “Google Spain” – of the Court of Justice of the European Union which established that Google is a data controller for the purpose of search returns containing personal data, and that it must consider requests from data subjects for removal of such personal data. A great deal has been written about this, some bad and some good (a lot of the latter contained in the repository compiled by Julia Powles and Rebekah Larsen) and I’m not going to try to add to that, but what I have noticed is that a lot of people see this “some results may have been removed” message, and become suspicious. For instance, this morning, I noticed someone tweeting to the effect that the message had come up on a search for “Chuka Umunna”, and their supposition was that this must relate to something which would explain Mr Umunna’s decision to withdraw from the contest for leadership of the Labour Party. A search on Twitter for “some results may have” returns a seething mass of suspicion and speculation.

Google is conducting an unnecessary exercise in innuendo. It could easily rephrase the message (“With any search term there is a possibility that some results may have been removed…”) but chooses not to do so, no doubt because it wants to undermine the effect of the CJEU’s ruling. It’s shoddy, and it drags wholly innocent people into its disagreement.

Furthermore, there is an argument that the exercise could be defamatory. I am not a lawyer, let alone a defamation lawyer, so I will leave it to others to consider that argument. However, I do know a bit about data protection, and it strikes me that, following Google Spain, Google is acting as a data controller when it processes a search on my name and displays a list of results with the offending “some results may have been removed” message. As a data controller it has obligations, under European law (and UK law), to process my personal data “fairly and lawfully”. It is manifestly unfair, as well as wrong, to insinuate that information relating to me might have been removed under data protection law. Accordingly, I have written to Google, asking for the message to be removed:

Google UK Ltd
Belgrave House
76 Buckingham Palace Road
London SW1W 9TQ

16 May 2015

Dear Google

Complaint under Data Protection Act 1998

When a search is made on Google for my name “Jonathan Baines”, and, alternatively, “Jon Baines”, a series of results are returned, but at the foot of the page a message (“the message”) is displayed:

Some results may have been removed under data protection law in Europe. Learn more

To the best of my knowledge, no results have in fact been removed.

The first principle in Schedule One of the Data Protection Act 1998 (DPA) requires a data controller to process personal data fairly and lawfully. In the circumstances I describe, “Jonathan Baines”, “Jon Baines” and the message constitute my personal data, of which you are clearly data controller.

It is unfair to suggest that some results may have been removed under data protection law. This is because the message carries an innuendo that what may have been removed was content that was embarrassing, or that I did not wish to be returned by a Google search. This is not the case. I do not consider that the hyperlink “Learn more” nullifies the innuendo: for instance, a search on Twitter for the phrase “some results may have been removed” provides multiple examples of people assuming the message carries an innuendo meaning.

Accordingly, please remove the message from any page containing the results of a search on my name Jonathan Baines, or Jon Baines, and please confirm to me that you have done so. You are welcome to email me to this effect at [REDACTED]

With best wishes,
Jon Baines


The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with. Some words may have been removed under data protection law.


Filed under Data Protection, Europe

One for the Environmental Information Regulations + Data Protection nerds

In 2010 the Court of Justice of the European Union (CJEU) held that, insofar as they required the automatic publication of the name and other particulars of natural persons (as opposed to legal persons) of beneficiaries of funds deriving from the European Agricultural Guarantee Fund (EAGF) and the European Agricultural Fund for Rural Development (EAFRD), certain articles of European Council Regulation (EC) No 1290/2005 of 21 June 2005 on the financing of the common agricultural policy were invalid. This was because they imposed an obligation to publish personal data relating to these beneficiaries (who might be private individuals or sole traders) without permitting criteria such as the periods, frequency and amounts involved to be considered.

Rip-roaring start to a blog post eh?

In the words of the First-tier Tribunal (Information Rights) (FTT) which has recently had to consider the impact of those CJEU cases on an Environmental Information Regulations 2004 (EIR) case

[the CJEU] ruled that such a requirement for publication was incompatible with an individual’s right for privacy where the agreement holder concerned was a private individual or sole trader

The relevance of the European judgments was that Natural England, which had until 2010 published information about beneficiaries of funds granted to farmers and landowners under the European Stewardship Agreement (ESA), even when it consisted of personal data of private individual or sole trader beneficiaries, ceased such automatic publication and removed previously published information from its website. This was despite the fact applicants for an ESA had, until 2010, been given a privacy notice in a handbook which explained that the information would be published, and had signed a declaration accepting the requirements.

Notwithstanding this, when it received a request for agreements reached with farmers and landowners in the River Avon flood plains area, Natural England decided that the personal data of the beneficiary (there appears to have been just one) was exempt from disclosure under regulations 12(3) and 13 of the EIR (which broadly provide an exception to the general obligation under the EIR to disclose information if the information in question is personal data, disclosure of which would be in breach of the public authority’s obligations under the Data Protection Act 1998 (DPA)).

The Information Commissioner’s Office had agreed, saying

although consent for disclosure has been obtained [by virtue of the applicant’s declaration of acceptance of the handbook’s privacy notice], circumstances have changed since that consent was obtained. As Natural England’s current practice is not to publish the names of those who have received grants with the amounts received, the Commissioner is satisfied that the expectation of the individuals concerned will be that their names and payments will not be made public.

However, the FTT was not convinced by this. Although it accepted that it was possible “that the applicant no longer expected the relevant personal data to be disclosed” it considered whether this would nevertheless be a reasonable expectation, and it also took into account that the effect of the CJEU’s decision had not been expressly to prohibit disclosure (but rather that the validity of automatic publication had been struck down):

When one combined the facts that an express consent had been given, that there had been no publicity by NE or mention on its website of the ECJ decision and finally, that the effect of that decision had not, in the event been to prohibit disclosure, [the FTT] concluded that such an expectation would not be reasonable

Furthermore, given that there was no real evidence that disclosure would cause prejudice or distress to the applicant, given that some identifying information had already been disclosed into the public domain, and given that there was a legitimate interest – namely “accountability in the spending of public monies” – in the information being made public (and disclosure was necessary to meet this legitimate interest), the disclosure was both fair and supported by a permitting condition in Schedule 2 of the DPA. For these reasons, disclosure would not, said the FTT, breach Natural England’s obligation to process personal data fairly under the first data protection principle.

So maybe not the most ground-breaking of cases, but it is relatively rare for an FTT to disagree with the ICO and order disclosure of personal data under the EIR (or FOIA). The ICO is, after all, the statutory regulator of the DPA, and its views on such matters will normally be afforded considerable weight by any subsequent appellate body.


Filed under Data Protection, Environmental Information Regulations, Europe, Freedom of Information, Information Commissioner, Information Tribunal

Do bloggers need to register with the ICO?

A strict reading of data protection law suggests many (if not all) bloggers should register with the ICO, even though the latter disagrees. And, I argue, the proposal for an Information Rights Levy runs the risk of being notification under a different name

Part III of the Data Protection Act 1998 (DPA) gives domestic effect to Article 18 of the European Data Protection Directive (the Directive). It describes the requirement that data controllers notify the fact that they are processing personal data, and the details of that processing, to the Information Commissioner’s Office (ICO). It is, on one view, a rather quaint throwback to the days when processing of personal data was seen as an activity undertaken by computer bureaux (a term found in the predecessor Data Protection Act 1984). However, it is law which is very much in force, and processing personal data without a valid notification, in circumstances where the data controller had an obligation to notify, is a criminal offence (section 21(1) DPA). Moreover, it is an offence which is regularly prosecuted by the ICO (eleven such prosecutions so far this year).

These days, it is remarkably easy to find oneself in the position of being a data controller (“a person who (either alone or jointly or in common with other persons) determines the purposes for which and the manner in which any personal data are, or are to be, processed”). There are, according to the ICO, more than 370,000 data controllers registered. Certainly, if you are a commercial enterprise which in any way electronically handles personal data of customers or clients it is almost inevitable that you will be a data controller with an obligation to register. The exemptions from registration are laid out in regulations, and are quite restrictive – they are, in the main, the following (wording taken from the ICO Notification Handbook):

Data controllers who only process personal information for: staff administration (including payroll); advertising, marketing and public relations (in connection with their own business activity); and accounts and records.
Some not-for-profit organisations.
Maintenance of a public register.
Processing personal information for judicial functions.
Processing personal information without an automated system such as a computer.
But there is one other, key exemption. This is not within the notification regulations, but at section 36 of the DPA itself, and it exempts personal data from the whole of the Act if it is
processed by an individual only for the purposes of that individual’s personal, family or household affairs (including recreational purposes)
Thus, if you, for instance, keep a record of your children’s medical histories on your home computer, you are not caught by any of the DPA (and not required to register with the ICO).

Where this becomes interesting (it does become interesting, honestly) is when the very expansive interpretation the ICO gives to this “domestic purposes exemption” is considered in view of the extent to which people’s domestic affairs – including recreational purposes – now take place in a more public sphere, whereby large amounts of information are happily published by individuals on social media. As I have written elsewhere, the Court of Justice of the European Union (CJEU) held in 2003, in the Lindqvist case, that the publishing of information on the internet could not be covered by the relevant domestic purposes exemption in the Directive. The ICO and the UK have, ever since, been in conflict with this CJEU authority, a point illustrated by the trenchant criticism delivered in the High Court by Tugendhat J in his judgment in The Law Society v Kordowski.

But I think there is an even starker illustration of the implications of an expansive interpretation of the section 36 exemption, and I provide it. On this blog I habitually name and discuss identifiable individuals – this is processing of personal data, and I determine the purposes for which, and the manner in which, this personal data is processed. Accordingly, I am a data controller, as defined at section 1(1) of the DPA. So, do I need to notify my processing with the ICO? The answer, according to the ICO, is “no”. They tell me

from the information you have provided it would be unlikely that you would be required to register in respect of your blogs and tweets
But I don’t understand this. I cannot see any exemption which applies to my processing – unless it is section 36. But in what way can I seriously claim that I am processing personal data only for my domestic (including recreational) purposes? Yes, blogging about information rights is partly a recreation for me (some might say that makes me odd), but I cannot pretend that I have no professional aims and purposes in doing so. Accordingly, the processing cannot be only for domestic purposes.

I have asked the ICO to confirm what, in their view, exempts me from notification. I hope they can point me to something I have overlooked, because, firstly, anything that avoids my having to pay an annual notification fee of £35 would be welcome, and secondly, I find it rather uncomfortable to be on the receiving end of my own analysis that I’m potentially committing a criminal offence, even if the lead prosecutor assures me I’m not.

The point about the notification fee leads me on to a further issue. As I say above, notification is in some ways rather quaint – it harks back to days when processing of personal data was a specific, discrete activity, and looks odd in a world where, with modern technology, millions of activities every day meet the definition of “processing personal data”. No doubt for these reasons, the concept of notification to a data protection authority is missing from the draft General Data Protection Regulation (GDPR) currently slouching its way through the European legislative process. However, a proposal by the ICO suggests that, at least in the domestic sphere, notification (in another guise) might remain under new law.

The ICO, faced with the fact that its main funding stream (the annual notification fees from those 370,000-plus data controllers) would disappear if the GDPR is passed in its proposed form, is lobbying for an “information rights levy”. Christopher Graham said earlier this year

I would have thought  an information rights levy, paid for by public authorities and data controllers [is needed]. We would be fully accountable to Parliament for our spending.

and the fact that this proposal made its way into the ICO’s Annual Report, with Graham saying that Parliament needs to “get on with the task” of establishing the levy, suggests that it might well be something the Ministry of Justice agrees with. As the MoJ would be first in line to have to make up the funding shortfall if a levy wasn’t introduced, it is not difficult to imagine it becoming a reality.

On one view, a levy makes perfect sense – a “tax” on those who process personal data. But looked at another way, it will potentially become another outmoded means of defining what a data controller is. One cannot imagine that, for instance, bloggers and other social media users will be expected to pay it, so it is likely that, in effect, those data controllers whom the ICO currently expects to notify will be those who are required to pay the levy. One imagines, also, that, pour encourager les autres, it might be made a criminal offence not to pay the levy in circumstances where a data controller should pay it but fails to do so. In reality, will it just be a mirror-image of the current notification regime?

And will I still be analysing my own blogging as being processing that belongs to that regime, but with the ICO, for pragmatic, if not legally sound, reasons, deciding the opposite?


Filed under Data Protection, Directive 95/46/EC, Europe, GDPR, parliament

Google is not a library, Dr Cavoukian

The outgoing Ontario Information and Privacy Commissioner Ann Cavoukian, whose time in office has been hugely, and globally, influential (see in particular Privacy by Design) has co-written (with Christopher Wolf) an article strongly criticising the judgment of the Court of Justice of the European Union (CJEU) in the Google Spain case.

For anyone who has been in the wilderness for the last few weeks, in Google Spain the CJEU ruled that Google Spain, as a subsidiary of Google inc. operating on Spanish territory, was covered by the obligations of the European Data Protection Directive 95/46/EC, that it was operating as an entity that processed personal data in the capacity of a data controller, and that it was accordingly required to consider applications from data subjects for removal of search returns. Thus, what is loosely called a “right to be forgotten” is seen already to exist in the current data protection regime.

Many have written on this landmark CJEU ruling (I commend in particular Dr David Erdos’s take, on the UK Constitutional Law Blog) and I am not here going to go into any great detail, but what I did take issue with in the Cavoukian and Wolf piece was the figurative comparison of Google with a public library:

A man walks into a library. He asks to see the librarian. He tells the librarian there is a book on the shelves of the library that contains truthful, historical information about his past conduct, but he says he is a changed man now and the book is no longer relevant. He insists that any reference in the library’s card catalog and electronic indexing system associating him with the book be removed, or he will go to the authorities…

…The government agent threatens to fine or jail the librarian if he does not comply with the man’s request to remove the reference to the unflattering book in the library’s indexing system.

Is this a scenario out of George Orwell’s Nineteen Eighty-Four? No, this is the logical extension of a recent ruling from Europe’s highest court

(I pause briefly to say that if I never see another reference to Orwell in the context of privacy debate I will die a happy man).

I’m fond of analogies, but Cavoukian and Wolf’s (or maybe it’s a metaphor?) is facile. I think it could more accurately say

A man walks into a library. He sees that, once again, the library has chosen, because of how it organises its profit-making activities, to give great prominence to a book which contains information about his past conduct, which is no longer relevant, and which it is unfair to highlight. He asks them to give less prominence to it.

Cavoukian and Wolf accept that there should be a right to remove “illegal defamatory” content if someone posts it online, but feel that the issue of links to “unflattering, but accurate” information should be explored using “other solutions”. (I pause again to note that “unflattering” is an odd and loaded word to use here: Mr Gonzalez, in the Google Spain case, was concerned about out-of-date information about bankruptcy, and other people who might want to exercise a right to removal of links might be concerned by much worse than “unflattering” information).

I don’t disagree that other solutions should be explored to the issue of the persistence or reemergence of old information which data subjects reasonably no longer wish to be known, but people are entitled to use the laws which exist to pursue their aims, and the application by the CJEU of data protection law to the issues pleaded was, to an extent, uncontroversial (is Google a data controller? if it is, what are its obligations to respect a request to desist from processing?)

Cavoukian and Wolf criticise the CJEU for failing to provide sufficient instruction on how “the right to be forgotten” should be applied, and for failing to consider whether “online actors other than search engines have a duty to ‘scrub’ the Internet of unflattering yet truthful facts”, but a court can only consider the issues pleaded before it, and these weren’t. Where I do agree with them is in their criticism of the apparent failure by the CJEU, when giving effect to the privacy rights in Article 8 of the European Convention on Human Rights, and Articles 7 and 8 of the Charter of Fundamental Rights of the European Union, to consider adequately, if at all, the countervailing rights to freedom of expression in Article 10 of the former and Article 11 of the latter. In this respect, the prior Opinion of the Advocate General was perhaps to be preferred.

The key word in my replacement library analogy above is “chosen”. Google is not a passive and inert indexing system. Rather, it is a dynamic and commercially-driven system which uses complex algorithms to determine which results appear against which search terms. It already exercises editorial control over results, and will remove some which it is satisfied are clearly unlawful or which constitute civil wrongs such as breach of copyright. Is it so wrong that (if it gives appropriate weight to the (sometimes) competing considerations of privacy and freedom of expression) it should be required to consider a request to remove unfair and outdated private information?




Filed under Data Protection, Directive 95/46/EC, Europe, human rights, Privacy

Ticking off Neelie Kroes (sort of)

In which I take issue with the European Commission V-P about what the Consumer Rights Directive says about pre-ticked boxes

I found myself retweeting what I think was a rather misleading message from the Vice-President of the European Commission, Neelie Kroes. Her tweet said

You know those annoying “pre-ticked boxes” on shopping/travel websites? They’re banned in #EU from today

I thought this was very interesting, particularly in light of my recent post about the implying of consent to electronic marketing if people forget to untick such boxes. The EU press release itself does say at one point

Under the new EU rules…consumers can now rely on…A ban on pre-ticked boxes on the internet, as for example when they buy plane tickets

But, it earlier says

The new rules also ban…pre-ticked boxes on websites for charging additional payments (for example when buying plane tickets online)

The emphasis I’ve added in that last quote is crucial. What DIRECTIVE 2011/83/EU OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL of 25 October 2011 on consumer rights actually proscribes is the contractual binding of a consumer to any payment in addition to the original remuneration agreed on if

the trader has not obtained the consumer’s express consent but has inferred it by using default options which the consumer is required to reject in order to avoid the additional payment

 So, as the press release explains,

When shopping online –for example when buying a plane ticket – you may be offered additional options during the purchase process, such as travel insurance or car rental. These additional services may be offered through so-called pre-ticked boxes. Consumers are currently often forced to untick those boxes if they do not want these extra services. With the new Directive, pre-ticked boxes will be banned across the European Union.

I happen to think that that text should more properly say “With the new Directive, pre-ticked boxes of this sort will be banned across the European Union”.

So, no ban on pre-ticked boxes themselves, just on those which purport to bind a consumer to an additional payment under a contract.

The Directive has been implemented in the UK by The Consumer Contracts (Information, Cancellation and Additional Charges) Regulations 2013 and the associated Enterprise Act 2002 (Part 8 EU Infringements) Order 2013, the former of which says (at regulation 40)

Under a contract between a trader and a consumer, no payment is payable in addition to the remuneration agreed for the trader’s main obligation unless, before the consumer became bound by the contract, the trader obtained the consumer’s express consent… There is no express consent (if there would otherwise be) for the purposes of this paragraph if consent is inferred from the consumer not changing a default option (such as a pre-ticked box on a website)

Having said all this, I do think it is interesting that clearly-defined concepts of “express consent” are making their way into European and domestic legislation. And in due course, we may even find that, for instance, electronic marketing will be restrained unless similarly clearly-defined express consent is given. But not just yet.

Update: Ms Kroes kindly replied to me, saying it’s difficult to get a message across in 140 characters. So true.






Filed under Data Protection, Europe, marketing, PECR

Data Protection for Baddies

Should Chris Packham’s admirable attempts to expose the cruelties of hunting in Malta be restrained by data protection law? And who is protected by the data protection exemption for journalism?

I tend sometimes to lack conviction, but one thing I am pretty clear about is that I am not on the side of people who indiscriminately shoot millions of birds, and whose spokesman tries to attack someone by mocking their well-documented mental health problems. So, when I hear that the FKNK, the Maltese “Federation for Hunting and Conservation”, has

presented a judicial protest against the [Maltese] Commissioner of Police and the Commissioner for Data Protection, for allegedly not intervening in “contemplated” or possible breaches of privacy rules

with the claim being that they have failed to take action to prevent

BBC Springwatch presenter Chris Packham [from] violating hunters’ privacy by “planning to enter hunters’ private property” and by posting his video documentary on YouTube, which would involve filming them without their consent

My first thought is that this is an outrageous attempt to manipulate European privacy and data protection laws to try to prevent legitimate scrutiny of activities which sections of society find offensive and unacceptable. It’s my first thought, and my lasting one, but it does throw some interesting light on how such laws can potentially be used to advance or support causes which might not be morally or ethically attractive. (Thus it was that, in 2009, a former BNP member was prosecuted under section 55 of the UK Data Protection Act 1998 (DPA 1998) for publishing a list of party members on the internet. Those members, however reprehensible their views or actions, had had their sensitive personal data unlawfully processed, and attracted the protection of the DPA (although the derisory £200 fine the offender received barely served as a deterrent)).

I do not profess to being an expert in Maltese data protection law, but, as a member state of the European Union, Malta was obliged to implement Directive 95/46/EC on the Protection of Individuals with regard to the Processing of Personal Data (which it did in its Data Protection Act of 2001). The Directive is the bedrock of all European data protection law, generally containing minimum standards which member states must implement in domestic law, but often allowing them to legislate beyond those minimum standards.

It may well be that the activities of Chris Packham et al do engage Maltese data protection law. In fact, if, for instance, film footage or other information which identifies individuals is recorded and broadcast in other countries in the European Union, it would be likely to constitute an act of “processing” under Article 2(b) of the Directive which would engage data protection law in whichever member state it was processed.

Data protection law at European level has a scope whose potential breadth has been described as “breath-taking”. “Personal data” is “any information relating to an identified or identifiable natural person” (that is “one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity”), and “processing” encompasses “any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction”.

However, the broad scope does not necessarily mean broad prohibitions on activities involving processing. Personal data must be processed “fairly and lawfully”, and can (broadly) be processed without the data subject’s consent in circumstances where there is a legal obligation to do so, or where it is necessary in the public interest, or where the legitimate interests of the person processing it, or of a third party, outweigh the interests for fundamental rights and freedoms of the data subject. These legitimising conditions are implemented into the Maltese Data Protection Act 2001 (at section 9), so it can be seen that the FKNK’s claim that Packham requires the hunters’ consent to film might not have legs.

Moreover, Article 9 of the Directive, transposed in part at section 6 of the 2001 Maltese Act, provides for an exemption from most of the general data protection obligations where the processing is for journalistic purposes, which would almost certainly be engaged by Packham’s activities. Whether, however, any other Maltese laws might apply is, I’m afraid, well outside my area of knowledge.

But what about activists who might not normally operate under the banner of “journalism”? What if Packham were, rather than a BBC journalist/presenter, “only” a naturalist? Would he be able to claim the journalistic data protection exemption?

Some of these sorts of issues are currently edging towards trial in litigation brought in the UK, under the DPA 1998, by a mining corporation (or, in its own words, a “diversified natural resources business”), BSG Resources, against Global Witness, an NGO one of whose stated goals is to “expose the corrupt exploitation of natural resources and international trade systems”. BSGR’s claims are several, but are all made under the DPA 1998, and derive from the fact they have sought to make subject access requests to Global Witness to know what personal data of the BSGR claimants is being processed, for what purposes and to whom it is being or may be disclosed. Notably, BSGR have chosen to upload their grounds of claim for all to see. For more background on this see the ever-excellent Panopticon blog, and this article in The Economist.

This strikes me as a potentially hugely significant case, firstly because it illustrates how data protection is increasingly being used to litigate matters more traditionally seen as being in the area of defamation law, or the tort of misuse of private information, but secondly because it goes to the heart of questions about what journalism is, who journalists are and what legal protection (and obligations) those who don’t fit the traditional model/definition of journalism have or can claim.

I plan to blog in more detail on this case in due course, but for the time being I want to make an observation. Those who know me will not have too much trouble guessing on whose side my sympathies would tend to fall in the BSGR/Global Witness litigation, but I am not so sure how I would feel about extending journalism privileges to, say, an extremist group who were researching the activities of their opponents with a view to publishing those opponents’ (sensitive) personal data on the internet. If society wishes to extend the scope of protection traditionally afforded to journalists to political activists, or citizen bloggers, or tweeters, it needs to be very careful that it understands the implications of doing so. Freedom of expression and privacy rights coexist in a complex relationship, which ideally should be an evenly balanced one. Restricting the scope of data protection law, by extending the scope of the exemption for journalistic activities, could upset that balance.


Filed under Data Protection, Europe, human rights, journalism, Privacy, Uncategorized

The leaflet campaign – legally necessary?

Readers of this blog [sometimes I imagine them1] may well be fed up with posts about care.data (see here, here and here). But this is my blog and I’ll cry if I want to. So…

Doyen of information rights bloggers, Tim Turner, has written in customary analytic detail on how the current NHS leafleting campaign was not necessitated by data protection law, and on how, despite some indications to the contrary, GPs will not be in the Information Commissioner’s firing line if they fail adequately to inform patients about what will be happening to their medical data.

He’s right, of course: where a data controller is subject to a legal obligation to disclose personal data (other than under a contract) then it is not obliged, pace the otherwise very informative blogpost by the Information Commissioner’s Dawn Monaghan, to give data subjects a privacy, or fair processing notice.

(In passing, and in an attempt to outnerd the unoutnerdable, I would point out that Tim omits that, by virtue of The Data Protection (Conditions under Paragraph 3 of Part II of Schedule 1) Order 2000, if a data subject properly requests a privacy notice in circumstances where a data controller is subject to a legal obligation to disclose personal data (other than under a contract) and would, thus, otherwise not be required to issue one, the data controller must comply2.)

Tim says, though

The leaflet drop is no way to inform people about such a significant step, but I don’t think it is required

That appears to be true under data protection law. But under the broader obligations imposed on the relevant authorities by Article 8 of the European Convention on Human Rights (ECHR), as incorporated in domestic law by the Human Rights Act 1998, it might not be so. (And here, unlike with data protection law, we don’t have to consider the rigid controller/processor dichotomy in order to decide who the relevant, and liable, public authority is: I would suggest that NHS England (as the “owner of the programme”, in Dawn Monaghan’s words) seems the obvious candidate, but GPs might also be caught.)

In 1997 the European Court of Human Rights addressed the very-long-standing concept of the confidentiality of doctor-patient relations, in the context of personal medical data, in Z v Finland (1997) 25 EHRR 371, and said

the Court will take into account that the protection of personal data, not least medical data, is of fundamental importance to a person’s enjoyment of his or her right to respect for private and family life as guaranteed by Article 8 of the Convention (art. 8). Respecting the confidentiality of health data is a vital principle in the legal systems of all the Contracting Parties to the Convention. It is crucial not only to respect the sense of privacy of a patient but also to preserve his or her confidence in the medical profession and in the health services in general…Without such protection, those in need of medical assistance may be deterred from revealing such information of a personal and intimate nature as may be necessary in order to receive appropriate treatment and, even, from seeking such assistance, thereby endangering their own health and, in the case of transmissible diseases, that of the community

This, I think, nicely encapsulates why so many good and deep-thinking people have fundamental concerns about care.data.

Now, I am not a lawyer, let alone a human rights lawyer, but it does occur to me that a failure to inform patients about what would be happening with their confidential medical records when GPs were required to upload them, and a failure to allow them to opt out, would have potentially infringed patients’ Article 8 rights. We should not forget that, initially, there was no intention to inform patients at all (there had been no attempt to inform patients about the similar upload of hospital medical data, which has been going on for over twenty years). It is, surely, possible therefore, that NHS England is not just “helping” GPs to inform patients without having any responsibility to do so (as Dawn Monaghan suggests), but that it recognises its potential vulnerability to an Article 8 challenge, and is trying to avoid or mitigate this. Whether the leaflets themselves, and the campaign to deliver them, are adequate to achieve this aim is another matter. As has been noted, the leaflet contains no opt-out form, and there seem to be numerous examples of people (often vulnerable people, for instance in care homes, or refuges) who will have little or no chance of receiving a copy.

At the launch of the tireless MedConfidential campaign last year, Shami Chakrabarti, of Liberty, spoke passionately about the potential human rights vulnerabilities of the programme. Notifying patients of what is proposed might not have been necessary under data protection law, but it is quite possible that the ECHR aspect of doing so was one of the things on which the Health and Social Care Information Centre (HSCIC) has been legally advised. Someone made an FOI request for this advice last year, and it is notable that HSCIC seem never to have completed their response to the request.

1I make no apologies for linking to one of Larkin’s most beautiful, but typically bleak and dystopian, pieces of prose, but I would add that it finishes “…These have I tried to remind of the excitement of jazz, and tell where it may still be found.”

2Unless the data controller does not have sufficient information about the individual in order readily to determine whether he is processing personal data about that individual, in which case the data controller shall send to the individual a written notice stating that he cannot provide the requisite information because of his inability to make that determination, and explaining the reasons for that inability


Filed under care.data, Confidentiality, Data Protection, data sharing, Europe, human rights, Information Commissioner, NHS, Privacy

Will there be blood?

The First-tier Tribunal (Information Rights) (FTT) has overturned a decision by the Information Commissioner that the Northern Ireland Department for Health, Social Services and Public Safety (DHSSPS) should disclose advice received by the Minister of that Department from the Attorney General for Northern Ireland regarding a policy of insisting on a lifetime ban on males who have had sex with other males (“MSM”) donating blood.

On 11 October 2013 the Northern Ireland High Court handed down judgment in a judicial review application challenging the decision of the Minister and the DHSSPS to maintain the lifetime ban. The challenge arose because, in 2011, across the rest of the UK, the blanket ban which had existed since 1985 had been lifted.

DHSSPS lost the judicial review case, and lost relatively heavily: the decision of the Minister was unlawful for the reasons that i) the Secretary of State, and not the Minister, by virtue of designation under the Blood Safety and Quality Regulations 2005, was responsible for deciding whether or not to maintain the lifetime ban, ii) similarly, as (European) Community law dictated that this was a reserved matter (an area of government policy where the UK Parliament keeps the power to legislate in Scotland, Northern Ireland and Wales), the decision was an act which was incompatible with Community law, iii) the Minister had taken a decision in breach of the Ministerial Code, by failing to refer the matter, under Section 20 of the Northern Ireland Act 1998, to the Executive Committee, and iv) although a ban in itself might have been defensible, the fact that blood was then imported from the rest of the UK (where the ban had been lifted) rendered the decision irrational.

Running almost concurrently with the judicial review proceedings was a request, made under the Freedom of Information Act 2000 (FOIA), for advice given to the Minister by the Attorney General for Northern Ireland. The FOIA exemption, at section 42, for information covered by legal professional privilege (LPP) was thus engaged. The original decision notice by the Information Commissioner had rather surprisingly found that it was advice privilege, as opposed to litigation privilege. The IC correctly observed that for litigation privilege to apply

at the time of the creation of the information, there must have been a real prospect or likelihood of litigation occurring, rather than just a fear or possibility

and, because the information was dated October 2011, and leave for judicial review had not been sought until December 2011

at the time the information was created, litigation was nothing more than a possibility

But one questions whether this can be correct, when one learns from the FTT judgment that DHSSPS had been sent a pre-action protocol letter on 27 September 2011. Again rather surprisingly, though, the FTT does not appear to have made a clear decision one way or the other which type of privilege applied, but its observation that

when the request was made judicial review proceedings…were already underway

would imply that they disagreed with the IC.

This discrepancy might lie behind the fact that the FTT afforded greater weight to the public interest in favour of maintaining the exemption. It was observed that

[the existence of the proceedings] at the time of the request seems to us to be an additional specific factor in favour of maintaining the exemption. It seems unfair that a public authority engaged in litigation should have a unilateral duty to disclose its legal advice [para 19]

Additionally, the fact that the advice was sought after the decision had been taken meant that it could give “no guide to the Minister’s motives or reasoning”.

Ultimately – and this is suggestive that the issue was finely balanced – it was the well-established inherent public interest in the maintenance of LPP which prevailed (para 21). This was a factor of “general importance” as found in a number of cases summarised by the Upper Tribunal in DCLG v The Information Commissioner and WR [2012] UKUT 103 (AAC).

Because the appeal succeeded on the grounds that the section 42 exemption applied, the FTT did not go on to consider the other exemptions pleaded by DHSSPS and the Attorney General – sections 35(1)(a) and 35(1)(c), although it was very likely that the latter at least would have also applied.

Aggregation of public interest factors

Because the other exemptions did not come into play, the FTT’s observations on the IC’s approach to public interest factors where more than one exemption applies are strictly obiter, but they are important nonetheless. As all good Information Rights people know, the European Court of Justice ruled in 2011 that, when more than one exception applies to disclosure of information under the Environmental Information Regulations 2004 (EIR), the public authority may (not must) weigh the public interest in disclosure against the aggregated weight of the public interest arguments for maintaining all the exceptions. The IC does not accept that this aggregation approach extends to FOIA, however (see para 73 of his EIR exceptions guidance), and this was reflected in his decision notice in this matter, which considered separately the public interest balance in respect of the two exemptions he took into account. He invited the FTT to take the same approach, but, said the FTT, had the need arisen, the IC would have needed to justify how this “piecemeal approach” tallied with the requirement at section 2(2)(b) of FOIA to consider “all the circumstances of the case”. Moreover, the effect of the IC’s differing approaches under the EIR and FOIA means that

there will be a large number of cases in which public authorities, the ICO and the Tribunal will be required to make a sometimes difficult decision about which disclosure regime applies in order to find out how to conduct the public interest balancing exercise

I am not aware of anywhere that the IC has explained his reasoning that aggregation does not apply in FOIA, and it would be helpful to know, before the matter becomes litigated (as it surely will).

And I will just end this rather long and abstruse piece with two personal observations. Firstly, donating blood is simple, painless and unarguably betters society – anyone who can, should donate. Secondly, denying gay men the ability, in this way, to contribute to this betterment of society is absurd, illogical and smacks of bigotry.


Filed under Environmental Information Regulations, Europe, Freedom of Information, Information Commissioner, Information Tribunal, Upper Tribunal