Another post by me on the Mishcon de Reya website – federal telecoms regulator issues fine for Article 32 failings after callers could give customer name and d.o.b. and obtain further information.
Category Archives: Europe
I have a new post on the Mishcon de Reya website, drawing attention to a change from draft to agreed EDPB guidance which might make being a GDPR representative much more attractive.
Wired’s Matt Burgess has written recently about the rise of fake pornography created using artificial intelligence software, something that I didn’t know existed (and now rather wish I hadn’t found out about):
A small community on Reddit has created and fine-tuned a desktop application that uses machine learning to morph non-sexual photos and transplant them seamlessly into pornographic videos.
The FacesApp, created by Reddit user DeepFakesApp, uses fairly rudimental machine learning technology to graft a face onto still frames of a video and string a whole clip together. To date, most creations are short videos of high-profile female actors.
The piece goes on to discuss the various potential legal restrictions or remedies which might be available to prevent or remove content created this way. Specifically within a UK context, Matt quotes lawyer Max Campbell:
“It may amount to harassment or a malicious communication,” he explains. “Equally, the civil courts recognise a concept of ‘false privacy’, that is to say, information which is false, but which is nevertheless private in nature.” There are also copyright issues for the re-use of images and video that wasn’t created by a person.
However, what I think this analysis misses is that the manipulation of digital images of identifiable individuals lands this sort of sordid practice squarely in the field of data protection. Data protection law relates to “personal data” – information relating to an identifiable person – and “processing” thereof. “Processing” is (inter alia)
any operation…which is performed upon personal data, whether or not by automatic means, such as…adaptation or alteration…disclosure by transmission, dissemination or otherwise making available…
That pretty much seems to encapsulate the activities being undertaken here. The people making these videos would be considered data controllers (persons who determine the purposes and means of the processing), and subject to data protection law, with the caveat that, currently, European data protection law, as a matter of general principle, only applies to processing undertaken by controllers established in the European Union. (In passing, I would note that the exemption for processing done in the course of a purely personal or household activity would not apply to the extent that the videos are being distributed and otherwise made public).
Personal data must be processed “fairly”, and, as a matter of blinding obviousness, it is hard to see any way in which the processing here could conceivably be fair.
Whether victims of this odious sort of behaviour will find it easy to assert their rights, or bring claims, against the creators is another matter. But it does seem to me to be the case here, unlike in some other cases, that (within a European context/jurisdiction) data protection law potentially provides a primary initial means of confronting the behaviour.
The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
The politics.co.uk site reports that an anti-EU umbrella campaign called Leave.EU (or is it theknow.eu?) has been written to by the Information Commissioner’s Office (ICO) after it allegedly sent unsolicited emails to people who appear to have been “signed up” by friends or family. The campaign’s bank-roller, UKIP donor Aaron Banks, reportedly said
We have 70,000 people registered and people have been asked to supply 10 emails of friends or family to build out (sic) database
Emails sent to those signed up in this way are highly likely to have been sent in breach of the campaign’s obligations under the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), and the ICO is reported to have written to the campaign to
inform them of their obligations under the PECR and to ask them to suppress [the recipient’s] email address from their databases
But is this really the main concern here? Or, rather, should we (and the ICO) be asking what on earth is a political campaign doing building a huge database of people, and identifying them as (potential) supporters without their knowledge? Such concerns go to the very heart of modern privacy and data protection law.
Data protection law’s genesis lies, in part, in the desire, post-war, of European nations to ensure “a foundation of justice and peace in the world”, as the preamble to the European Convention on Human Rights states. The first recital to the European Community Data Protection Directive of 1995 makes clear the importance of those fundamental rights to data protection law.
The Directive is, of course, given domestic effect by the Data Protection Act 1998 (DPA). Section 2 of the same states that information as to someone’s political beliefs is her personal data: I would submit that presence on a database purporting to show that someone supports the UK’s withdrawal from the European Union is also her personal data. Placing someone on that database, without her knowledge or ability to object, will be manifestly “unfair” when it comes to compliance with the first data protection principle. It may also be inaccurate, when it comes to compliance with the fourth principle.
I would urge the ICO to look much more closely at this – the compiling of (query inaccurate) secret databases of people’s political opinions has very scary antecedents.
If you search on Google for my name, Jon Baines, or the full version, Jonathan Baines, you see, at the foot of the page of search results
Some results may have been removed under data protection law in Europe. Learn more
Oh-ho! What have I been up to recently? Well, not much really, and certainly nothing that might have led to results being removed under data protection law. Nor, similarly, have John Keats, Eleanor Roosevelt and Nigel Molesworth (to pick a few names at random), a search on all of whose names brings up the same message. And, of course, if you click the hyperlink marked by the words “Learn more” you find out in fact that Google has simply set its algorithms to display the message in Europe
when a user searches for most names, not just pages that have been affected by a removal.
It is a political gesture – one that reflects Google’s continuing annoyance at the 2014 decision – now forever known as “Google Spain” – of the Court of Justice of the European Union which established that Google is a data controller for the purpose of search returns containing personal data, and that it must consider requests from data subjects for removal of such personal data. A great deal has been written about this, some bad and some good (a lot of the latter contained in the repository compiled by Julia Powles and Rebekah Larsen) and I’m not going to try to add to that, but what I have noticed is that a lot of people see this “some results may have been removed” message, and become suspicious. For instance, this morning, I noticed someone tweeting to the effect that the message had come up on a search for “Chuka Umunna”, and their supposition was that this must relate to something which would explain Mr Umunna’s decision to withdraw from the contest for leadership of the Labour Party. A search on Twitter for “some results may have” returns a seething mass of suspicion and speculation.
Google is conducting an unnecessary exercise in innuendo. It could easily rephrase the message (“With any search term there is a possibility that some results may have been removed…”) but chooses not to do so, no doubt because it wants to undermine the effect of the CJEU’s ruling. It’s shoddy, and it drags wholly innocent people into its disagreement.
Furthermore, there is an argument that the exercise could be defamatory. I am not a lawyer, let alone a defamation lawyer, so I will leave it to others to consider that argument. However, I do know a bit about data protection, and it strikes me that, following Google Spain, Google is acting as a data controller when it processes a search on my name, and displays a list of results with the offending “some results may have been removed” message. As a data controller it has obligations, under European law (and UK law), to process my personal data “fairly and lawfully”. It is manifestly unfair, as well as wrong, to insinuate that information relating to me might have been removed under data protection law. Accordingly, I’ve written to Google, asking for the message to be removed:
Google UK Ltd
76 Buckingham Palace Road
London SW1W 9TQ
16 May 2015
Complaint under Data Protection Act 1998
When a search is made on Google for my name “Jonathan Baines”, and, alternatively, “Jon Baines”, a series of results are returned, but at the foot of the page a message (“the message”) is displayed:
Some results may have been removed under data protection law in Europe. Learn more
To the best of my knowledge, no results have in fact been removed.
The first principle in Schedule One of the Data Protection Act 1998 (DPA) requires a data controller to process personal data fairly and lawfully. In the circumstances I describe, “Jonathan Baines”, “Jon Baines” and the message constitute my personal data, of which you are clearly data controller.
It is unfair to suggest that some results may have been removed under data protection law. This is because the message carries an innuendo that what may have been removed was content that was embarrassing, or that I did not wish to be returned by a Google search. This is not the case. I do not consider that the hyperlink “Learn more” nullifies the innuendo: for instance, a search on Twitter for the phrase “some results may have been removed” provides multiple examples of people assuming the message carries an innuendo meaning.
Accordingly, please remove the message from any page containing the results of a search on my name Jonathan Baines, or Jon Baines, and please confirm to me that you have done so. You are welcome to email me to this effect at [REDACTED]
With best wishes,
The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with. Some words may have been removed under data protection law.
In 2010 the Court of Justice of the European Union (CJEU) held that, insofar as they required the automatic publication of the name and other particulars of natural persons (as opposed to legal persons) of beneficiaries of funds deriving from the European Agricultural Guarantee Fund (EAGF) and the European Agricultural Fund for Rural Development (EAFRD), certain articles of European Council Regulation (EC) No 1290/2005 of 21 June 2005 on the financing of the common agricultural policy were invalid. This was because they imposed an obligation to publish personal data relating to these beneficiaries (who might be private individuals or sole traders) without permitting criteria such as the periods, frequency and amounts involved to be considered.
Rip-roaring start to a blog post eh?
In the words of the First-tier Tribunal (Information Rights) (FTT) which has recently had to consider the impact of those CJEU cases on an Environmental Information Regulations 2004 (EIR) case
[the CJEU] ruled that such a requirement for publication was incompatible with an individual’s right for privacy where the agreement holder concerned was a private individual or sole trader
The relevance of the European judgments was that Natural England, which had until 2010 published information about beneficiaries of funds granted to farmers and landowners under the European Stewardship Agreement (ESA), even when it consisted of personal data of private individual or sole trader beneficiaries, ceased such automatic publication and removed previously published information from its website. This was despite the fact that applicants for an ESA had, until 2010, been given a privacy notice in a handbook which explained that the information would be published, and had signed a declaration accepting the requirements.
Notwithstanding this, when it received a request for agreements reached with farmers and landowners in the River Avon flood plains area, Natural England decided that the personal data of the beneficiary (there appears to have just been one) was exempt from disclosure under regulations 12(3) and 13 of the EIR (which broadly provide an exception to the general obligation under the EIR to disclose information if the information in question is personal data disclosure of which would be in breach of the public authority’s obligations under the Data Protection Act 1998 (DPA)).
The Information Commissioner’s Office had agreed, saying
although consent for disclosure has been obtained [by virtue of the applicant’s declaration of acceptance of the handbook’s privacy notice], circumstances have changed since that consent was obtained. As Natural England’s current practice is not to publish the names of those who have received grants with the amounts received, the Commissioner is satisfied that the expectation of the individuals concerned will be that their names and payments will not be made public.
However, the FTT was not convinced by this. Although it accepted that it was possible “that the applicant no longer expected the relevant personal data to be disclosed” it considered whether this would nevertheless be a reasonable expectation, and it also took into account that the effect of the CJEU’s decision had not been expressly to prohibit disclosure (but rather that the validity of automatic publication had been struck down):
When one combined the facts that an express consent had been given, that there had been no publicity by NE or mention on its website of the ECJ decision and finally, that the effect of that decision had not, in the event been to prohibit disclosure, [the FTT] concluded that such an expectation would not be reasonable
Furthermore, given that there was no real evidence that disclosure would cause prejudice or distress to the applicant, given that some identifying information had already been disclosed into the public domain and given that there was a legitimate interest – namely “accountability in the spending of public monies” – in the information being made public (and disclosure was necessary to meet this legitimate interest) the disclosure was both fair and supported by a permitting condition in Schedule 2 of the DPA. For these reasons, disclosure would not, said the FTT, breach Natural England’s obligation to process personal data fairly under the first data protection principle.
So maybe not the most ground-breaking of cases, but it is relatively rare that an FTT disagrees with the ICO and orders disclosure of personal data under the EIR (or FOI). The latter is, after all, the statutory regulator of the DPA, and its views on such matters will normally be afforded considerable weight by any subsequent appellate body.
A strict reading of data protection law suggests many (if not all) bloggers should register with the ICO, even though the latter disagrees. And, I argue, the proposal for an Information Rights Levy runs the risk of being notification under a different name
Part III of the Data Protection Act 1998 (DPA) gives domestic effect to Article 18 of the European Data Protection Directive (the Directive). It describes the requirement that data controllers notify the fact that they are processing personal data, and the details of that processing, to the Information Commissioner’s Office (ICO). It is, on one view, a rather quaint throwback to the days when processing of personal data was seen as an activity undertaken by computer bureaux (a term found in the predecessor Data Protection Act 1984). However, it is law which is very much in force, and processing personal data without a valid notification, in circumstances where the data controller had an obligation to notify, is a criminal offence (section 21(1) DPA). Moreover, it is an offence which is regularly prosecuted by the ICO (eleven such prosecutions so far this year).
These days, it is remarkably easy to find oneself in the position of being a data controller (“a person who (either alone or jointly or in common with other persons) determines the purposes for which and the manner in which any personal data are, or are to be, processed”). There are, according to the ICO, more than 370,000 data controllers registered. Certainly, if you are a commercial enterprise which in any way electronically handles personal data of customers or clients it is almost inevitable that you will be a data controller with an obligation to register. The exemptions to registering are laid out in regulations, and are quite restrictive – they are, in the main, the following (wording taken from the ICO Notification Handbook)
Data controllers who only process personal information for: staff administration (including payroll); advertising, marketing and public relations (in connection with their own business activity); and accounts and records.
Some not-for-profit organisations.
Maintenance of a public register.
Processing personal information for judicial functions.
Processing personal information without an automated system such as a computer.
Personal information processed by an individual only for the purposes of that individual’s personal, family or household affairs (including recreational purposes).
But I think there is an even starker illustration of the implications of an expansive interpretation of the section 36 exemption, and I can provide it. On this blog I habitually name and discuss identifiable individuals – this is processing of personal data, and I determine the purposes for which, and the manner in which, this personal data is processed. Accordingly, I become a data controller, according to the definitions at section 1(1) of the DPA. So, do I need to notify my processing with the ICO? The answer, according to the ICO, is “no”. They tell me
from the information you have provided it would be unlikely that you would be required to register in respect of your blogs and tweets
The point about the notification fee leads me on to a further issue. As I say above, notification is in some ways rather quaint – it harks back to days when processing of personal data was a specific, discrete activity, and looks odd in a world where, with modern technology, millions of activities every day meet the definition of “processing personal data”. No doubt for these reasons, the concept of notification with a data protection authority is missing from the draft General Data Protection Regulation (GDPR) currently slouching its way through the European legislative process. However, a proposal by the ICO suggests that, at least in the domestic sphere, notification (in another guise) might remain under new law. The ICO, faced with the fact that its main funding stream (the annual notification fees from those 370,000-plus data controllers) would disappear if the GDPR is passed in its proposed form, is lobbying for an “information rights levy”. Christopher Graham said earlier this year
I would have thought an information rights levy, paid for by public authorities and data controllers [is needed]. We would be fully accountable to Parliament for our spending.
and the fact that this proposal made its way into the ICO’s Annual Report, with Graham saying that Parliament needs to “get on with the task” of establishing the levy, suggests that it might well be something the Ministry of Justice agrees with. As the MoJ would be first in line to have to make up the funding shortfall if a levy wasn’t introduced, it is not difficult to imagine it becoming a reality.
On one view, a levy makes perfect sense – a “tax” on those who process personal data. But looked at another way, it will potentially become another outmoded means of defining what a data controller is. One cannot imagine that, for instance, bloggers and other social media users will be expected to pay it, so it is likely that, in effect, those data controllers whom the ICO currently expects to notify will be those who are required to pay the levy. One imagines, also, that pour encourager les autres, it might be made a criminal offence not to pay the levy in circumstances where a data controller should pay it but fails to do so. In reality, will it just be a mirror-image of the current notification regime?
And will I still be analysing my own blogging as being processing that belongs to that regime, but with the ICO, for pragmatic, if not legally sound, reasons, deciding the opposite?
The outgoing Ontario Information and Privacy Commissioner Ann Cavoukian, whose time in office has been hugely, and globally, influential (see in particular Privacy by Design) has co-written (with Christopher Wolf) an article strongly criticising the judgment of the Court of Justice of the European Union (CJEU) in the Google Spain case.
For anyone who has been in the wilderness for the last few weeks, in Google Spain the CJEU ruled that Google Spain, as a subsidiary of Google inc. operating on Spanish territory, was covered by the obligations of the European Data Protection Directive 95/46/EC, that it was operating as an entity that processed personal data in the capacity of a data controller, and that it was accordingly required to consider applications from data subjects for removal of search returns. Thus, what is loosely called a “right to be forgotten” is seen already to exist in the current data protection regime.
Many have written on this landmark CJEU ruling (I commend in particular Dr David Erdos’s take, on the UK Constitutional Law Blog) and I am not here going to go into any great detail, but what I did take issue with in the Cavoukian and Wolf piece was the figurative comparison of Google with a public library:
A man walks into a library. He asks to see the librarian. He tells the librarian there is a book on the shelves of the library that contains truthful, historical information about his past conduct, but he says he is a changed man now and the book is no longer relevant. He insists that any reference in the library’s card catalog and electronic indexing system associating him with the book be removed, or he will go to the authorities…
…The government agent threatens to fine or jail the librarian if he does not comply with the man’s request to remove the reference to the unflattering book in the library’s indexing system.
Is this a scenario out of George Orwell’s Nineteen Eighty-Four? No, this is the logical extension of a recent ruling from Europe’s highest court
(I pause briefly to say that if I never see another reference to Orwell in the context of privacy debate I will die a happy man).
I’m fond of analogies, but Cavoukian and Wolf’s (or maybe it’s a metaphor?) is facile. I think it could more accurately read
A man walks into a library. He sees that, once again, the library has chosen, because of how it organises its profit-making activities, to give great prominence to a book which contains information about his past conduct, which is no longer relevant, and which it is unfair to highlight. He asks them to give less prominence to it.
Cavoukian and Wolf accept that there should be a right to remove “illegal defamatory” content if someone posts it online, but feel that the issue of links to “unflattering, but accurate” information should be explored using “other solutions”. (I pause again to note that “unflattering” is an odd and loaded word to use here: Mr Gonzalez, in the Google Spain case, was concerned about out-of-date information about bankruptcy, and other people who might want to exercise a right to removal of links might be concerned by much worse than “unflattering” information).
I don’t disagree that other solutions should be explored to the issue of the persistence or reemergence of old information which data subjects reasonably no longer wish to be known, but people are entitled to use the laws which exist to pursue their aims, and the application by the CJEU of data protection law to the issues pleaded was, to an extent, uncontroversial (is Google a data controller? if it is, what are its obligations to respect a request to desist from processing?)
Cavoukian and Wolf criticise the CJEU for failing to provide sufficient instruction on how “the right to be forgotten” should be applied, and for failing to consider whether “online actors other than search engines have a duty to ‘scrub’ the Internet of unflattering yet truthful facts”, but a court can only consider the issues pleaded before it, and these weren’t. Where I do agree with them is in their criticism of the apparent failure by the CJEU, when giving effect to the privacy rights in Article 8 of the European Convention on Human Rights, and Articles 7 and 8 of the Charter of Fundamental Rights of the European Union, to consider adequately, if at all, the countervailing rights to freedom of expression in Article 10 of the former and Article 11 of the latter. In this respect, the prior Opinion of the Advocate General was perhaps to be preferred.
The key word in my replacement library analogy above is “chosen”. Google is not a passive and inert indexing system. Rather, it is a dynamic and commercially-driven system which uses complex algorithms to determine which results appear against which search terms. It already exercises editorial control over results, and will remove some which it is satisfied are clearly unlawful or which constitute civil wrongs such as breach of copyright. Is it so wrong that (if it gives appropriate weight to the (sometimes) competing considerations of privacy and freedom of expression) it should be required to consider a request to remove unfair and outdated private information?
Should Chris Packham’s admirable attempts to expose the cruelties of hunting in Malta be restrained by data protection law? And who is protected by the data protection exemption for journalism?
I tend sometimes to lack conviction, but one thing I am pretty clear about is that I am not on the side of people who indiscriminately shoot millions of birds, and whose spokesman tries to attack someone by mocking their well-documented mental health problems. So, when I hear that the FNKF, the Maltese “Federation for Hunting and Conservation” has
presented a judicial protest against the [Maltese] Commissioner of Police and the Commissioner for Data Protection, for allegedly not intervening in “contemplated” or possible breaches of privacy rules
with the claim being that they have failed to take action to prevent
BBC Springwatch presenter Chris Packham [from] violating hunters’ privacy by “planning to enter hunters’ private property” and by posting his video documentary on YouTube, which would involve filming them without their consent
my first thought is that this is an outrageous attempt to manipulate European privacy and data protection laws to try to prevent legitimate scrutiny of activities which sections of society find offensive and unacceptable. It’s my first thought, and my lasting one, but it does throw some interesting light on how such laws can potentially be used to advance or support causes which might not be morally or ethically attractive. (Thus it was that, in 2009, a former BNP member was prosecuted under section 55 of the UK Data Protection Act 1998 (DPA 1998) for publishing a list of party members on the internet. Those members, however reprehensible their views or actions, had had their sensitive personal data unlawfully processed, and attracted the protection of the DPA (although the derisory £200 fine the offender received barely served as a deterrent)).
I do not profess to being an expert in Maltese data protection law, but, as a member state of the European Union, Malta was obliged to implement Directive 95/46/EC on the Protection of Individuals with regard to the Processing of Personal Data (which it did in its Data Protection Act of 2001). The Directive is the bedrock of all European data protection law, generally containing minimum standards which member states must implement in domestic law, but often allowing them to legislate beyond those minimum standards.
It may well be that the activities of Chris Packham et al do engage Maltese data protection law. In fact, if, for instance, film footage or other information which identifies individuals is recorded and broadcast in other countries in the European Union, it would be likely to constitute an act of “processing” under Article 2(b) of the Directive which would engage data protection law in whichever member state it was processed.
Data protection law at European level has a scope whose potential breadth has been described as “breath-taking”. “Personal data” is “any information relating to an identified or identifiable natural person” (that is “one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity”), and “processing” encompasses “any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction”.
However, the broad scope does not necessarily mean broad prohibitions on activities involving processing. Personal data must be processed “fairly and lawfully”, and can (broadly) be processed without the data subject’s consent in circumstances where there is a legal obligation to do so, or where it is necessary in the public interest, or where the legitimate interests of the person processing it, or of a third party, are not overridden by the interests or fundamental rights and freedoms of the data subject. These legitimising conditions are implemented into the Maltese Data Protection Act 2001 (at section 9), so it can be seen that the FNKF’s claim that Packham requires the hunters’ consent to film might not have legs.
Moreover, Article 9 of the Directive, transposed in part at section 6 of the 2001 Maltese Act, provides for an exemption from most of the general data protection obligations where the processing is for journalistic purposes, which would almost certainly be engaged for Packham’s activities. Whether, however, any other Maltese laws might apply is, I’m afraid, well outside my area of knowledge.
But what about activists who might not normally operate under the banner of “journalism”? What if Packham were, rather than a BBC journalist/presenter, “only” a naturalist? Would he be able to claim the journalistic data protection exemption?
Some of these sorts of issues are currently edging towards trial in litigation brought in the UK, under the DPA 1998, by a mining corporation (or, in its own words, a “diversified natural resources business”), BSG Resources, against Global Witness, an NGO one of whose stated goals is to “expose the corrupt exploitation of natural resources and international trade systems”. BSGR’s claims are several, but are all made under the DPA 1998, and derive from the fact they have sought to make subject access requests to Global Witness to know what personal data of the BSGR claimants is being processed, for what purposes and to whom it is being or may be disclosed. Notably, BSGR have chosen to upload their grounds of claim for all to see. For more background on this see the ever-excellent Panopticon blog, and this article in The Economist.
This strikes me as a potentially hugely significant case, firstly because it illustrates how data protection is increasingly being used to litigate matters more traditionally seen as being in the area of defamation law, or the tort of misuse of private information, but secondly because it goes to the heart of questions about what journalism is, who journalists are and what legal protection (and obligations) those who don’t fit the traditional model/definition of journalism have or can claim.
I plan to blog in more detail on this case in due course, but for the time being I want to make an observation. Those who know me will not have too much trouble guessing on whose side my sympathies would tend to fall in the BSGR/Global Witness litigation, but I am not so sure how I would feel about extending journalism privileges to, say, an extremist group who were researching the activities of their opponents with a view to publishing those opponents’ (sensitive) personal data on the internet. If society wishes to extend the scope of protection traditionally afforded to journalists to political activists, or citizen bloggers, or tweeters, it needs to be very careful that it understands the implications of doing so. Freedom of expression and privacy rights coexist in a complex relationship, which ideally should be an evenly balanced one. Restricting the scope of data protection law, by extending the scope of the exemption for journalistic activities, could upset that balance.