Category Archives: Europe

Schrems II – what now?

A piece I have written with my Mishcon colleague Adam Rose, looking at the issues for businesses involved in international transfers (esp. to the US).

Make no mistake – the effect of Schrems II is to make bulk/regular transfers of personal data to the US problematic (putting it at its lowest). It arguably has the same effect in respect of transfers to most, if not all, third countries.

Leave a comment

Filed under adequacy, Data Protection, data security, Europe, facebook, GDPR, Information Commissioner, national security, privacy shield

Schrems II – this time it’s serious

As soon as judgment came out, my Mishcon de Reya colleague Adam Rose and I recorded our initial reactions to the CJEU’s decision in Schrems II. Here’s the link to the recording. Excuse my lockdown locks.

Some takeaways

  • The EU-US Privacy Shield arrangement for transferring personal data to the US is declared invalid.
  • Parties using Standard Contractual Clauses to transfer personal data from the EEA to countries outside it must not do so if, in their assessment, the recipient country doesn’t provide an adequate level of protection. There must now be serious questions as to whether any transfers to the US can be valid.
  • The Binding Corporate Rules regime used by some of the world’s biggest international groups must now also be open to challenge.
  • Data Protection Authorities (such as the ICO) must intervene to stop transfers under SCCs which are made to countries without an adequate level of protection.
  • Post-Brexit UK may be seen as an attractive place for US companies to base operations, but there may well be further legal challenges to such arrangements.

Leave a comment

Filed under adequacy, Data Protection, Directive 95/46/EC, Europe, facebook, GDPR, Information Commissioner, Ireland, national security, privacy shield, surveillance

There’s nothing like consistency

A tale of two Member States, and two supervisory authorities.

First, the Belgian Data Protection Authority is reported to have fined a controller €50,000 for, among other infringements, appointing its director of audit, risk and compliance as its Data Protection Officer (DPO). This was – the DPA appears to have said – a conflict of interest, and therefore an infringement of Article 38(6) of the General Data Protection Regulation (GDPR).

Second (and bearing in mind that all cases turn on their specific facts), one notes that, in the UK, the Data Protection Officer for the Information Commissioner’s Office (ICO) is its Head of Risk and Governance.

Let’s speculate –

Are the tasks of a Head of Risk and Governance likely to be similar to those of a director of audit, risk and compliance?

Would the Belgian DPA take the view that its UK equivalent is infringing the GDPR, by appointing as DPO someone in circumstances which create a conflict of interest? (The ICO notably says “[In respect of the combined roles of] DPO and Head of Risk and Governance, the tasks and focus of each role complement each other, and do not conflict. Neither responsibility is focused on determining the purposes and means of processing personal data but are both focused on providing advice about the risks, mitigations, safeguards and solutions required to ensure our processing is compliant and supported by our business decisions”).

What view would the European Data Protection Board take, if asked to consider the matter under the GDPR consistency mechanism (for instance on receipt of a request for an Opinion, under Article 64(2))?

Does it matter, given Brexit?

And if it doesn’t matter immediately, might the status and position of the ICO’s DPO be one of the factors the European Commission might subsequently take into account, when deciding whether post-Brexit UK has an adequate level of protection, as a third country?

No answers folks, just questions.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under adequacy, Brexit, consistency, Data Protection, Europe, GDPR, Information Commissioner

€9.5m GDPR fine to German telco for insecure customer authentication

Another post by me on the Mishcon de Reya website – the German federal data protection regulator issues a fine for Article 32 failings after callers could give a customer’s name and d.o.b. and obtain further information.

Leave a comment

Filed under Data Protection, Europe, GDPR, monetary penalty notice

No direct liability under GDPR for representatives, says EDPB

I have a new post on the Mishcon de Reya website, drawing attention to a change from draft to agreed EDPB guidance which might make being a GDPR representative much more attractive.

Leave a comment

Filed under EDPB, EU representative, Europe, GDPR

Data protection and fake pornography

Wired’s Matt Burgess has written recently about the rise of fake pornography created using artificial intelligence software, something that I didn’t know existed (and now rather wish I hadn’t found out about):

A small community on Reddit has created and fine-tuned a desktop application that uses machine learning to morph non-sexual photos and transplant them seamlessly into pornographic videos.

The FacesApp, created by Reddit user DeepFakesApp, uses fairly rudimental machine learning technology to graft a face onto still frames of a video and string a whole clip together. To date, most creations are short videos of high-profile female actors.

The piece goes on to discuss the various potential legal restrictions or remedies which might be available to prevent or remove content created this way. Specifically within a UK context, Matt quotes lawyer Max Campbell:

“It may amount to harassment or a malicious communication,” he explains. “Equally, the civil courts recognise a concept of ‘false privacy’, that is to say, information which is false, but which is nevertheless private in nature.” There are also copyright issues for the re-use of images and video that wasn’t created by a person.

However, what I think this analysis misses is that the manipulation of digital images of identifiable individuals lands this sort of sordid practice squarely in the field of data protection. Data protection law relates to “personal data” – information relating to an identifiable person – and “processing” thereof. “Processing” is (inter alia)

any operation…which is performed upon personal data, whether or not by automatic means, such as…adaptation or alteration…disclosure by transmission, dissemination or otherwise making available…

That pretty much seems to encapsulate the activities being undertaken here. The people making these videos would be considered data controllers (persons who determine the purposes and means of the processing), and subject to data protection law, with the caveat that, currently, European data protection law, as a matter of general principle, only applies to processing undertaken by controllers established in the European Union. (In passing, I would note that the exemption for processing done in the course of a purely personal or household activity would not apply to the extent that the videos are being distributed and otherwise made public).

Personal data must be processed “fairly”, and, as a matter of blinding obviousness, it is hard to see any way in which the processing here could conceivably be fair.

Whether victims of this odious sort of behaviour will find it easy to assert their rights, or bring claims, against the creators is another matter. But it does seem to me to be the case here, unlike in some other cases, that (within a European context/jurisdiction) data protection law potentially provides a primary initial means of confronting the behaviour.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, Europe, fairness, Uncategorized

Anti-EU campaign database – in contravention of data protection laws?

The politics.co.uk site reports that an anti-EU umbrella campaign called Leave.EU (or is it theknow.eu?) has been written to by the Information Commissioner’s Office (ICO) after allegedly sending unsolicited emails to people who appear to have been “signed up” by friends or family. The campaign’s bank-roller, UKIP donor Aaron Banks, reportedly said

We have 70,000 people registered and people have been asked to supply 10 emails of friends or family to build out (sic) database

Emails sent to those signed up in this way are highly likely to have been sent in breach of the campaign’s obligations under the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), and the ICO is reported to have written to the campaign to

inform them of their obligations under the PECR and to ask them to suppress [the recipient’s] email address from their databases

But is this really the main concern here? Or, rather, should we (and the ICO) be asking what on earth is a political campaign doing building a huge database of people, and identifying them as (potential) supporters without their knowledge? Such concerns go to the very heart of modern privacy and data protection law.

Data protection law’s genesis lies, in part, in the desire, post-war, of European nations to ensure “a foundation of justice and peace in the world”, as the preamble to the European Convention on Human Rights states. The first recital to the European Community Data Protection Directive of 1995 makes clear the importance of those fundamental rights to data protection law.

The Directive is, of course, given domestic effect by the Data Protection Act 1998 (DPA). Section 2 of that Act states that information as to someone’s political beliefs is her personal data: I would submit that presence on a database purporting to show that someone supports the UK’s withdrawal from the European Union is also her personal data. Placing someone on that database, without her knowledge or ability to object, will be manifestly “unfair” when it comes to compliance with the first data protection principle. It may also be inaccurate, when it comes to compliance with the fourth principle.

I would urge the ICO to look much more closely at this – the compiling of (query inaccurate) secret databases of people’s political opinions has very scary antecedents.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under accuracy, Data Protection, Directive 95/46/EC, Europe, human rights, Information Commissioner

Google’s Innuendo

If you search on Google for my name, Jon Baines, or the full version, Jonathan Baines, you see, at the foot of the page of search results

Some results may have been removed under data protection law in Europe. Learn more

Oh-ho! What have I been up to recently? Well, not much really, and certainly nothing that might have led to results being removed under data protection law. Nor, similarly, have John Keats, Eleanor Roosevelt and Nigel Molesworth (to pick a few names at random), a search on all of whose names brings up the same message. And, of course, if you click the hyperlink marked by the words “Learn more” you find out in fact that Google has simply set its algorithms to display the message in Europe

when a user searches for most names, not just pages that have been affected by a removal.

It is a political gesture – one that reflects Google’s continuing annoyance at the 2014 decision – now forever known as “Google Spain” – of the Court of Justice of the European Union which established that Google is a data controller for the purpose of search returns containing personal data, and that it must consider requests from data subjects for removal of such personal data. A great deal has been written about this, some bad and some good (a lot of the latter contained in the repository compiled by Julia Powles and Rebekah Larsen) and I’m not going to try to add to that, but what I have noticed is that a lot of people see this “some results may have been removed” message, and become suspicious. For instance, this morning, I noticed someone tweeting to the effect that the message had come up on a search for “Chuka Umunna”, and their supposition was that this must relate to something which would explain Mr Umunna’s decision to withdraw from the contest for leadership of the Labour Party. A search on Twitter for “some results may have” returns a seething mass of suspicion and speculation.

Google is conducting an unnecessary exercise in innuendo. It could easily rephrase the message (“With any search term there is a possibility that some results may have been removed…”) but chooses not to do so, no doubt because it wants to undermine the effect of the CJEU’s ruling. It’s shoddy, and it drags wholly innocent people into its disagreement.

Furthermore, there is an argument that the exercise could be defamatory. I am not a lawyer, let alone a defamation lawyer, so I will leave it to others to consider that argument. However, I do know a bit about data protection, and it strikes me that, following Google Spain, Google is acting as a data controller when it processes a search on my name, and displays a list of results with the offending “some results may have been removed” message. As a data controller it has obligations, under European law (and UK law), to process my personal data “fairly and lawfully”. It is manifestly unfair, as well as wrong, to insinuate that information relating to me might have been removed under data protection law. Accordingly, I’ve written to Google, asking for the message to be removed

Google UK Ltd
Belgrave House
76 Buckingham Palace Road
London SW1W 9TQ

16 May 2015

Dear Google

Complaint under Data Protection Act 1998

When a search is made on Google for my name “Jonathan Baines”, and, alternatively, “Jon Baines”, a series of results are returned, but at the foot of the page a message (“the message”) is displayed:

Some results may have been removed under data protection law in Europe. Learn more

To the best of my knowledge, no results have in fact been removed.

The first principle in Schedule One of the Data Protection Act 1998 (DPA) requires a data controller to process personal data fairly and lawfully. In the circumstances I describe, “Jonathan Baines”, “Jon Baines” and the message constitute my personal data, of which you are clearly data controller.

It is unfair to suggest that some results may have been removed under data protection law. This is because the message carries an innuendo that what may have been removed was content that was embarrassing, or that I did not wish to be returned by a Google search. This is not the case. I do not consider that the hyperlink “Learn more” nullifies the innuendo: for instance, a search on Twitter for the phrase “some results may have been removed” provides multiple examples of people assuming the message carries an innuendo meaning.

Accordingly, please remove the message from any page containing the results of a search on my name Jonathan Baines, or Jon Baines, and please confirm to me that you have done so. You are welcome to email me to this effect at [REDACTED]

With best wishes,
Jon Baines

 

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with. Some words may have been removed under data protection law.

7 Comments

Filed under Data Protection, Europe

One for the Environmental Information Regulations + Data Protection nerds

In 2010 the Court of Justice of the European Union (CJEU) held that, insofar as they required the automatic publication of the name and other particulars of natural persons (as opposed to legal persons) of beneficiaries of funds deriving from the European Agricultural Guarantee Fund (EAGF) and the European Agricultural Fund for Rural Development (EAFRD), certain articles of European Council Regulation (EC) No 1290/2005 of 21 June 2005 on the financing of the common agricultural policy were invalid. This was because they imposed an obligation to publish personal data relating to these beneficiaries (who might be private individuals or sole traders) without permitting criteria such as the periods, frequency and amounts involved to be considered.

Rip-roaring start to a blog post eh?

In the words of the First-tier Tribunal (Information Rights) (FTT) which has recently had to consider the impact of those CJEU cases on an Environmental Information Regulations 2004 (EIR) case

[the CJEU] ruled that such a requirement for publication was incompatible with an individual’s right for privacy where the agreement holder concerned was a private individual or sole trade

The relevance of the European judgments was that Natural England, which had until 2010 published information about beneficiaries of funds granted to farmers and landowners under the European Stewardship Agreement (ESA), even when it consisted of personal data of private individual or sole trader beneficiaries, ceased such automatic publication and removed previously published information from its website. This was despite the fact that applicants for an ESA had, until 2010, been given a privacy notice in a handbook which explained that the information would be published, and had signed a declaration accepting the requirements.

Notwithstanding this, when it received a request for agreements reached with farmers and landowners in the River Avon flood plains area, Natural England decided that the personal data of the beneficiary (there appears to have just been one) was exempt from disclosure under regulations 12(3) and 13 of the EIR (which broadly provide an exception to the general obligation under the EIR to disclose information if the information in question is personal data disclosure of which would be in breach of the public authority’s obligations under the Data Protection Act 1998 (DPA)).

The Information Commissioner’s Office had agreed, saying

although consent for disclosure has been obtained [by virtue of the applicant’s declaration of acceptance of the handbook’s privacy notice], circumstances have changed since that consent was obtained. As Natural England’s current practice is not to publish the names of those who have received grants with the amounts received, the Commissioner is satisfied that the expectation of the individuals concerned will be that their names and payments will not be made public.

However, the FTT was not convinced by this. Although it accepted that it was possible “that the applicant no longer expected the relevant personal data to be disclosed” it considered whether this would nevertheless be a reasonable expectation, and it also took into account that the effect of the CJEU’s decision had not been expressly to prohibit disclosure (but rather to strike down the requirement of automatic publication):

When one combined the facts that an express consent had been given, that there had been no publicity by NE or mention on its website of the ECJ decision and finally, that the effect of that decision had not, in the event been to prohibit disclosure, [the FTT] concluded that such an expectation would not be reasonable

Furthermore, given that there was no real evidence that disclosure would cause prejudice or distress to the applicant, given that some identifying information had already been disclosed into the public domain and given that there was a legitimate interest – namely “accountability in the spending of public monies” – in the information being made public (and disclosure was necessary to meet this legitimate interest) the disclosure was both fair and supported by a permitting condition in Schedule 2 of the DPA. For these reasons, disclosure would not, said the FTT, breach Natural England’s obligation to process personal data fairly under the first data protection principle.

So maybe not the most ground-breaking of cases, but it is relatively rare that an FTT disagrees with the ICO and orders disclosure of personal data under the EIR (or FOI). The ICO is, after all, the statutory regulator of the DPA, and its views on such matters will normally be afforded considerable weight by any subsequent appellate body.

Leave a comment

Filed under Data Protection, Environmental Information Regulations, Europe, Freedom of Information, Information Commissioner, Information Tribunal

Do bloggers need to register with the ICO?

A strict reading of data protection law suggests many (if not all) bloggers should register with the ICO, even though the latter disagrees. And, I argue, the proposal for an Information Rights Levy runs the risk of being notification under a different name

Part III of the Data Protection Act 1998 (DPA) gives domestic effect to Article 18 of the European Data Protection Directive (the Directive). It describes the requirement that data controllers notify the fact that they are processing personal data, and the details of that processing, to the Information Commissioner’s Office (ICO). It is, on one view, a rather quaint throwback to the days when processing of personal data was seen as an activity undertaken by computer bureaux (a term found in the predecessor Data Protection Act 1984). However, it is law which is very much in force, and processing personal data without a valid notification, in circumstances where the data controller had an obligation to notify, is a criminal offence (section 21(1) DPA). Moreover, it is an offence which is regularly prosecuted by the ICO (eleven such prosecutions so far this year).

These days, it is remarkably easy to find oneself in the position of being a data controller (“a person who (either alone or jointly or in common with other persons) determines the purposes for which and the manner in which any personal data are, or are to be, processed”). There are, according to the ICO, more than 370,000 data controllers registered. Certainly, if you are a commercial enterprise which in any way electronically handles personal data of customers or clients it is almost inevitable that you will be a data controller with an obligation to register. The exemptions to registering are laid out in regulations, and are quite restrictive – they are, in the main, the following (wording taken from the ICO Notification Handbook):

  • Data controllers who only process personal information for: staff administration (including payroll); advertising, marketing and public relations (in connection with their own business activity); and accounts and records.
  • Some not-for-profit organisations.
  • Maintenance of a public register.
  • Processing personal information for judicial functions.
  • Processing personal information without an automated system such as a computer.
But there is one other, key exemption. This is not within the notification regulations, but at section 36 of the DPA itself, and it exempts personal data from the whole of the Act if it is

processed by an individual only for the purposes of that individual’s personal, family or household affairs (including recreational purposes)

Thus, if you, for instance, keep a record of your children’s medical histories on your home computer, you are not caught by any of the DPA (and not required to notify the ICO).

Where this becomes interesting (it does become interesting, honestly) is when the very expansive interpretation the ICO gives to this “domestic purposes exemption” is considered in view of the extent to which people’s domestic affairs – including recreational purposes – now take place in a more public sphere, whereby large amounts of information are happily published by individuals on social media. As I have written elsewhere, the Court of Justice of the European Union (CJEU) held in 2003, in the Lindqvist case, that the publishing of information on the internet could not be covered by the relevant domestic purposes exemption in the Directive. The ICO and the UK have, ever since, been in conflict with this CJEU authority, a point illustrated by the trenchant criticism delivered by Tugendhat J in the High Court judgment in The Law Society v Kordowski.

But I think there is an even more stark illustration of the implications of an expansive interpretation of the section 36 exemption, and I provide it. On this blog I habitually name and discuss identifiable individuals – this is processing of personal data, and I determine the purposes for which, and the manner in which, this personal data is processed. Accordingly, I become a data controller, according to the definitions at section 1(1) of the DPA. So, do I need to notify my processing to the ICO? The answer, according to the ICO, is “no”. They tell me

from the information you have provided it would be unlikely that you would be required to register in respect of your blogs and tweets

But I don’t understand this. I cannot see any exemption which applies to my processing – unless it is section 36. But in what way can I seriously claim that I am processing personal data only for my domestic (including recreational) purposes? Yes, blogging about information rights is partly a recreation to me (some might say that makes me odd) but I cannot pretend that I have no professional aims and purposes in doing so. Accordingly, the processing cannot only be for domestic purposes.

I have asked the ICO to confirm what, in their view, exempts me from notification. I hope they can point me to something I have overlooked, because, firstly, anything that avoids my having to pay an annual notification fee of £35 would be welcome, and secondly, I find it rather uncomfortable to be on the receiving end of my own personal analysis that I’m potentially committing a criminal offence, even if the lead prosecutor assures me I’m not.

The point about the notification fee leads me on to a further issue. As I say above, notification is in some ways rather quaint – it harks back to days when processing of personal data was a specific, discrete activity, and looks odd in a world where, with modern technology, millions of activities every day meet the definition of “processing personal data”. No doubt for these reasons, the concept of notification with a data protection authority is missing from the draft General Data Protection Regulation (GDPR) currently slouching its way through the European legislative process. However, a proposal by the ICO suggests that, at least in the domestic sphere, notification (in another guise) might remain under new law.

The ICO, faced with the fact that its main funding stream (the annual notification fees from those 370,000-plus data controllers) would disappear if the GDPR is passed in its proposed form, is lobbying for an “information rights levy”. Christopher Graham said earlier this year

I would have thought an information rights levy, paid for by public authorities and data controllers [is needed]. We would be fully accountable to Parliament for our spending.

and the fact that this proposal made its way into the ICO’s Annual Report, with Graham saying that Parliament needs to “get on with the task” of establishing the levy, suggests that it might well be something the Ministry of Justice agrees with. As the MoJ would be first in line to have to make up the funding shortfall if a levy wasn’t introduced, it is not difficult to imagine it becoming a reality.

On one view, a levy makes perfect sense – a “tax” on those who process personal data. But looked at another way, it will potentially become another outmoded means of defining what a data controller is. One cannot imagine that, for instance, bloggers and other social media users will be expected to pay it, so it is likely that, in effect, those data controllers whom the ICO currently expects to notify will be those who are required to pay the levy. One imagines, also, that pour encourager les autres, it might be made a criminal offence not to pay the levy in circumstances where a data controller should pay it but fails to do so. In reality, will it just be a mirror-image of the current notification regime?

And will I still be analysing my own blogging as being processing that belongs to that regime, but with the ICO, for pragmatic, if not legally sound, reasons, deciding the opposite?

1 Comment

Filed under Data Protection, Directive 95/46/EC, Europe, GDPR, parliament