Tag Archives: data protection

Data Protection distress compensation for CCTV intrusion

The Information Commissioner’s Office (ICO) recently (2 February) successfully prosecuted a business owner for operating CCTV without an appropriate notification under section 18 of the Data Protection Act 1998 (DPA), announcing:

Businesses could face fines for ignoring CCTV data protection law

But a recent case in the Scottish Sheriff Court shows that CCTV and data protection can also have relevance in private law civil proceedings. In Woolley against Akbar [2017] ScotsSC 7 the husband and wife pursuers (equivalent to claimants in England and Wales) successfully brought a claim for compensation for distress caused by the defender’s (defendant in England and Wales) use of CCTV cameras which were continuously recording video and audio, and which were deliberately set to cover the pursuers’ private property (their garden area and the front of their home). Compensation was assessed at £8,634 for each of the pursuers (so £17,268 in total), with costs to be assessed at a later date.
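As a quick check of the figures reported in the judgment (a sketch only – the per-pursuer sum is taken from the report above, not derived independently):

```python
# Hypothetical arithmetic check of the award as reported above.
per_pursuer = 8634  # £ awarded to each pursuer, per the judgment
pursuers = 2        # Mr and Mrs Woolley

total = per_pursuer * pursuers
print(total)  # 17268 - matching the £17,268 total reported
```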

Two things are of particular interest to data protection fans: firstly, the willingness of the court to rule unequivocally that CCTV operated in non-compliance with the DPA Schedule One principles was unlawful; and secondly, the award of compensation despite the absence of physical damage.

The facts were that Mr and Mrs Woolley own and occupy the upper storey of a dwelling place, while Mrs Akbar owns and operates the lower storey as a guest house, managed by her husband Mr Akram. In 2013 the relationship between the parties broke down. Although both parties have installed CCTV systems, the pursuers’ system only monitors their own property, but this was not the case with the defender’s:

…any precautions to ensure that coverage of the pursuers’ property was minimised or avoided. The cameras to the front of the house record every person approaching the pursuers’ home. The cameras to the rear were set deliberately to record footage of the pursuers’ private garden area. There was no legitimate reason for the nature and extent of such video coverage. The nature and extent of the camera coverage were obvious to the pursuers, as they could see where the cameras were pointed. The coverage was highly intrusive…the defender also made audio recordings of the area around the pursuers’ property…they demonstrated an ability to pick up conversations well beyond the pursuers’ premises. There are four audio boxes. The rear audio boxes are capable of picking up private conversations in the pursuers’ rear garden. Mr Akram, on one occasion, taunted the pursuers about his ability to listen to them as the pursuers conversed in their garden. The defender and Mr Akram were aware of this at all times, and made no effort to minimise or avoid the said audio recording. The nature of the coverage was obvious to the pursuers. Two audio boxes were installed immediately below front bedroom windows. The pursuers feared that conversations inside their home could also be monitored. The said coverage was highly intrusive.

Although, after the intervention of the ICO, the defender realigned the camera at the rear of the property, Sheriff Ross held that the coverage “remains intrusive”. Fundamentally, the sheriff held that the CCTV use was: unfair (in breach of the first data protection principle); excessive in terms of the amount of data captured (in breach of the third data protection principle); and retained for too long (in breach of the fifth data protection principle).

The sheriff noted that, by section 13(2) of the DPA, compensation for distress can only be awarded if the pursuer has suffered “damage”, which was not the case here. However, the sheriff further correctly noted, and was no doubt taken to, the decision of the Court of Appeal in Vidal-Hall & Ors v Google [2015] EWCA Civ 311 in which the court struck down section 13(2) as being incompatible with the UK’s obligations under the European data protection directive and the Charter of Fundamental Rights (my take on Vidal Hall is here). Accordingly, “pure” distress compensation was available.

Although the facts here show a pretty egregious breach of DPA, it is good to see a court understanding and assessing the issues so well, no doubt assisted in doing so by Paul Motion, of BTO Solicitors, who appeared for the pursuers.

One niggle I do have is about the role of the ICO in all this: they were clearly apprised of the situation, and could surely have taken enforcement action to require the stopping of the CCTV (although admittedly ICO cannot make an award of compensation). It’s not clear to me why they didn’t.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

1 Comment

Filed under damages, Data Protection, Information Commissioner

Get rights right, gov.uk

Government page on subject access rights is not accurate

Right of access to data about oneself is recognised as a fundamental right (article 8(2) of the Charter of Fundamental Rights of the European Union). Section 7 of the UK’s Data Protection Act 1998 (DPA) gives expression to this, and provides, as a general right, that individuals are entitled to be told whether someone else is processing their data, and why, and furthermore (in terms) to be given a copy of that data. The European General Data Protection Regulation retains and bolsters this right, and recognises its importance by placing it in the category of provisions non-compliance with which could result in an administrative fine for a data controller of up to €20m or 4% of turnover (whichever is higher).
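The “whichever is higher” cap can be illustrated with a minimal sketch (the turnover figures below are hypothetical; under the GDPR the 4% applies to total worldwide annual turnover of the preceding financial year):

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper limit for the top tier of GDPR administrative fines:
    the higher of EUR 20m or 4% of annual turnover."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# A controller with EUR 100m turnover: 4% is only EUR 4m, so the EUR 20m floor applies.
print(max_gdpr_fine(100_000_000))    # 20000000.0
# A controller with EUR 1bn turnover: 4% (EUR 40m) exceeds the floor.
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
```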

So subject access is important, and this is reflected in the fact that it is almost certainly the most litigated provision of the DPA (itself a surprisingly under-litigated piece of legislation). Many data controllers need to commit significant resources to comply with it, and the Information Commissioner’s Office (ICO) produced a statutory code of practice on the subject in 2014.

But it is not an absolute right. The DPA explains that there are exemptions to the right where, for instance, compliance would be likely to prejudice the course of criminal justice, or national security, or, in the case of health and social care records, would be likely to cause serious harm to the data subject or another person. Additionally the DPA recognises that, where complying with a subject access request would involve disclosing information about another individual, the data controller should not comply unless that other person consents, or unless it “is reasonable in all the circumstances to comply with the request without the consent of the other individual” (section 7(4) DPA).

But this important caveat (the engagement of the parallel rights of third parties) to the right of subject access is something which is almost entirely omitted in the government’s own web guidance regarding access to CCTV footage of oneself. It says

The CCTV owner must provide you with a copy of the footage that you can be seen in. They can edit the footage to protect the identities of other people.

The latter sentence is true, and especially in the case where footage captures third parties it is often appropriate to take measures to mask their identities. But the first sentence is simply not true. And I think it is concerning that “the best place to find government services and information” (as gov.uk describes itself) is wrong in its description of a fundamental right.

A data controller (let’s ignore the point that a “CCTV owner” might not necessarily be the data controller) does not have an unqualified obligation to provide information in response to a subject access request. As anyone working in data protection knows, the obligation is qualified by a number of exemptions. The page does allude to one of these (at section 29 of the DPA):

They can refuse your request if sharing the footage will put a criminal investigation at risk

But there are others – and the ICO has an excellent resource explaining them.

What I don’t understand is why the gov.uk page fails to provide better (accurate) information, and why it doesn’t provide a link to the ICO site. I appreciate that the terms and conditions of gov.uk make clear that there is no guarantee that information is accurate, but I think there’s a risk here that data subjects could form unreasonable expectations of their rights, and that this could lead to unnecessary grievances or disputes with data controllers.

Gov.uk invites comments about its content, and I will be taking up that invitation. I hope the page will be amended.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Leave a comment

Filed under Data Protection, Information Commissioner, subject access

A Massive Impact for the ICO?

[Edited to add: it is well worth reading the comments to this piece, especially the ones from Chris Pounder and Reuben Binns]

I needed a way to break a blogging drought, and something that was flagged up to me by a data protection colleague (thanks Simon!) provides a good opportunity to do so. It suggests that the drafting of the GDPR could lead to an enormous workload for the ICO.

The General Data Protection Regulation (GDPR) which entered into force on 24 May this year, and which will apply across the European Union from 25 May 2018, mandates the completion of Data Protection Impact Assessments (DPIAs) where indicated. Article 35 of the GDPR explains that

Where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data

In the UK (and indeed elsewhere) we already have the concept of “Privacy Impact Assessments”, and in many ways all that the GDPR does is embed this area of good practice as a legal obligation. However, it also contains some ancillary obligations, one of which is to consult the supervisory authority, in certain circumstances, prior to processing. And here is where I get a bit confused.

Article 36 provides that

The controller shall consult the supervisory authority prior to processing where a data protection impact assessment under Article 35 indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate the risk
[emphasis added]

A close reading of Article 36 results in this: if the data controller conducts a DPIA, and is of the view that, if mitigating measures were not in place, the processing would be high risk, it will have to consult the supervisory authority (in the UK, the Information Commissioner’s Office (ICO)). This is odd: it effectively renders any mitigating measures irrelevant. And it appears directly to contradict what recital 84 says

Where a data-protection impact assessment indicates that processing operations involve a high risk which the controller cannot mitigate by appropriate measures in terms of available technology and costs of implementation, a consultation of the supervisory authority should take place prior to the processing [emphasis added]

So, the recital says the obligation to consult will arise where high risk is involved which the controller can’t mitigate, while the Article says the obligation will arise where high risk is involved notwithstanding any mitigation in place.

Clearly, the Article contains the specific legal obligation (the recital purports to set out the reason for the contents of the enacting terms), so the law will require data controllers in the UK to consult the ICO every time a DPIA identifies an inherently high risk processing activity, even if the data controller has measures in place fully to mitigate and contain the risk.

For example, let us imagine the following processing activity – the collection and storage of customer financial data for the purposes of fulfilling a web transaction. The controller might have robust data security measures in place, but Article 36 requires it to consider: “what if those robust measures were not in place? Would the processing be high risk?” To which the answer would have to be “yes” – because the customer data would be completely unprotected.

In fact, I would submit that, if Article 36 is given its plain meaning, virtually any processing activity involving personal data would, in the absence of mitigating measures, be high risk, and so create a duty to consult the ICO.
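The gap between the two readings can be sketched as a pair of hypothetical decision rules (the function names and boolean inputs are mine, not the GDPR’s):

```python
def must_consult_article_36(high_risk_absent_mitigation: bool) -> bool:
    # Plain reading of Article 36: mitigation is ignored; consult whenever
    # the processing would be high risk in the absence of mitigating measures.
    return high_risk_absent_mitigation

def must_consult_recital_84(high_risk_absent_mitigation: bool,
                            risk_fully_mitigated: bool) -> bool:
    # Reading suggested by recital 84: consult only where the controller
    # cannot mitigate the high risk by appropriate measures.
    return high_risk_absent_mitigation and not risk_fully_mitigated

# The web-transaction example above: inherently high risk, but fully mitigated.
print(must_consult_article_36(True))         # True  -> duty to consult the ICO
print(must_consult_recital_84(True, True))   # False -> no duty to consult
```

On the Article 36 reading, the first function always fires for inherently risky processing, however good the controller’s safeguards – which is the source of the “massive influx of work” concern.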

What this will mean in practice remains to be seen, but unless I am missing something (and I’d be delighted to be corrected if so), the GDPR is setting the ICO and other supervisory authorities up for a massive influx of work. With questions already raised about the ICO’s funding going forward, that is the last thing they are likely to need.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

9 Comments

Filed under Data Protection, GDPR, Information Commissioner

Blackpool Displeasure Breach, redux

Over a year ago I blogged about a tweet by a member of the Oyston family connected with Blackpool FC:

a fan replies to a news item about the club’s manager, and calls the Oyston family “wankers”. Sam Oyston responds by identifying the seat the fan – presumably a season-ticket holder – occupies, and implies that if he continues to be rude the ticket will be withdrawn

For the reasons in that post I thought this raised interesting, and potentially concerning, data protection issues, and I mentioned that the Information Commissioner’s Office (ICO) had powers to take action. It was one of the most-read posts on this blog (perhaps the most read – showing, weirdly, that football is possibly of more interest to most people than data protection itself), and it seemed that some people did intend complaining to the ICO. So, recently, I made an FOI request to the ICO for any information held by them concerning Blackpool FC’s data protection compliance. This was the reply

We have carried out thorough searches of the information we hold and have identified one instance where a member of the public raised concerns with the ICO in September 2014, about the alleged processing of personal data by Blackpool FC.

We concluded that there was insufficient evidence to consider the possibility of a s55 offence under the Data Protection Act 1998 (the DPA), and were unable to make an assessment as the individual had not yet raised their concerns with Blackpool FC direct.  We therefore advised the individual to contact the Club and to come back to us if they were still concerned, however we did not hear from them again.  As such, no investigation took place, nor was any assessment made of the issues raised.

This suggests that the ICO wrongly considers itself unable to undertake section 42 assessments under the Data Protection Act 1998 unless the data subject has complained to the data controller – a stance strongly criticised by Dr David Erdos on this blog, and one which has the potential to put the data subject further in dispute with the data controller (as I can imagine could have happened here, with a family some of whose members are ready to sue to protect their reputation). It also suggests, though, that maybe people weren’t quite as interested as the page views suggested. Nonetheless, I am posting this brief update, because a few people asked about it.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, Information Commissioner

When data security = national security

One of the options open to the Information Commissioner’s Office (ICO), when considering whether to take enforcement action under the Data Protection Act 1998 (DPA) is – as an alternative to such action – to invite an offending data controller to sign an “undertaking”, which will in effect informally commit it to taking, or desisting from, specified actions. An undertaking is a relatively common event (there have been fifty over the last year) – so much so that the ICO has largely stopped publicising them (other than uploading them to its website) – very rarely is there a press release or even a tweet.

There is a separate story to be explored about both ICO’s approach to enforcement in general, and to its approach to publicity, but I thought it was worth highlighting a rather remarkable undertaking uploaded to the ICO’s site yesterday. It appears that the airline Flybe reported itself to the ICO last November, after a temporary employee managed to scan another individual’s passport, and email it to his (the employee’s) personal email account. The employee in question was in possession of an “air side pass”. Such a pass allows an individual to work unescorted in restricted areas of airports and clearly implies a level of security clearance. The ICO noted, however, that

Flybe did not provide data protection training for all staff members who process personal data. This included the temporary member of staff involved in this particular incident…

This is standard stuff for DPA enforcement: lack of training for staff handling personal data will almost always land the data controller in hot water if something goes wrong. But it’s what follows that strikes me as remarkable

the employee accessed various forms of personal data as part of the process to issue air side passes to Flybe’s permanent staff. This data included copies of passports, banking details and some information needed for criminal record background checks. The Commissioner was concerned that such access had been granted without due consideration to carrying out similar background checks to those afforded to permanent employees. Given the nature of the data to which the temporary employee had access, the Commissioner would have expected the data controller to have had some basic checking controls in place.

Surely this raises concerns beyond the data protection arena? Data protection does not exist in isolation from a broader security context. If it was really the case that basic checking controls were not in place regarding Flybe’s temporary employees and data protection, might it raise concerns about how that impacts on national security?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, Information Commissioner, national security, undertaking

Anti-EU campaign database – in contravention of data protection laws?

The politics.co.uk site reports that an anti-EU umbrella campaign called Leave.EU (or is it theknow.eu?) has been written to by the Information Commissioner’s Office (ICO) after it allegedly sent unsolicited emails to people who appear to have been “signed up” by friends or family. The campaign’s bank-roller, UKIP donor Aaron Banks, reportedly said

We have 70,000 people registered and people have been asked to supply 10 emails of friends or family to build out (sic) database

Emails sent to those signed up in this way are highly likely to have been sent in breach of the campaign’s obligations under the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), and the ICO is reported to have written to the campaign to

inform them of their obligations under the PECR and to ask them to suppress [the recipient’s] email address from their databases

But is this really the main concern here? Or, rather, should we (and the ICO) be asking what on earth is a political campaign doing building a huge database of people, and identifying them as (potential) supporters without their knowledge? Such concerns go to the very heart of modern privacy and data protection law.

Data protection law’s genesis lies, in part, in the desire, post-war, of European nations to ensure “a foundation of justice and peace in the world”, as the preamble to the European Convention on Human Rights states. The first recital to the European Community Data Protection Directive of 1995 makes clear the importance of those fundamental rights to data protection law.

The Directive is, of course, given domestic effect by the Data Protection Act 1998 (DPA). Section 2 of that Act provides that information as to someone’s political beliefs is her personal data: I would submit that presence on a database purporting to show that someone supports the UK’s withdrawal from the European Union is also her personal data. Placing someone on that database, without her knowledge or ability to object, will be manifestly “unfair” when it comes to compliance with the first data protection principle. It may also be inaccurate, when it comes to compliance with the fourth principle.

I would urge the ICO to look much more closely at this – the compiling of (query inaccurate) secret databases of people’s political opinions has very scary antecedents.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under accuracy, Data Protection, Directive 95/46/EC, Europe, human rights, Information Commissioner

Big Brother is misleading you

The best books… are those that tell you what you know already…

Big Brother Watch (BBW) is a campaigning organisation, a spin-off from the right-wing lobby group The Taxpayers’ Alliance, described as a “poorly disguised Conservative front”, a large part of whose funds come “from wealthy donors, many of whom are prominent supporters of the Conservative party“. To an extent, that doesn’t matter to me: BBW has done a lot to highlight privacy issues which chime with some of my own concerns – eg excessive use of CCTV, biometrics in schools – but regularly they rail against local authority “data breaches” in a way I think is both unhelpful and disingenuous.

The latest example is a report issued this week (on 11th August 2015) entitled “A Breach of Trust – how local authorities commit 4 data breaches every day”. Martin Hoskins has already done an excellent job in querying and critiquing the findings

At first glance, it looks impressive. It’s almost 200 pages long. But, and this is a big but, there are only a few pages of analysis – once you get past page 12, a series of annexes contain the responses from each local authority, revealing how minor the vast majority of the reported incidents (occurring between April 2011 and April 2014) actually were.

BBW started work on this report by submitting FOI requests to each local authority in June 2014. Quite why it has taken so long to publish the results, bearing in mind that FOI requests should be answered within 20 working days, is beyond me. Although BBW claims to have received a 98% response rate, some 212 authorities either declined to provide information, or claimed that they had experienced no data breaches between 2011 and 2014.

But plenty of media outlets have already uncritically picked the report up and run stories such as the BBC’s “Council data security ‘shockingly lax'” and the Mail’s “Councils losing personal data four times a day”. Local news media also willingly ran stories about their local councils’ data.

However, my main criticism of this BBW report is a fundamental one: their methodology was so flawed that the results are effectively worthless. Helpfully, although at the end of the report, they outline that methodology:

A Freedom of Information request was sent to all local authorities beginning on the 9th June 2014.

We asked for the number of individuals that have been convicted for breaking the Data Protection Act, the number that had had their employment terminated as the result of a DPA breach, the number that were disciplined internally, the number that resigned during proceedings and the number of instances where no action was taken.

The FOI request itself asked for

a list of the offences committed by the individual in question

The flaw is this: individuals within an organisation cannot, in general terms, “break” or “breach” the Data Protection Act 1998 (DPA). An employee is a mere agent of his or her employer, and under the DPA the legal person with the general obligations and liabilities is the “data controller”: an employee of an organisation does not have any real status under the DPA – the employer will be the “person who determines the purposes for which and the manner in which personal data are processed”, that is, the data controller. An individual employee could, in specific terms, “break” or “breach” the DPA, but only if they committed an offence under section 55, of unlawfully obtaining etc. personal data without the consent of the data controller.

There is a huge amount of confusion, and sloppy thinking, when it comes to what is meant by a data protection “breach”, but the vast majority of the incidents BBW report on are simply incidents in which personal data has been compromised by the council in question as data controller. No determination of whether the DPA was actually contravened will have been made (if only because the function of determining whether the Act has been contravened is one which falls to the Information Commissioner’s Office, or the police, or the courts). And if BBW wanted a list of offences committed, that list would be tiny.

To an extent, therefore, those councils who responded with inaccurate information are to blame. FOI practitioners are taught (when they are well taught) to read a request carefully, and where there is uncertainty or ambiguity, to seek clarification from the requester. In this instance, I did in fact advise one local authority to do so. Regrettably, rather than clarifying their request, BBW chose not to respond, and the council is listed in the report as “no response received”, which is both unfair and untrue.

I am not saying that data security and data protection in councils is not an area of concern. Indeed, I am sure that in some places it is lax. But councils deal with an enormous amount of sensitive personal data, and mistakes and near misses will sometimes happen. Councils are encouraged to (and should be applauded for) keeping registers of such incidents. But they shouldn’t disclose those registers in response to ill-informed and badly worded FOI requests, because the evidence here is that they, and the facts, will be misleadingly represented in order to fit a pre-planned agenda.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

2 Comments

Filed under Data Protection, Freedom of Information

Dear Google…Dear ICO…

On 15 June this year I complained to Google UK. I have had no response, so I have now asked the Information Commissioner’s Office to assess the lawfulness of Google’s actions. This is my email to the ICO

Hi

I would like to complain about Google UK. On 15 June 2015 I wrote to them at their registered address in the following terms

Complaint under Data Protection Act 1998

When a search is made on Google for my name “Jonathan Baines”, and, alternatively, “Jon Baines”, a series of results are returned, but at the foot of the page a message (“the message”) is displayed:

Some results may have been removed under data protection law in Europe. Learn more

To the best of my knowledge, no results have in fact been removed.

The first principle in Schedule One of the Data Protection Act 1998 (DPA) requires a data controller to process personal data fairly and lawfully. In the circumstances I describe, “Jonathan Baines”, “Jon Baines” and the message constitute my personal data, of which you are clearly data controller.

It is unfair to suggest that some results may have been removed under data protection law. This is because the message carries an innuendo that what may have been removed was content that was embarrassing, or that I did not wish to be returned by a Google search. This is not the case. I do not consider that the hyperlink “Learn more” nullifies the innuendo: for instance, a search on Twitter for the phrase “some results may have been removed” provides multiple examples of people assuming the message carries an innuendo meaning.

Accordingly, please remove the message from any page containing the results of a search on my name Jonathan Baines, or Jon Baines, and please confirm to me that you have done so. You are welcome to email me to this effect at [redacted]”

I have had no response to this letter, and furthermore I have twice contacted Google UK’s twitter account “@googleuk” to ask about a response, but have had none.

I am now asking, pursuant to my right to do so at section 42 of the Data Protection Act 1998, for you to conduct an assessment as to whether it is likely or unlikely that the processing by Google UK has been or is being carried out in compliance with the provisions of that Act.

I note that in Case C‑131/12 the Grand Chamber of the Court of Justice of the European Union held that “when the operator of a search engine sets up in a Member State a branch or subsidiary which is intended to promote and sell advertising space offered by that engine and which orientates its activity towards the inhabitants of that Member State” then “the processing is carried out in the context of the activities of an establishment of the controller on the territory of the Member State”. I also note that Google UK’s notification to your offices under section 18 of the Data Protection Act 1998 says “We process personal information to enable us to promote our goods and services”. On this basis alone I would submit that Google UK is carrying out processing as a data controller in the UK jurisdiction.

I hope I have provided sufficient information for you to begin to assess Google UK’s compliance with its obligations under the Data Protection Act 1998, but please contact me if you require any further information.

with best wishes,

Jon Baines

Leave a comment

Filed under Data Protection, Information Commissioner

What does it take to stop Lib Dems spamming?

Lib Dems continue to breach ePrivacy law, ICO still won’t take enforcement action.

It’s not difficult: the sending of unsolicited marketing emails to me is unlawful. Regulation 22 of The Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR) and, by extension, the first and second principles in Schedule One of the Data Protection Act 1998 (DPA) make it so. The Liberal Democrats have engaged in this unlawful practice – they know it, and the Information Commissioner’s Office (ICO) knows it, because the latter recently told the former as much, and told me in turn

I have reviewed your correspondence and the [Lib Dem’s] website, and it appears that their current practices would fail to comply with the requirements of the PECR. This is because consent is not knowingly given, clear and specific….As such, we have written to the organisation to remind them of their obligations under the PECR and ensure that valid consent is obtained from individuals

But the ICO has chosen not to take enforcement action, saying to me in an email of 24th April

enforcement action is not taken routinely and it is our decision whether to take it. We cannot take enforcement action in every case that is reported to us

Of course I’d never suggested they take action in every case – I’d requested (as is my right under regulation 32 of PECR) that they take action in this particular case. The ICO also asked for the email addresses I’d used; I gave these over assuming it was for the purposes of pursuing an investigation but no, when I later asked the ICO they said they’d passed them to the Lib Dems in order that they could be suppressed from the Lib Dem mailing list. I could have done that if I wanted to. It wasn’t the point and I actually think the ICO were out of order (and contravening the DPA themselves) in failing to tell me that was the purpose.

But I digress. Failure to comply with PECR and the DPA is rife across the political spectrum and I think it’s strongly arguable that lack of enforcement action by the ICO facilitates this. And to illustrate this, I visited the Lib Dems’ website recently, and saw the following message

[Screenshot of the Lib Dems’ email sign-up message]

Vacuous and vague, I suppose, but I don’t disagree, so I entered an email address registered to me (another one I reserve for situations where I fear future spamming) and clicked “I agree”. By return I got an email saying

Friend – Thank you for joining the Liberal Democrats…

Wait – hold on a cotton-picking minute – I haven’t joined the bloody Liberal Democrats – I put an email in a box! Is this how they got their recent, and rather-hard-to-explain-in-the-circumstances “surge” in membership? Am I (admittedly using a pseudonym) now registered with them as a member? If so, that raises serious concerns about DPA compliance – wrongly attributing membership of a political party to someone is processing of sensitive personal data without a legal basis.

It’s possible that I haven’t yet been registered as such, because the email went on to say

Click here to activate your account

When I saw this I actually thought the Lib Dems might have listened to the ICO – I assumed that if I didn’t (I didn’t) “click here” I would hear no more. Not entirely PECR compliant, but a step in the right direction. But no: I’ve since received an email from the lonely Alistair Carmichael asking me to support the Human Rights Act (which I do) but to support it by joining a Lib Dem campaign. This is direct marketing of a political party, I didn’t consent to it, and its sending was unlawful.

I’ll report it to the ICO, more in hope than expectation that they will do anything. But if they don’t, I think they have to accept that a continuing failure to take enforcement action against casual abuse of privacy laws is going to lead to a proliferation of that abuse.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

2 Comments

Filed under consent, Data Protection, enforcement, Information Commissioner, marketing, PECR, spam

Google’s Innuendo

If you search on Google for my name, Jon Baines, or the full version, Jonathan Baines, you see, at the foot of the page of search results

Some results may have been removed under data protection law in Europe. Learn more

Oh-ho! What have I been up to recently? Well, not much really, and certainly nothing that might have led to results being removed under data protection law. Nor, similarly, have John Keats, Eleanor Roosevelt and Nigel Molesworth (to pick a few names at random), a search on all of whose names brings up the same message. And, of course, if you click the hyperlink marked by the words “Learn more” you find out that Google has in fact simply set its algorithms to display the message in Europe

when a user searches for most names, not just pages that have been affected by a removal.

It is a political gesture – one that reflects Google’s continuing annoyance at the 2014 decision – now forever known as “Google Spain” – of the Court of Justice of the European Union which established that Google is a data controller for the purpose of search returns containing personal data, and that it must consider requests from data subjects for removal of such personal data. A great deal has been written about this, some bad and some good (a lot of the latter contained in the repository compiled by Julia Powles and Rebekah Larsen) and I’m not going to try to add to that, but what I have noticed is that a lot of people see this “some results may have been removed” message, and become suspicious. For instance, this morning, I noticed someone tweeting to the effect that the message had come up on a search for “Chuka Umunna”, and their supposition was that this must relate to something which would explain Mr Umunna’s decision to withdraw from the contest for leadership of the Labour Party. A search on Twitter for “some results may have” returns a seething mass of suspicion and speculation.

Google is conducting an unnecessary exercise in innuendo. It could easily rephrase the message (“With any search term there is a possibility that some results may have been removed…”) but chooses not to do so, no doubt because it wants to undermine the effect of the CJEU’s ruling. It’s shoddy, and it drags wholly innocent people into its disagreement.

Furthermore, there is an argument that the exercise could be defamatory. I am not a lawyer, let alone a defamation lawyer, so I will leave it to others to consider that argument. However, I do know a bit about data protection, and it strikes me that, following Google Spain, Google is acting as a data controller when it processes a search on my name, and displays a list of results with the offending “some results may have been removed” message. As a data controller it has obligations, under European law (and UK law), to process my personal data “fairly and lawfully”. It is manifestly unfair, as well as wrong, to insinuate that information relating to me might have been removed under data protection law. Accordingly, I’ve written to Google, asking for the message to be removed

Google UK Ltd
Belgrave House
76 Buckingham Palace Road
London SW1W 9TQ

16 May 2015

Dear Google

Complaint under Data Protection Act 1998

When a search is made on Google for my name “Jonathan Baines”, and, alternatively, “Jon Baines”, a series of results is returned, but at the foot of the page a message (“the message”) is displayed:

Some results may have been removed under data protection law in Europe. Learn more

To the best of my knowledge, no results have in fact been removed.

The first principle in Schedule One of the Data Protection Act 1998 (DPA) requires a data controller to process personal data fairly and lawfully. In the circumstances I describe, “Jonathan Baines”, “Jon Baines” and the message constitute my personal data, of which you are clearly data controller.

It is unfair to suggest that some results may have been removed under data protection law. This is because the message carries an innuendo that what may have been removed was content that was embarrassing, or that I did not wish to be returned by a Google search. This is not the case. I do not consider that the hyperlink “Learn more” nullifies the innuendo: for instance, a search on Twitter for the phrase “some results may have been removed” provides multiple examples of people assuming the message carries an innuendo meaning.

Accordingly, please remove the message from any page containing the results of a search on my name Jonathan Baines, or Jon Baines, and please confirm to me that you have done so. You are welcome to email me to this effect at [REDACTED]

With best wishes,
Jon Baines


The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with. Some words may have been removed under data protection law.

7 Comments

Filed under Data Protection, Europe