Tag Archives: DPA

Blackpool Displeasure Breach, redux

Over a year ago I blogged about a tweet by a member of the Oyston family connected with Blackpool FC:

a fan replies to a news item about the club’s manager, and calls the Oyston family “wankers”. Sam Oyston responds by identifying the seat the fan – presumably a season-ticket holder – occupies, and implies that if he continues to be rude the ticket will be withdrawn

For the reasons in that post I thought this raised interesting, and potentially concerning, data protection issues, and I mentioned that the Information Commissioner’s Office (ICO) had powers to take action. It was one of (perhaps the) most read posts on this blog (showing, weirdly, that football is possibly of more interest to most people than data protection itself) and it seemed that some people did intend to complain to the ICO. So, recently, I made an FOI request to the ICO for any information held by them concerning Blackpool FC’s data protection compliance. This was the reply:

We have carried out thorough searches of the information we hold and have identified one instance where a member of the public raised concerns with the ICO in September 2014, about the alleged processing of personal data by Blackpool FC.

We concluded that there was insufficient evidence to consider the possibility of a s55 offence under the Data Protection Act 1998 (the DPA), and were unable to make an assessment as the individual had not yet raised their concerns with Blackpool FC direct.  We therefore advised the individual to contact the Club and to come back to us if they were still concerned, however we did not hear from them again.  As such, no investigation took place, nor was any assessment made of the issues raised.

This suggests that the ICO wrongly considers itself unable to undertake section 42 assessments under the Data Protection Act 1998 unless the data subject has first complained to the data controller – a stance strongly criticised by Dr David Erdos on this blog, and one which has the potential to put the data subject further in dispute with the data controller (as I can imagine could have happened here, with a family some of whose members are ready to sue to protect their reputation). It also suggests, though, that maybe people weren’t quite as interested as the page views implied. Nonetheless, I am posting this brief update, because a few people asked about it.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, Information Commissioner

Complaint about Google’s Innuendo, redux

Some time ago I complained to the Information Commissioner’s Office (ICO) about the innuendo carried in the message that Google serves with search results on most personal names: “Some results may have been removed under data protection law in Europe”. I had already complained to Google UK, and wrote about it here. Google UK denied any responsibility or liability, and referred me to their enormous, distant parent at 1600 Amphitheatre Parkway. I think they were wrong to do so, in light of the judgment of the Court of Justice of the European Union in the Google Spain case C‑131/12, but I will probably pursue that separately.

However, section 42 of the Data Protection Act 1998 (DPA) allows me to ask the ICO to assess whether it is likely or unlikely that a data controller has complied with its obligations under the DPA. So that’s what I did (pointing out that a search on “Jon Baines” or “Jonathan Baines” threw up the offending message).

In her response the ICO case officer did not address the jurisdiction point which Google had raised, and nor did she actually make a section 42 assessment (in fairness, I had not specifically cited section 42). What she did say was this:

As you know, the Court of Justice of the European Union judgement in May 2014 established that Google was a data controller in respect of the processing of personal data to produce search results. It is not in dispute that some of the search results do relate to you. However, it is also clear that some of them will relate to other individuals with the same name. For example, the first result returned on a search on ‘Jonathan Baines’ is ‘LinkedIn’, which says in the snippet that there are 25 professionals named Jonathan Baines, who use LinkedIn.

It is not beyond the realms of possibility that one or more of the other individuals who share your name have had results about them removed. We cannot comment on this. However, we understand that this message appears in an overwhelming majority of cases when searching on any person’s name. This is likely to be regardless of whether any links have actually been removed.

True, I guess. Which is why I’ve reverted with this clarification of my complaint:

If it assists, and to extend my argument and counter your implied question “which Jon Baines are we talking about?”, if you search < “Jon Baines” Information Rights and Wrongs > (where the search term is actually what lies between the < >) you will get a series of results which undoubtedly relate to me, and from which I can be identified. Google is processing my personal data here (that is an unavoidable conclusion, given the ruling by the Court of Justice of the European Union in “Google Spain” (Case C‑131/12)). The message “Some results may have been removed under data protection law in Europe” appears as a result of the processing of my personal data, because it does not appear on every search (for instance < prime minister porcine rumours > or < “has the ICO issued the cabinet office an enforcement notice yet” >). As a product of the processing of my personal data, I argue that the message relates to me, and constitutes my personal data. As it carries an unfair innuendo (unfair because it implies I might have asked for removal of search results) I would ask that you assess whether Google have or have not likely complied with their obligation under section 4(4) to comply with the first and fourth data protection principles. (Should you doubt the innuendo point, please look at the list of results on a Twitter search for “Some results may have been removed”).

Let’s hope this allows the ICO to make the assessment, without my having to consider whether I need to litigate against one of the biggest companies in world history.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

8 Comments

Filed under Data Protection, Information Commissioner

Big Brother is misleading you

The best books… are those that tell you what you know already…

Big Brother Watch (BBW) is a campaigning organisation, a spin-off from the right-wing lobby group The Taxpayers’ Alliance, described as a “poorly disguised Conservative front”, a large part of whose funds come “from wealthy donors, many of whom are prominent supporters of the Conservative party”. To an extent, that doesn’t matter to me: BBW has done a lot to highlight privacy issues which chime with some of my own concerns – eg excessive use of CCTV, biometrics in schools – but they regularly rail against local authority “data breaches” in a way I think is both unhelpful and disingenuous.

The latest example is a report issued this week (on 11th August 2015) entitled “A Breach of Trust – how local authorities commit 4 data breaches every day”. Martin Hoskins has already done an excellent job in querying and critiquing the findings:

At first glance, it looks impressive. It’s almost 200 pages long. But, and this is a big but, there are only a few pages of analysis – once you get past page 12, a series of annexes contain the responses from each local authority, revealing how minor the vast majority of the reported incidents (occurring between April 2011 and April 2014) actually were.

BBW started work on this report by submitting FOI requests to each local authority in June 2014. Quite why it has taken so long to publish the results, bearing in mind that FOI requests should be returned within 20 days, is beyond me. Although BBW claims to have received a 98% response rate, some 212 authorities either declined to provide information, or claimed that they had experienced no data breaches between 2011 and 2014.

But plenty of media outlets have already uncritically picked the report up and run stories such as the BBC’s “Council data security ‘shockingly lax'” and the Mail’s “Councils losing personal data four times a day”. Local news media also willingly ran stories about their local councils’ data.

However, my main criticism of this BBW report is a fundamental one: their methodology was so flawed that the results are effectively worthless. Helpfully, although at the end of the report, they outline that methodology:

A Freedom of Information request was sent to all local authorities beginning on the 9th June 2014.

We asked for the number of individuals that have been convicted for breaking the Data Protection Act, the number that had had their employment terminated as the result of a DPA breach, the number that were disciplined internally, the number that resigned during proceedings and the number of instances where no action was taken.

The FOI request itself asked for

a list of the offences committed by the individual in question

The flaw is this: individuals within an organisation cannot, in general terms, “break” or “breach” the Data Protection Act 1998 (DPA). An employee is a mere agent of his or her employer, and under the DPA the legal person with the general obligations and liabilities is the “data controller”: an employee of an organisation does not have any real status under the DPA – the employer will be the “person who determines the purposes for which and the manner in which personal data are processed”, that is, the data controller. An individual employee could, in specific terms, “break” or “breach” the DPA but only if they committed an offence under section 55, of unlawfully obtaining etc. personal data without the consent of the data controller. There is a huge amount of confusion, and sloppy thinking, when it comes to what is meant by a data protection “breach”, but the vast majority of the incidents BBW report on are simply incidents in which personal data has been compromised by the council in question as data controller. No determination of whether the DPA was actually contravened will have been made (if only because the function of determining whether the Act has been contravened is one which falls to the Information Commissioner’s Office, or the police, or the courts). And if BBW wanted a list of offences committed, that list would be tiny.

To an extent, therefore, those councils who responded with inaccurate information are to blame. FOI practitioners are taught (when they are well taught) to read a request carefully, and where there is uncertainty or ambiguity, to seek clarification from the requester. In this instance, I did in fact advise one local authority to do so. Regrettably, rather than clarifying their request, BBW chose not to respond, and the council is listed in the report as “no response received”, which is both unfair and untrue.

I am not saying that data security and data protection in councils is not an area of concern. Indeed, I am sure that in some places it is lax. But councils deal with an enormous amount of sensitive personal data, and mistakes and near misses will sometimes happen. Councils are encouraged to (and should be applauded for) keeping registers of such incidents. But they shouldn’t disclose those registers in response to ill-informed and badly worded FOI requests, because the evidence here is that they, and the facts, will be misleadingly represented in order to fit a pre-planned agenda.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

2 Comments

Filed under Data Protection, Freedom of Information

FOI, data protection and rogue landlords 

On 23rd July the Chartered Institute of Environmental Health (CIEH), in conjunction with the Guardian, published a database of landlords who have been convicted of offences under the Housing Act 2004. This showed, for example, that one landlord has been prosecuted seven times for issues relating to disrepair and poor state of properties rented out. It also showed apparent regional discrepancies regarding prosecutions, with some councils carrying out only one prosecution since 2006.

This public interest investigative journalism was, however, not achieved without a fight: in September last year the Information Commissioner’s Office (ICO) issued a decision notice finding that the journalist’s request for this information had been correctly refused by the Ministry of Justice on the grounds that the information was sensitive personal data and disclosure under the Freedom of Information Act 2000 (FOIA) would contravene the MoJ’s obligations under the Data Protection Act 1998 (DPA). Section 40(2) of FOIA provides that information is exempt from disclosure under FOIA if disclosure would contravene any of the data protection principles in Schedule One of the DPA (it also provides that it would be exempt if disclosure would contravene section 10 of the DPA, but this is rarely invoked). The key data protection principle is the first, which says that personal data must be processed fairly and lawfully, and in particular that the processing must meet one of the conditions in Schedule Two, and also – for sensitive personal data – one of the conditions in Schedule Three.

The ICO, in its decision notice, after correctly determining that information about identifiable individuals (as opposed to companies) within the scope of the request was sensitive personal data (because it was about offences committed by those individuals) did not accept the requester’s submission that a Schedule Three condition existed which permitted disclosure. The only ones which could potentially apply – condition 1 (explicit consent) or condition 5 (information already made public by the individual) – were not engaged.

However, the ICO did not at the time consider the secondary legislation made under condition 10: the Data Protection (Processing of Sensitive Personal Data) Order 2000 provides further bases for processing of sensitive personal data, and, as the First-tier Tribunal (Information Rights) (FTT) accepted upon appeal by the applicant, part 3 of the Schedule to that Order permits processing where the processing is “in the substantial public interest”, is in connection with “the commission by any person of any unlawful act” and is for journalistic purposes and is done with a “view to the publication of those data by any person and the data controller reasonably believes that such publication would be in the public interest”. In fairness to the ICO, this further condition was identified by them in their response to the appeal.

In this case, the information was clearly sought with a view to future publication in the CIEH’s magazine, “Environmental Health News”, and the requester was the magazine’s digital editor. This, the FTT decided, taken with the (objective) substantial public interest in the publication of the information, was sufficient to make disclosure under FOIA fair and lawful. In a passage (paras 28-30) worth quoting in full the FTT said:

Unfit housing is a matter of major public concern and has a significant impact on the health of tenants. The Housing Act is a key mechanism for local authorities to improve housing standards and protect the health of vulnerable tenants. One mechanism for doing this is by means of prosecution; another is licensing schemes for landlords. Local authorities place vulnerable families in accommodation outside their areas; tenants seek accommodation. The publication of information about convictions under the Housing Act would be of considerable value to local authorities in discharge of their functions and assist prospective tenants and those assisting them in avoiding landlords with a history of breaches of the Housing Act.

The sanctions under the Housing Act are comparatively small and the  opprobrium of a conviction may well not rank with other forms of criminal misbehaviour, however the potential for harm to others from such activity is very great, the potential for financial benefit from the misbehaviour is also substantial.  Breaches of the Housing Act are economically motivated and what is proposed is a method of advancing the policy objective of the Housing Act by increasing the availability of relevant information to key actors in the rented housing market – the local authorities as regulator and purchaser and the tenants themselves.  Any impact on the data subjects will overwhelmingly be on their commercial reputations rather than more personal matters.

The Tribunal is therefore satisfied that not only is the disclosure of this information in the substantial public interest, but also any reasonably informed data controller with  knowledge of the social needs and the impact of such disclosure would so conclude.

It is relatively rare that sensitive personal data will be disclosed, or ordered to be disclosed, under FOIA, but it is well worth remembering the 2000 Order, particularly when it comes to publication or proposed publication of such data under public interest journalism.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Leave a comment

Filed under Data Protection, Freedom of Information, Information Commissioner, Information Tribunal, journalism, Open Justice

Dear Google…Dear ICO…

On 15 June this year I complained to Google UK. I have had no response, so I have now asked the Information Commissioner’s Office to assess the lawfulness of Google’s actions. This is my email to the ICO

Hi

I would like to complain about Google UK. On 15 June 2015 I wrote to them at their registered address in the following terms

Complaint under Data Protection Act 1998

When a search is made on Google for my name “Jonathan Baines”, and, alternatively, “Jon Baines”, a series of results are returned, but at the foot of the page a message (“the message”) is displayed:

Some results may have been removed under data protection law in Europe. Learn more

To the best of my knowledge, no results have in fact been removed.

The first principle in Schedule One of the Data Protection Act 1998 (DPA) requires a data controller to process personal data fairly and lawfully. In the circumstances I describe, “Jonathan Baines”, “Jon Baines” and the message constitute my personal data, of which you are clearly data controller.

It is unfair to suggest that some results may have been removed under data protection law. This is because the message carries an innuendo that what may have been removed was content that was embarrassing, or that I did not wish to be returned by a Google search. This is not the case. I do not consider that the hyperlink “Learn more” nullifies the innuendo: for instance, a search on Twitter for the phrase “some results may have been removed” provides multiple examples of people assuming the message carries an innuendo meaning.

Accordingly, please remove the message from any page containing the results of a search on my name Jonathan Baines, or Jon Baines, and please confirm to me that you have done so. You are welcome to email me to this effect at [redacted]”

I have had no response to this letter, and furthermore I have twice contacted Google UK’s Twitter account “@googleuk” to ask about a response, but have had none.

I am now asking, pursuant to my right to do so at section 42 of the Data Protection Act 1998, for you to conduct an assessment as to whether it is likely or unlikely that the processing by Google UK has been or is being carried out in compliance with the provisions of that Act.

I note that in Case C‑131/12 the Grand Chamber of the Court of Justice of the European Union held that “when the operator of a search engine sets up in a Member State a branch or subsidiary which is intended to promote and sell advertising space offered by that engine and which orientates its activity towards the inhabitants of that Member State” then “the processing is carried out in the context of the activities of an establishment of the controller on the territory of the Member State”. I also note that Google UK’s notification to your offices under section 18 of the Data Protection Act 1998 says “We process personal information to enable us to promote our goods and services”. On this basis alone I would submit that Google UK is carrying out processing as a data controller in the UK jurisdiction.

I hope I have provided sufficient information for you to begin to assess Google UK’s compliance with its obligations under the Data Protection Act 1998, but please contact me if you require any further information.

with best wishes,

Jon Baines

Leave a comment

Filed under Data Protection, Information Commissioner

ICO: Samaritans Radar failed to comply with Data Protection Act

I’ve written so much previously on this blog about Samaritans Radar, the misguided Twitter app launched last year by Samaritans, that I’d thought I wouldn’t have to do so again. However, this is just a brief update on the outcome of the investigation by the Information Commissioner’s Office (ICO) into whether Samaritans were obliged to comply with data protection law when running the app, and, if so, the extent to which they did comply.

To recap, the app monitored the timelines of those the user followed on Twitter and, if certain trigger words or phrases were tweeted, would send an email alert to the user. This was intended to be a “safety net” so that potential suicidal cries for help were not missed. But what was missed by Samaritans was the fact that those whose tweets were being monitored in this way would have no knowledge of it, and that this could lead to a feeling of uncertainty and unease in some of the very target groups they sought to protect. People with mental health issues raised concerns that the app could actually drive people off Twitter, where there were helpful and supportive networks of users, often tweeting the phrases and words the app was designed to pick up.
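For readers unfamiliar with how such an app operates, the mechanism can be sketched in a few lines. This is a purely illustrative sketch, not the actual Radar code: the trigger phrases, function names and data shapes are all hypothetical (the real app used a purpose-built algorithm developed with academic linguists). It also illustrates why the question of personal data processing arises: individual tweets are singled out and trigger an automated alert about an identifiable person.

```python
# Illustrative sketch of a Radar-style monitor. Trigger phrases and
# names here are invented for demonstration only.

TRIGGER_PHRASES = ["want to end it", "can't go on", "no way out"]

def check_tweet(text: str) -> bool:
    """Return True if the tweet text contains any trigger phrase."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in TRIGGER_PHRASES)

def scan_timeline(tweets, send_alert):
    """Call send_alert for each tweet that matches a trigger phrase."""
    for tweet in tweets:
        if check_tweet(tweet["text"]):
            send_alert(tweet)

# Demonstration: collect "alerts" in a list instead of sending email.
alerts = []
scan_timeline(
    [{"text": "Lovely day out"},
     {"text": "I feel like there's no way out"}],
    alerts.append,
)
print(len(alerts))  # 1 – only the second tweet matched
```

Even this toy version shows the structural problem the post describes: the person whose tweet is flagged plays no part in, and receives no notice of, the processing, and naive phrase matching will inevitably flag tweets from people who are not in distress at all.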

Furthermore, questions were raised, by me and many others, about the legality of the app under data protection law. So I made a request to the ICO under the Freedom of Information Act for

any information – such as an assessment of legality, correspondence etc. – which you hold about the “Samaritans Radar” app which Samaritans recently launched, then withdrew in light of serious legal and ethical concerns being raised

After an initial refusal because their investigation was ongoing, the ICO have now disclosed a considerable amount of information. Within it is the substantive assessment I sought, in the form of a letter from the Group Manager for Government and Society to Samaritans. I think it is important to post it in full, and I do so below. I don’t have much to add, other than that it vindicates the legal position put forward at the time by me and others (notably Susan Hall and Tim Turner).

19 December 2014

Samaritans Radar app

Many thanks for coming to our office and explaining the background to the development of the Radar application and describing how it worked.  We have now had an opportunity to consider the points made at the  meeting, as well as study the information provided in earlier  teleconferences and on the Samaritans’ website. I am writing to let you know our conclusions on how the Data Protection Act applies to the Radar  application.

We recognise that the Radar app was developed with the best of intentions and was withdrawn shortly after its launch but, as you know, during its operation we received a number of queries and concerns about the application. We have been asked for our view on whether personal data was processed in compliance with data protection principles and whether the Samaritans are data controllers. You continue to believe that you are not data controllers or that no personal data has been processed, so I am writing to explain in detail our conclusions on these points.

Personal data

Personal data is data that relates to an identifiable living individual. It is  our well-established position that data which identifies an individual, even without a name associated with it, may be personal data where it is processed to learn or record something about that individual, or where the processing of that information has an impact upon that individual. According to the information you have provided, the Radar app was a web-based application that used a specially designed algorithm that searched for specific keywords within the Twitter feeds of subscribers to the Radar app. When words indicating distress were detected within a Tweet, an email alert was automatically sent from the Samaritans to the subscriber saying Radar had detected someone they followed who may be going through a tough time and provided a link to that individual’s Tweet. The email asked the subscriber whether they were worried about the Tweet and if yes, they were re-directed to the Samaritans’ website for guidance on the best way of providing support to a follower who may be distressed. According to your FAQs, you also stored Twitter User IDs, Twitter User friends’ IDs, all tagged Tweets including the raw data associated with it and a count of flags against an individual Twitter user’s friends’ ID. These unique identifiers are personal data, in that they can easily be linked back to identifiable individuals.

Based on our understanding of how the application worked, we have reached the conclusion that the Radar service did involve processing of personal data. It used an algorithm to search for words that triggered an automated decision about an individual, at which point it sent an email alert to a Radar subscriber. It singled out an individual’s data with the purpose of differentiating them and treating them differently. In addition, you also stored information about all the Tweets that were tagged.

Data controller

We are aware of your view that you “are neither the data controller nor data processor of the information passing through the app”.

The concept of a data controller is defined in section 1 of the Data Protection Act 1998 (the DPA) as

“a person who (either alone or jointly or in common with other persons) determines the purposes for which and the manner in which any personal data are, or are to be, processed”

We have concluded that the Radar service has involved processing of personal data. We understand that you used the agency [redacted] to develop and host the application. We are not fully aware of the role of [redacted] but given your central role in setting up and promoting the Radar application, we consider that the Samaritans have determined the manner and purpose of the processing of this personal data and as such you are data controllers. If you wish to be reminded of the approach we take in this area you may find it helpful to consult our guidance on data controllers and data processors. Here’s the link: https://ico.org.uk/media/about-the-ico/documents/1042555/data-controllers-and-data-processors-dp-guidance.pdf

Sensitive personal data

We also discussed whether you had processed sensitive personal data. You explained that the charity did deal with people seeking help for many different reasons and the service was not aimed at people with possible mental health issues. However the mission of the Samaritans is to alleviate emotional distress and reduce the incidence of suicidal feelings and suicidal behaviours. In addition, the stated aims of the Radar project, the research behind it and the information provided in the FAQs all emphasise the aim of helping vulnerable people online and using the app to detect someone who is suicidal. For example, you say “research has shown there is a strong correlation between “suicidal tweets” and actual suicides and with Samaritans Radar we can turn a social net into a safety net”. Given the aims of the project, it is highly likely that some of the tweets identified to subscribers included information about an individual’s mental health or other medical information and therefore would have been sensitive personal data.

At our meetings you said that even if you were processing sensitive personal data then Schedule 3 condition 5 (“The information contained in the personal data has been made public as a result of steps deliberately taken by the data subject”) was sufficient to legitimise this processing. Our guidance in our Personal Information Online Code of Practice makes it clear that although people post personal information in a way that becomes publicly visible, organisations still have an overarching duty to handle it fairly and to comply with the rules of data protection. The Samaritans are well respected in this field and receiving an email from your organisation carries a lot of weight. Linking an individual’s tweet to an email alert from the Samaritans is unlikely to be perceived in the same light as the information received in the original Tweet — not least because of the risk that people’s tweets were flagged when they were not in any distress at all.

Fair processing

Any processing of personal data must be fair and organisations must consider the effect of the processing on the individuals concerned and whether the processing would be within their reasonable expectations. You indicated that although you had undertaken some elements of an impact assessment, you had not carried out a full privacy impact assessment. You appear to have reached the conclusion that since the Tweets were publicly visible, you did not need to fully consider the privacy risks. For example, on your website you say that “all the data is public, so user privacy is not an issue. Samaritans Radar analyses the Tweets of people you follow, which are public Tweets. It does not look at private Tweets.”

It is our view that if organisations collect information from the internet and use it in a way that’s unfair, they could still breach the data protection principles even though the information was obtained from a publicly available source. It is particularly important that organisations should consider the data protection implications if they are planning to use analytics to make automated decisions that could have a direct effect on individuals. Under section 12 of the Data Protection Act, individuals have certain rights to prevent decisions being taken about them that are solely based on automated processing of their personal data. The quality of the data being used as a basis for these decisions may also be an issue.

We note that the application was a year in development and that you used leading academics in linguistics to develop your word search algorithm. You also tested the application on a large number of people, although, as we discussed, most if not all of these were connected to the project in some way and many were enthusiastic to see the project succeed. As our recent paper on Big Data explains, it is not so much a question of whether the data accurately records what someone says but rather to what extent that information provides a reliable basis for drawing conclusions. Commentators expressed concern at the apparent high level of false positives involving the Radar App (figures in the media suggest only 4% of email alerts were genuine). This raises questions about whether a system operating with such a low success rate could represent fair processing and indicates that many Tweets were being flagged up unnecessarily.

Since you did not consider yourselves to be data controllers, you have not sought the consent of, or provided fair processing notices to, the individuals whose Tweets you flagged to subscribers. It seems unlikely that it would be within people’s reasonable expectations that certain words and phrases from their Tweets would trigger an automatic email alert from the Samaritans saying Radar had detected someone who may be going through a tough time. Our Personal Information Online Code of Practice says it is good practice to only use publicly available information in a way that is unlikely to cause embarrassment, distress or anxiety to the individual concerned. Organisations should only use their information in a way they are likely to expect and to be comfortable with. Our advice is that if in doubt about this, and you are unable to ask permission, you should not collect their information in the first place.

Conclusion

Based on our observations above, we have reached the conclusion that the Radar application did risk causing distress to individuals and was unlikely to be compliant with the Data Protection Act.

We acknowledge that the Samaritans did take responsibility for dealing with the many concerns raised about the application very quickly. The application was suspended on 7 November and we welcomed [redacted] assurances on 14 November that not only was the application suspended but it would not be coming back in anything like its previous form. We also understand that there have been no complaints that indicate that anyone had suffered damage and distress in the very short period the application was in operation.

We do not want to discourage innovation but it is important that organisations should consider privacy throughout the development and implementation of new projects. Failing to do so risks undermining people’s trust in an organisation. We strongly recommend that if you are considering further projects involving the use of online information and technologies you should carry out and publish a privacy impact assessment. This will help you to build trust and engage with the wider public. Guidance on this can be found in our PIA Code of Practice. We also recommend that you look at our paper on Big Data and Data Protection and our Personal Information Online Code of Practice. Building trust and adopting an ethical approach to such projects can also help to ensure you handle people’s information in a fair and transparent way. We would be very happy to advise the Samaritans on data protection compliance in relation to any future projects.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Information Commissioner, social media

Information Tribunal increases monetary penalty for company which made spam calls

The trouble with asking for a second opinion is it might be worse than the first one. Reactiv Media get an increased penalty after appealing to the tribunal.

In 2013 the First-tier Tribunal (Information Rights) (“FTT”) heard the first appeal against a monetary penalty notice (“MPN”) imposed by the Information Commissioner’s Office (“ICO”). One of the first things in the appeal (brought by the Central London Community Healthcare NHS Trust) to be considered was the extent of the FTT’s jurisdiction when hearing such appeals – was it, as the ICO suggested, limited effectively to allowing challenges only on public law principles (e.g. that the original decision was irrational, or failed to take relevant factors into account, or took irrelevant factors into account), or was it entitled to approach the hearing de novo, with the power to determine that the ICO’s discretion to serve an MPN had been exercised wrongly, on the facts? The FTT held that the latter approach (similar to the FTT’s jurisdiction in appeals brought under the Freedom of Information Act 2000 (FOIA)) was the correct one, and, notably, it added the observation (at para. 39) that it was open to the FTT also to increase, as well as decrease, the amount of penalty imposed.

So, although an appeal to the FTT is generally a low-risk low-cost way of having the ICO’s decision reviewed, it does, in the context of MPNs served either under the Data Protection Act 1998 (DPA) or the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), potentially carry the risk of an increased penalty. And this is precisely what happened when a direct marketing company called Reactiv Media recently appealed an ICO MPN. Reactiv Media had been held to have made a large number of unsolicited telephone calls to people who had subscribed to the Telephone Preference Service (“TPS”) – the calls were thus in contravention of Reactiv Media’s obligations under regulation 21 of PECR. The ICO determined that this constituted a serious contravention of those obligations, and as at least some of those calls were of a kind likely to cause (or indeed had caused) substantial damage or substantial distress, an MPN of £50,000 was served, under the mechanisms of section 55A of the DPA, as adopted by PECR.

Upon appeal to the FTT, Reactiv Media argued that some of the infringing calls had not been made by them, and disputed that any of them had caused substantial damage or distress. However, the FTT, noting the ICO’s submission that not only had the MPN been properly served, but also that it was lenient for a company with a turnover of £5.8m (a figure higher than the one the ICO had initially been given to understand), held that not only was the MPN “fully justified” – the company had “carried on its business in conscious disregard of its obligations” – but also that the amount should be increased by 50%, to £75,000. One presumes, also, that the company will not be given a further opportunity (as they were in the first instance) to take advantage of an early payment reduction.

One is tempted to assume that Reactiv Media thought that an appeal to the FTT was a cheap way of having a second opinion about the original MPN. I don’t know if this is true, but if it is, it is a lesson to other data controllers and marketers that, after an appeal, they might find themselves worse off.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Information Commissioner, Information Tribunal, marketing, monetary penalty notice, nuisance calls, PECR

Vidal-Hall v Google, and the rise of data protection ambulance-chasing

Everyone knows the concept of ambulance chasers – personal injury lawyers who seek out victims of accidents or negligence to help/persuade the latter to make compensation claims. With today’s judgment in the Court of Appeal in the case of Vidal-Hall & Ors v Google [2015] EWCA Civ 311 one wonders if we will start to see data protection ambulance chasers, arriving at the scene of serious “data breaches” with their business cards.

This is because the Court has made a definitive ruling on the issue, discussed several times previously on this blog, of whether compensation can be claimed under the Data Protection Act 1998 (DPA) in circumstances where a data subject has suffered distress but no tangible, pecuniary damage. Section 13 of the DPA provides that

(1)An individual who suffers damage by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that damage.

(2)An individual who suffers distress by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that distress if—

(a)the individual also suffers damage by reason of the contravention

This differs from the wording of the European Data Protection Directive 95/46/ec, which, at Article 23(1) says

Member States shall provide that any person who has suffered damage as a result of an unlawful processing operation or of any act incompatible with the national provisions adopted pursuant to this Directive is entitled to receive compensation from the controller for the damage suffered

It can be seen that in the domestic statutory scheme “distress” is distinct from “damage”, but in the Directive there is just a single category of “damage”. The position until relatively recently, following Johnson v Medical Defence Union [2007] EWCA Civ 262, had been that “damage” meant pecuniary damage, and this in turn meant, as Buxton LJ said in that case, that “section 13 distress damages are only available if damage in the sense of pecuniary loss has been suffered”. So, absent pecuniary damage, no compensation for distress was available (except in certain specific circumstances involving processing of personal data for journalistic, literary or artistic purposes). But this, said Lord Dyson and Lady Justice Sharp in a joint judgment, was wrong, and in any case they were not bound by Johnson because the relevant remarks in that case were obiter. Indeed, they said, section 13(2) DPA was incompatible with Article 23 of the Directive:

What is required in order to make section 13(2) compatible with EU law is the disapplication of section 13(2), no more and no less. The consequence of this would be that compensation would be recoverable under section 13(1) for any damage suffered as a result of a contravention by a data controller of any of the requirements of the DPA

As Christopher Knight says, in a characteristically fine and exuberant piece on the Panopticon blog, “And thus, section 13(2) was no more”.

And this means a few things. It certainly means that it will be much easier for an aggrieved data subject to bring a claim for compensation against a data controller which has contravened its obligations under the DPA in circumstances where there is little, or no, tangible or pecuniary damage, but only distress. It also means that we may well start to see the rise of data protection ambulance chasers – the DPA may not give rise to massive settlements, but it is a relatively easy claim to make – a contravention is often effectively a matter of fact, or is found to be such by the Information Commissioner, or is conceded/admitted by the data controller – and there is the prospect of group litigation (in 2013 Islington Council settled claims brought jointly by fourteen claimants following disclosure of their personal data to unauthorised third parties – the settlement totalled £43,000).

I mentioned in that last paragraph that data controllers sometimes concede or admit to contraventions of their obligations under the DPA. Indeed, they are expected to by the Information Commissioner, and the draft European General Data Protection Regulation proposes to make it mandatory to do so, and to inform data subjects. And this is where I wonder if we might see another effect of the Vidal-Hall case – if data controllers know that by owning up to contraventions they may be exposing themselves to multiple legal claims for distress compensation, they (or their shareholders, or insurers) may start to question why they should do this. Breach notification may be seen as even more of a risky exercise than it is now.

There are other interesting aspects to the Vidal-Hall case – misuse of private information is, indeed, a tort, allowing service of the claims against Google outside jurisdiction, and there are profound issues regarding the definition of personal data which are undecided and, if they go to trial, will be extremely important – but the disapplying of section 13(2) DPA looks likely to have profound effects for data controllers, for data subjects, for lawyers and for the landscape of data protection litigation in this country.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Breach Notification, damages, Data Protection, Directive 95/46/EC, GDPR, Information Commissioner

A data protection justice gap?

On the 4th March the Supreme Court handed down judgment in the conjoined cases of Catt and T v Commissioner of Police of the Metropolis ([2015] UKSC 9). Almost unanimously (there was one dissenting opinion in Catt) the appeals by the Met were allowed. In brief, the judgments held that the retention of historical criminal conviction data was proportionate. But what I thought was particularly interesting was the suggestion (at paragraph 45) by Lord Sumption (described to me recently as “by far the cleverest man in England”) that T‘s claim at least had been unnecessary:

[this] was a straightforward dispute about retention which could have been more appropriately resolved by applying to the Information Commissioner. As it is, the parties have gone through three levels of judicial decision, at a cost out of all proportion to the questions at stake

and as this blog post suggests, there was certainly a hint that costs might flow in future towards those who choose to litigate rather than apply to the Information Commissioner’s Office (ICO).

But I think there’s a potential justice gap here. Last year the ICO consulted on changing how it handled concerns from data subjects about handling of their personal data. During the consultation period Dr David Erdos wrote a guest post for this blog, arguing that

The ICO’s suggested approach is hugely problematic from a rule of law point of view. Section 42 of the Data Protection Act [DPA] is crystal clear that “any person who is, or believes himself to be, directly affected by any processing of personal data” may make a request for assessment to the ICO “as to whether it is likely or unlikely that the processing has been or is being carried out in compliance with the provisions” of the Act. On receiving such a request the Commissioner “shall make an assessment” (s. 42 (1)) (emphasis added). This duty is an absolute one

but the ICO’s response to the consultation suggested that

We are…planning to make much greater use of the discretion afforded to us under section 42 of the legislation…so long as a data controller has provided an individual with a clear explanation of their processing of personal information, they are unlikely to need to describe their actions again to us if the matter in question does not appear to us to represent a serious issue or we don’t believe there is an opportunity for the data controller to improve their information rights practice

which is problematic, as section 42 confers a discretion on the ICO only as to the manner in which an assessment shall be made. Section 42(3) describes some matters to which he may have regard in determining the manner, and these include (so are not exhaustive) “the extent to which the request appears to him to raise a matter of substance”. I don’t think “a matter of substance” gets close to being the same as “a serious issue”: a matter can surely be non-serious yet still of substance. So if the discretion afforded to the ICO under section 42 as to the manner of the assessment includes a discretion to rely solely on prior correspondence between the data controller and the data subject, this is not specified in (and can only be inferred from) section 42.

Moreover, and interestingly, Article 28(4) of the European Data Protection Directive, which is transposed in section 42 DPA, confers no such discretion as to the manner of assessment, and this may well have been one of the reasons the European Commission began protracted infraction proceedings against the UK (see Chris Pounder blog posts passim).

Nonetheless, the outcome of the ICO consultation was indeed a new procedure for dealing with data subjects’ concerns. Their website now says

Should I raise my concern with the ICO?

If the organisation has been unable, or unwilling, to resolve your information rights concern, you can raise the matter with us.  We will use the information you have provided, including the organisation’s response to your concerns, to decide if your concern provides an opportunity to improve information rights practice.

If we think it does provide that opportunity, we will take appropriate action

“Improving information rights practice” refers to the ICO’s general duties under section 51 DPA, but what is notable by its absence there is any statement of the ICO’s general duty, under section 42, to make an assessment as to whether it is likely or unlikely that the processing has been or is being carried out in compliance with the provisions of the DPA.

Lord Sumption in Catt (at 34) also said that “Mr Catt could have complained about the retention of his personal data to the Information Commissioner”. This is true, but would the ICO have actually done anything? Would it have represented a “serious issue”? Possibly not – Lord Sumption describes the background to Mrs T’s complaints as a “minor incident” and the retention of her data as a “straightforward dispute”. But if there are hints from the highest court of the land that bringing judicial review proceedings on data protection matters might result in adverse costs, because a complaint to the ICO is available, and if the ICO nonetheless shows reluctance to consider complaints and concerns from aggrieved data subjects, is there an issue with access to data protection justice? Is there a privacy justice gap?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Directive 95/46/EC, Information Commissioner

Parties, party leaders and data protection registration

UPDATE 24.03.15 The ICO has confirmed to me that none of George Galloway, the Respect Party and Nigel Farage has an entry on the statutory register of data controllers (section 19 of the Data Protection Act 1998 refers). Might they, therefore, be committing a criminal offence? Natalie Bennett, not being an elected representative, does not necessarily need to register. END UPDATE

George Galloway, the Respect Party, Nigel Farage and Natalie Bennett all appear not to have an entry in the ICO’s online register of data controllers. Failure to have an entry in the actual register constitutes a criminal offence if no exemption can be claimed.

I’ve written before on the subject of politicians and notification under the Data Protection Act 1998 (DPA). To recap:

Section 17 of the DPA states in broad terms that a data controller (a person who solely or jointly “determines the purposes for which and the manner in which any personal data are, or are to be, processed”) must not process personal data unless “an entry in respect of the data controller is included in the register maintained by the [Information] Commissioner” (IC) or unless a relevant exemption to registration applies. Accordingly (under section 18) a relevant data controller must make a notification to the IC stating (again in broad terms) what data it is processing and for what purposes, and must pay a fee of either £35 or £500 (depending on the size of the organisation which is the controller). Section 19 describes the register itself and also provides that registration lasts for twelve months, after which a renewed notification must be made, with payment of a further fee.

Section 21 creates an offence the elements of which will be made out if a data controller who cannot claim an exemption processes personal data without an entry being made in the register. Thus, if a data controller processes personal data and has not notified the IC either initially or at the point of renewal, that controller will be likely to have committed a criminal offence (there is a defence if the controller can show that he exercised all due diligence to comply with the duty).

Political parties and members of parliaments process personal data (for instance, that of their constituents) in the role of data controller, and cannot avail themselves of an exemption. Thus, they have an obligation to register, and thus it is, for example, that the Prime Minister has this entry in the register

[screenshot of register entry]

and so it is that Stuart Agnew, UKIP Member of the European Parliament, has this entry

[screenshot of register entry]

and so it is that the Liberal Democrats have this entry

[screenshot of register entry]

(all the entries have more information in them than those screenshots show).

But, as I have written before, not all politicians appear to comply with these legal obligations under the DPA. And this morning I noticed lawyer Adam Rose tweeting about the fact that neither George Galloway MP nor his Respect Party appeared to have an entry on the IC register. This certainly seems to be the case, and I took the opportunity to ask Mr Galloway whether it was correct (no response as yet). It is also worth noting that back in 2012 the IC stated that

it appears that the Respect Party has not notified under the DPA at any time since its formation in November 2004….[this has] been brought to the attention of our Non-Notification Team within our Enforcement Department. They will therefore consider what further action is appropriate in the circumstances

It must be borne in mind, however, that non-appearance on the online searchable register is not proof of non-appearance on the actual register. The IC says

It is updated daily. However, due to peaks of work it may be some time before new notifications, renewals and amendments appear in the public register. Please note data controllers are deemed notified from the date we receive a valid form and fee. Therefore the fact that an entry does not appear on the public register does not mean that the data controller is committing a criminal offence

Nonetheless, the online register is there for a purpose – it enables data subjects to get reassurance that those who process their personal data do so lawfully. Non-appearance on the online register is therefore at least cause for concern, and grounds for seeking clarification from the IC and/or the data controller.

And it is not just Mr Galloway and the Respect Party who don’t appear on the online register. I checked for registrations for some of the other main party leaders: David Cameron, Ed(ward) Miliband and Nick Clegg all have registrations, as do Nicola Sturgeon and Peter Robinson, but Nigel Farage, Leader of UKIP, and Natalie Bennett, Leader of the Green Party, appear not to.

At all times, but especially in the run-up to the general election, voters and constituents have a right to have their personal information handled lawfully, and a right to reassurance from politicians that it will be. For this reason, it would be good to have clarification from Mr Galloway, the Respect Party, Mr Farage and Ms Bennett, as to why they have no entry showing in the IC’s online register. And if they do not have an entry in the register itself, it would be good to have clarification from the IC as to what action might be taken.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Information Commissioner