Category Archives: Data Protection

FOI, data protection and rogue landlords 

On 23rd July the Chartered Institute of Environmental Health (CIEH), in conjunction with the Guardian, published a database of landlords who have been convicted of offences under the Housing Act 2004. This showed, for example, that one landlord has been prosecuted seven times for issues relating to disrepair and the poor state of rented properties. It also showed apparent regional discrepancies in prosecutions, with some councils having brought only one prosecution since 2006.

This public interest investigative journalism was not, however, achieved without a fight: in September last year the Information Commissioner’s Office (ICO) issued a decision notice finding that the journalist’s request for this information had been correctly refused by the Ministry of Justice (MoJ), on the grounds that the information was sensitive personal data and that disclosure under the Freedom of Information Act 2000 (FOIA) would contravene the MoJ’s obligations under the Data Protection Act 1998 (DPA). Section 40(2) of FOIA provides that information is exempt from disclosure if disclosure would contravene any of the data protection principles in Schedule One of the DPA (it also provides an exemption where disclosure would contravene section 10 of the DPA, but this is rarely invoked). The key data protection principle is the first, which says that personal data must be processed fairly and lawfully and, in particular, that the processing must meet one of the conditions in Schedule Two and – for sensitive personal data – also one of the conditions in Schedule Three.

The ICO, in its decision notice, after correctly determining that information about identifiable individuals (as opposed to companies) within the scope of the request was sensitive personal data (because it related to offences committed by those individuals), did not accept the requester’s submission that a Schedule Three condition existed which permitted disclosure. The only conditions which could potentially apply – condition 1 (explicit consent) and condition 5 (information already made public by the individual) – were not engaged.

However, the ICO did not at the time consider the secondary legislation made under condition 10: the Data Protection (Processing of Sensitive Personal Data) Order 2000 provides further bases for processing sensitive personal data. As the First-tier Tribunal (Information Rights) (FTT) accepted upon appeal by the applicant, paragraph 3 of the Schedule to that Order permits processing where the processing is “in the substantial public interest”, is in connection with “the commission by any person of any unlawful act”, is for journalistic purposes and is done with a “view to the publication of those data by any person and the data controller reasonably believes that such publication would be in the public interest”. In fairness to the ICO, it identified this further condition in its response to the appeal.

In this case, the information was clearly sought with a view to future publication in the CIEH’s magazine, Environmental Health News, and the requester was that magazine’s digital editor. This, the FTT decided, taken with the (objective) substantial public interest in the publication of the information, was sufficient to make disclosure under FOIA fair and lawful. In a passage (paras 28-30) worth quoting in full, the FTT said:

Unfit housing is a matter of major public concern and has a significant impact on the health of tenants. The Housing Act is a key mechanism for local authorities to improve housing standards and protect the health of vulnerable tenants. One mechanism for doing this is by means of prosecution; another is licensing schemes for landlords. Local authorities place vulnerable families in accommodation outside their areas; tenants seek accommodation. The publication of information about convictions under the Housing Act would be of considerable value to local authorities in discharge of their functions and assist prospective tenants and those assisting them in avoiding landlords with a history of breaches of the Housing Act.

The sanctions under the Housing Act are comparatively small and the opprobrium of a conviction may well not rank with other forms of criminal misbehaviour; however, the potential for harm to others from such activity is very great, and the potential for financial benefit from the misbehaviour is also substantial. Breaches of the Housing Act are economically motivated and what is proposed is a method of advancing the policy objective of the Housing Act by increasing the availability of relevant information to key actors in the rented housing market – the local authorities as regulator and purchaser and the tenants themselves. Any impact on the data subjects will overwhelmingly be on their commercial reputations rather than more personal matters.

The Tribunal is therefore satisfied that not only is the disclosure of this information in the substantial public interest, but also any reasonably informed data controller with knowledge of the social needs and the impact of such disclosure would so conclude.

It is relatively rare for sensitive personal data to be disclosed, or ordered to be disclosed, under FOIA, but the 2000 Order is well worth remembering, particularly where publication, or proposed publication, of such data is for the purposes of public interest journalism.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Dear Google…Dear ICO…

On 15 June this year I complained to Google UK. I have had no response, so I have now asked the Information Commissioner’s Office to assess the lawfulness of Google’s actions. This is my email to the ICO:

Hi

I would like to complain about Google UK. On 15 June 2015 I wrote to them at their registered address in the following terms

Complaint under Data Protection Act 1998

When a search is made on Google for my name “Jonathan Baines”, and, alternatively, “Jon Baines”, a series of results are returned, but at the foot of the page a message (“the message”) is displayed:

Some results may have been removed under data protection law in Europe. Learn more

To the best of my knowledge, no results have in fact been removed.

The first principle in Schedule One of the Data Protection Act 1998 (DPA) requires a data controller to process personal data fairly and lawfully. In the circumstances I describe, “Jonathan Baines”, “Jon Baines” and the message constitute my personal data, of which you are clearly data controller.

It is unfair to suggest that some results may have been removed under data protection law. This is because the message carries an innuendo that what may have been removed was content that was embarrassing, or that I did not wish to be returned by a Google search. This is not the case. I do not consider that the hyperlink “Learn more” nullifies the innuendo: for instance, a search on Twitter for the phrase “some results may have been removed” provides multiple examples of people assuming the message carries an innuendo meaning.

Accordingly, please remove the message from any page containing the results of a search on my name Jonathan Baines, or Jon Baines, and please confirm to me that you have done so. You are welcome to email me to this effect at [redacted]”

I have had no response to this letter, and furthermore I have twice contacted Google UK’s twitter account “@googleuk” to ask about a response, but have had none.

I am now asking, pursuant to my right to do so at section 42 of the Data Protection Act 1998, for you to conduct an assessment as to whether it is likely or unlikely that the processing by Google UK has been or is being carried out in compliance with the provisions of that Act.

I note that in Case C‑131/12 the Grand Chamber of the Court of Justice of the European Union held that “when the operator of a search engine sets up in a Member State a branch or subsidiary which is intended to promote and sell advertising space offered by that engine and which orientates its activity towards the inhabitants of that Member State” then “the processing is carried out in the context of the activities of an establishment of the controller on the territory of the Member State”. I also note that Google UK’s notification to your offices under section 18 of the Data Protection Act 1998 says “We process personal information to enable us to promote our goods and services”. On this basis alone I would submit that Google UK is carrying out processing as a data controller in the UK jurisdiction.

I hope I have provided sufficient information for you to begin to assess Google UK’s compliance with its obligations under the Data Protection Act 1998, but please contact me if you require any further information.

with best wishes,

Jon Baines


What does it take to stop Lib Dems spamming?

Lib Dems continue to breach ePrivacy law; the ICO still won’t take enforcement action.

It’s not difficult: the sending of unsolicited marketing emails to me is unlawful. Regulation 22 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR) and, by extension, the first and second principles in Schedule One of the Data Protection Act 1998 (DPA) make it so. The Liberal Democrats have engaged in this unlawful practice – they know it and the Information Commissioner’s Office (ICO) knows it, because the latter recently told the former as much, and told me in turn:

I have reviewed your correspondence and the [Lib Dem’s] website, and it appears that their current practices would fail to comply with the requirements of the PECR. This is because consent is not knowingly given, clear and specific….As such, we have written to the organisation to remind them of their obligations under the PECR and ensure that valid consent is obtained from individuals

But the ICO has chosen not to take enforcement action, saying to me in an email of 24th April

enforcement action is not taken routinely and it is our decision whether to take it. We cannot take enforcement action in every case that is reported to us

Of course I’d never suggested they take action in every case – I’d requested (as is my right under regulation 32 of PECR) that they take action in this particular case. The ICO also asked for the email addresses I’d used; I handed these over assuming it was for the purposes of pursuing an investigation. But no: when I later asked, the ICO said they’d passed them to the Lib Dems in order that they could be suppressed from the Lib Dem mailing list. I could have done that myself if I had wanted to. It wasn’t the point, and I actually think the ICO were out of order (and contravening the DPA themselves) in failing to tell me that that was the purpose.

But I digress. Failure to comply with PECR and the DPA is rife across the political spectrum and I think it’s strongly arguable that lack of enforcement action by the ICO facilitates this. And to illustrate this, I visited the Lib Dems’ website recently, and saw the following message

[Screenshot of the sign-up message on the Lib Dems’ website]

Vacuous and vague, I suppose, but I don’t disagree, so I entered an email address registered to me (another one I reserve for situations where I fear future spamming) and clicked “I agree”. By return I got an email saying

Friend – Thank you for joining the Liberal Democrats…

Wait – hold on a cotton-picking minute – I haven’t joined the bloody Liberal Democrats – I put an email in a box! Is this how they got their recent, and rather-hard-to-explain-in-the-circumstances “surge” in membership? Am I (admittedly using a pseudonym) now registered with them as a member? If so, that raises serious concerns about DPA compliance – wrongly attributing membership of a political party to someone is processing of sensitive personal data without a legal basis.

It’s possible that I haven’t yet been registered as such, because the email went on to say

Click here to activate your account

When I saw this I actually thought the Lib Dems might have listened to the ICO – I assumed that if I didn’t “click here” (I didn’t) I would hear no more. Not entirely PECR compliant, but a step in the right direction. But no: I’ve since received an email from the lonely Alistair Carmichael asking me to support the Human Rights Act (which I do), but to support it by joining a Lib Dem campaign. This is direct marketing of a political party, I didn’t consent to it, and its sending was unlawful.

I’ll report it to the ICO, more in hope than expectation that they will do anything. But if they don’t, I think they have to accept that a continuing failure to take enforcement action against casual abuse of privacy laws is going to lead to a proliferation of that abuse.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


No Information Rights Levy for ICO – where now for funding?

The ICO’s plan for an “information rights levy” appears to have been scuppered by the government. But is retaining data protection notification fees the way to solve the funding problem?

Back in the heady days of January 2012, when a naive but optimistic European Commission proposed a General Data Protection Regulation (GDPR), to replace the existing 1995 Directive, one of the less-commented-on proposals was to remove the requirement for data controllers to notify their processing activities to the national data protection authority. However, the UK Information Commissioner’s Office (ICO) certainly noticed it, because the implications were that, at a stroke, a large amount of ICO funding would disappear. Currently, section 18(5) of the Data Protection Act 1998 (DPA), and accompanying secondary legislation, mean that data controllers (unless they have an exemption) must pay an annual fee to the ICO of either £35 or £500 (depending upon the size of the organisation). In 2012-2013 this equated to an estimated income of £17.4m, and this income effectively funds all of the ICO’s data protection regulatory actions (its FOI functions are funded by grant-in-aid from the Ministry of Justice).

Three years later, the GDPR is still not with us. However, it will eventually be passed, and when it is, it seems certain that the requirement under European law to notify will be gone. Because of this, as the Justice Committee recognised in 2013, alternative means of funding the ICO need to be identified as soon as possible. The ICO’s preferred choice, and one which Christopher Graham has certainly been pushing for, was an “Information Rights Levy”, the details of which were not specified, but which it appears would be paid by data controllers and public authorities (subject to FOI) alike. In the 2013/14 ICO Annual Report Graham was bullish in calling for action:

Parliament needs to get on with the task of establishing a single, graduated information rights levy to fund the important work of the ICO as the effective upholder of our vital right to privacy and right to know

But this robust approach doesn’t seem to have worked. From a recent meeting of the ICO Management Board a much more pessimistic view emerges. In a report entitled “Registration Fee Strategy” it is said that:

The ICO has previously highlighted the need for an ‘information rights fee’ or one fee, paid by organisations directly to the ICO, to fund all information rights activities. Given concerns across government that this would result in private sector cross subsidising public sector work, the ICO recognises that this is unlikely in the short term

The report goes on, therefore, to talk about proposed changes to the current fee/notification process, and about ways of identifying who needs to pay. 

But, oddly, it seems to assume that although the GDPR will remove the requirement for a data controller to notify processing to the ICO, the UK will retain the discretion to continue with such arrangements (and to charge a fee). I’m not sure this is right. As I’ve written previously, under data protection law at least some recreational bloggers have a requirement to notify (and pay a fee), and the legal authorities are clear that the law’s ambit extends to, for instance, individuals operating domestic CCTV, if that CCTV covers public places where identifiable individuals are. Indeed, as the 2003 Lindqvist judgment found:

The act of referring, on an internet page, to various persons and identifying them by name or by other means, for instance by giving their telephone number…constitutes ‘the processing of personal data’…[and] is not covered by any of the exceptions in Article 3(2) of Directive 95/46 [section 36 of the DPA transposes Article 3(2) into domestic law]

It is arguable that, to varying extents, we are all data controllers now (and ones who will struggle to avail ourselves of the data protection exemption for domestic purposes). Levying a fee on all of us, in order that we can lawfully express ourselves, has the potential to be a serious infringement of our right to freedom of expression under Article 10 of the European Convention on Human Rights, and even more directly, Article 11 of the Charter of Fundamental Rights of the European Union.

The problem of how to fund the ICO effectively in a time of austerity is a challenging one, and I don’t envy those at the ICO and in government who are trying to solve it. But levying a tax on freedom of expression – which notification fees arguably already are, and which a replacement fee would almost certainly be if the GDPR no longer actually requires notification – is not the way to do so.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Google’s Innuendo

If you search on Google for my name, Jon Baines, or the full version, Jonathan Baines, you see, at the foot of the page of search results

Some results may have been removed under data protection law in Europe. Learn more

Oh-ho! What have I been up to recently? Well, not much really, and certainly nothing that might have led to results being removed under data protection law. Nor, similarly, have John Keats, Eleanor Roosevelt and Nigel Molesworth (to pick a few names at random), a search on any of whose names brings up the same message. And, of course, if you click the hyperlink marked by the words “Learn more” you find out that Google has in fact simply set its algorithms to display the message in Europe

when a user searches for most names, not just pages that have been affected by a removal.
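To make the contrast concrete, here is a minimal, purely hypothetical sketch of the two possible display policies – the blanket one described in the “Learn more” page above, and a targeted one tied to actual removals. The function names and logic are my own invention for illustration; this is emphatically not Google’s code.

```python
# Hypothetical sketch only: two policies for showing the notice.
# Nothing here is Google's actual code.

def blanket_notice(query_is_a_name: bool) -> bool:
    # What the "Learn more" page describes: the notice appears for most
    # name searches in Europe, whether or not anything was removed.
    return query_is_a_name

def targeted_notice(query: str, removed_for: set) -> bool:
    # The alternative: show the notice only for queries where results
    # really have been removed.
    return query in removed_for

print(blanket_notice(True))                  # True - notice shown for any name
print(targeted_notice("Jon Baines", set()))  # False - nothing removed, no notice
```

The innuendo arises entirely from the first policy: the message appears whether or not anything has actually been removed.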

It is a political gesture – one that reflects Google’s continuing annoyance at the 2014 decision – now forever known as “Google Spain” – of the Court of Justice of the European Union which established that Google is a data controller for the purpose of search returns containing personal data, and that it must consider requests from data subjects for removal of such personal data. A great deal has been written about this, some bad and some good (a lot of the latter contained in the repository compiled by Julia Powles and Rebekah Larsen) and I’m not going to try to add to that, but what I have noticed is that a lot of people see this “some results may have been removed” message, and become suspicious. For instance, this morning, I noticed someone tweeting to the effect that the message had come up on a search for “Chuka Umunna”, and their supposition was that this must relate to something which would explain Mr Umunna’s decision to withdraw from the contest for leadership of the Labour Party. A search on Twitter for “some results may have” returns a seething mass of suspicion and speculation.

Google is conducting an unnecessary exercise in innuendo. It could easily rephrase the message (“With any search term there is a possibility that some results may have been removed…”) but chooses not to do so, no doubt because it wants to undermine the effect of the CJEU’s ruling. It’s shoddy, and it drags wholly innocent people into its disagreement.

Furthermore, there is an argument that the exercise could be defamatory. I am not a lawyer, let alone a defamation lawyer, so I will leave it to others to consider that argument. However, I do know a bit about data protection, and it strikes me that, following Google Spain, Google is acting as a data controller when it processes a search on my name and displays a list of results with the offending “some results may have been removed” message. As a data controller it has obligations, under European law (and UK law), to process my personal data “fairly and lawfully”. It is manifestly unfair, as well as wrong, to insinuate that information relating to me might have been removed under data protection law. Accordingly, I’ve written to Google, asking for the message to be removed:

Google UK Ltd
Belgrave House
76 Buckingham Palace Road
London SW1W 9TQ

16 May 2015

Dear Google

Complaint under Data Protection Act 1998

When a search is made on Google for my name “Jonathan Baines”, and, alternatively, “Jon Baines”, a series of results are returned, but at the foot of the page a message (“the message”) is displayed:

Some results may have been removed under data protection law in Europe. Learn more

To the best of my knowledge, no results have in fact been removed.

The first principle in Schedule One of the Data Protection Act 1998 (DPA) requires a data controller to process personal data fairly and lawfully. In the circumstances I describe, “Jonathan Baines”, “Jon Baines” and the message constitute my personal data, of which you are clearly data controller.

It is unfair to suggest that some results may have been removed under data protection law. This is because the message carries an innuendo that what may have been removed was content that was embarrassing, or that I did not wish to be returned by a Google search. This is not the case. I do not consider that the hyperlink “Learn more” nullifies the innuendo: for instance, a search on Twitter for the phrase “some results may have been removed” provides multiple examples of people assuming the message carries an innuendo meaning.

Accordingly, please remove the message from any page containing the results of a search on my name Jonathan Baines, or Jon Baines, and please confirm to me that you have done so. You are welcome to email me to this effect at [REDACTED]

With best wishes,
Jon Baines


The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with. Some words may have been removed under data protection law.


Shameless

Only very recently I wrote about how the Liberal Democrats had been found by the Information Commissioner’s Office (ICO) to have been in breach of their obligations under anti-spam laws (or, more precisely, the ICO had determined it was “unlikely” the Lib Dems had complied with the law). This was because they had sent me unsolicited emails promoting their party without my consent, in contravention of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR). The ICO told me that “we have written to the organisation to remind them of their obligations under the PECR and ensure that valid consent is obtained from individuals”.

Well, the reminder hasn’t worked: today I went on the Lib Dem site and noticed the invitation to agree that “The NHS needs an extra £8bn”. Who could disagree? There was a box to enter my email address and “back our campaign”. Which campaign did they mean? Who knows? I assumed the campaign to promote NHS funding, but there was no privacy notice at all (at least on the mobile site). I entered an email address, because I certainly agree that the NHS needs an extra £8bn, but what I certainly didn’t do was consent to receive email marketing.

[Screenshot of the sign-up form on the Lib Dems’ website]

But of course I did…within eight hours I received an email from someone called Olly Grender asking me to donate to the Lib Dems. Why on earth would I want to do that? And a few hours later I got an email from Nick Clegg himself, reiterating Olly’s message. Both emails were manifestly, shamelessly, sent in contravention of PECR, only a couple of weeks after the ICO assured me they were going to “remind” the Lib Dems of the law.

Surely the lesson is the same one the cynics have told us over the years – don’t believe what politicians tell you.

And of course, only this week there was a further example, with the notorious Telegraph “business leaders” letter. The open letter published by the paper, purporting to come from 5,000 small business owners, had in fact been written by Conservative Campaign Headquarters, and the signatories were merely people who had filled in a form on the Conservative party website agreeing to sign the letter, but who were informed in a privacy notice that “We will not share your details with anyone outside the Conservative Party”. But share they did, and so it was that multiple duplicate signatories, and signatories who were by no means small business owners, found their way into the public domain. Whether any of them complain to the ICO will probably determine the extent to which this is treated as a contravention, not of PECR (this wasn’t unsolicited marketing), but of the Data Protection Act 1998 and the Conservatives’ obligation to process personal data fairly and lawfully. But whatever the outcome, it’s another example of the abuse of web forms, and the harvesting of email addresses, for the promotion of party political aims.

I will be referring the Lib Dems matter back to the ICO, and inviting them again (they declined last time) to take enforcement action for repeated, and apparently deliberate or reckless, contraventions of the Lib Dems’ legal obligations under PECR.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


ICO: Samaritans Radar failed to comply with Data Protection Act

I’ve written so much previously on this blog about Samaritans Radar, the misguided Twitter app launched last year by Samaritans, that I’d thought I wouldn’t have to do so again. However, this is just a brief update on the outcome of the investigation by the Information Commissioner’s Office (ICO) into whether Samaritans were obliged to comply with data protection law when running the app, and, if so, the extent to which they did comply.

To recap, the app monitored the timelines of those the user followed on Twitter and, if certain trigger words or phrases were tweeted, would send an email alert to the user. This was intended to be a “safety net” so that potential suicidal cries for help were not missed. But what was missed by Samaritans was the fact that those whose tweets were being monitored in this way would have no knowledge of it, and that this could lead to a feeling of uncertainty and unease among some of the very groups the app sought to protect. People with mental health issues raised concerns that the app could actually drive people off Twitter, where there were helpful and supportive networks of users, often tweeting the very phrases and words the app was designed to pick up.
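To illustrate the mechanism being described, here is a minimal sketch of my own – the trigger phrases, data structures and function below are invented for the purpose, and are not Samaritans’ actual code:

```python
# Hypothetical sketch of keyword-trigger monitoring of the kind described
# above. Trigger phrases and structures are invented; this is not the
# Radar app's actual code.

TRIGGER_PHRASES = [
    "tired of being alone",
    "hate myself",
    "need someone to talk to",
]

def scan_timeline(tweets):
    """Return the tweets that would generate an alert to the subscriber.

    `tweets` is a list of dicts with "user" and "text" keys. Note the
    core objection: the alert goes to the subscriber, and the person
    whose tweet is flagged never knows.
    """
    alerts = []
    for tweet in tweets:
        text = tweet["text"].lower()
        if any(phrase in text for phrase in TRIGGER_PHRASES):
            alerts.append(tweet)
    return alerts

# Example: a single matching tweet produces an alert
print(scan_timeline([{"user": "@example", "text": "I hate myself today"}]))
```

Even in so crude a sketch the data protection point is visible: individuals’ tweets are singled out and acted upon without the tweeter’s knowledge.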

Furthermore, questions were raised, by me and many others, about the legality of the app under data protection law. So I made a request to the ICO under the Freedom of Information Act for

any information – such as an assessment of legality, correspondence etc. – which you hold about the “Samaritans Radar” app which Samaritans recently launched, then withdrew in light of serious legal and ethical concerns being raised

After an initial refusal, because their investigation was ongoing, the ICO have now disclosed a considerable amount of information. Within it is the substantive assessment I sought, in the form of a letter from the ICO’s Group Manager for Government and Society to Samaritans. I think it is important to post it in full, and I do so below. I don’t have much to add, other than that it vindicates the legal position put forward at the time by me and others (notably Susan Hall and Tim Turner).

19 December 2014

Samaritans Radar app

Many thanks for coming to our office and explaining the background to the development of the Radar application and describing how it worked. We have now had an opportunity to consider the points made at the meeting, as well as study the information provided in earlier teleconferences and on the Samaritans’ website. I am writing to let you know our conclusions on how the Data Protection Act applies to the Radar application.

We recognise that the Radar app was developed with the best of intentions and was withdrawn shortly after its launch but, as you know, during its operation we received a number of queries and concerns about the application. We have been asked for our view on whether personal data was processed in compliance with data protection principles and whether the Samaritans are data controllers. You continue to believe that you are not data controllers or that no personal data has been processed, so I am writing to explain in detail our conclusions on these points.

Personal data

Personal data is data that relates to an identifiable living individual. It is our well-established position that data which identifies an individual, even without a name associated with it, may be personal data where it is processed to learn or record something about that individual, or where the processing of that information has an impact upon that individual. According to the information you have provided, the Radar app was a web-based application that used a specially designed algorithm that searched for specific keywords within the Twitter feeds of subscribers to the Radar app. When words indicating distress were detected within a Tweet, an email alert was automatically sent from the Samaritans to the subscriber saying Radar had detected someone they followed who may be going through a tough time and provided a link to that individual’s Tweet. The email asked the subscriber whether they were worried about the Tweet and if yes, they were re-directed to the Samaritans’ website for guidance on the best way of providing support to a follower who may be distressed. According to your FAQs, you also stored Twitter User IDs, Twitter User friends’ IDs, all tagged Tweets including the raw data associated with it and a count of flags against an individual Twitter user’s friends’ ID. These unique identifiers are personal data, in that they can easily be linked back to identifiable individuals.

Based on our understanding of how the application worked, we have reached the conclusion that the Radar service did involve processing of personal data. It used an algorithm to search for words that triggered an automated decision about an individual, at which point it sent an email alert to a Radar subscriber. It singled out an individual’s data with the purpose of differentiating them and treating them differently. In addition, you also stored information about all the Tweets that were tagged.

Data controller

We are aware of your view that you “are neither the data controller nor data processor of the information passing through the app”.

The concept of a data controller is defined in section 1 of the Data Protection Act 1998 (the DPA) as

“a person who (either alone or jointly or in common with other persons) determines the purposes for which and the manner in which any personal data are, or are to be, processed”

We have concluded that the Radar service has involved processing of personal data. We understand that you used the agency [redacted] to develop and host the application. We are not fully aware of the role of [redacted] but given your central role in setting up and promoting the Radar application, we consider that the Samaritans have determined the manner and purpose of the processing of this personal data and as such you are data controllers. If you wish to be reminded of the approach we take in this area you may find it helpful to consult our guidance on data controllers and data processors. Here’s the link: https://ico.org.uk/media/about-the-ico/documents/1042555/data-controllers-and-data-processors-dp-guidance.pdf

Sensitive personal data

We also discussed whether you had processed sensitive personal data. You explained that the charity did deal with people seeking help for many different reasons and the service was not aimed at people with possible mental health issues. However the mission of the Samaritans is to alleviate emotional distress and reduce the incidence of suicidal feelings and suicidal behaviours. In addition, the stated aims of the Radar project, the research behind it and the information provided in the FAQs all emphasise the aim of helping vulnerable people online and using the app to detect someone who is suicidal. For example, you say “research has shown there is a strong correlation between “suicidal tweets” and actual suicides and with Samaritans Radar we can turn a social net into a safety net”. Given the aims of the project, it is highly likely that some of the tweets identified to subscribers included information about an individual’s mental health or other medical information and therefore would have been sensitive personal data.

At our meetings you said that even if you were processing sensitive personal data then Schedule 3 condition 5 (“The information contained in the personal data has been made public as a result of steps deliberately taken by the data subject”) was sufficient to legitimise this processing. Our guidance in our Personal Information Online Code of Practice makes it clear that although people post personal information in a way that becomes publicly visible, organisations still have an overarching duty to handle it fairly and to comply with the rules of data protection. The Samaritans are well respected in this field and receiving an email from your organisation carries a lot of weight. Linking an individual’s tweet to an email alert from the Samaritans is unlikely to be perceived in the same light as the information received in the original Tweet — not least because of the risk that people’s tweets were flagged when they were not in any distress at all.

Fair processing

Any processing of personal data must be fair and organisations must consider the effect of the processing on the individuals concerned and whether the processing would be within their reasonable expectations. You indicated that although you had undertaken some elements of an impact assessment, you had not carried out a full privacy impact assessment. You appear to have reached the conclusion that since the Tweets were publicly visible, you did not need to fully consider the privacy risks. For example, on your website you say that “all the data is public, so user privacy is not an issue. Samaritans Radar analyses the Tweets of people you follow, which are public Tweets. It does not look at private Tweets.”

It is our view that if organisations collect information from the internet and use it in a way that’s unfair, they could still breach the data protection principles even though the information was obtained from a publicly available source. It is particularly important that organisations should consider the data protection implications if they are planning to use analytics to make automated decisions that could have a direct effect on individuals. Under section 12 of the Data Protection Act, individuals have certain rights to prevent decisions being taken about them that are solely based on automated processing of their personal data. The quality of the data being used as a basis for these decisions may also be an issue.

We note that the application was a year in development and that you used leading academics in linguistics to develop your word search algorithm. You also tested the application on a large number of people, although, as we discussed, most if not all of these were connected to the project in some way and many were enthusiastic to see the project succeed. As our recent paper on Big Data explains, it is not so much a question of whether the data accurately records what someone says but rather to what extent that information provides a reliable basis for drawing conclusions. Commentators expressed concern at the apparent high level of false positives involving the Radar App (figures in the media suggest only 4% of email alerts were genuine). This raises questions about whether a system operating with such a low success rate could represent fair processing and indicates that many Tweets were being flagged up unnecessarily.

Since you did not consider yourselves to be data controllers, you have not sought the consent of, or provided fair processing notices to, the individuals whose Tweets you flagged to subscribers. It seems unlikely that it would be within people’s reasonable expectations that certain words and phrases from their Tweets would trigger an automatic email alert from the Samaritans saying Radar had detected someone who may be going through a tough time. Our Personal Information Online Code of Practice says it is good practice to only use publicly available information in a way that is unlikely to cause embarrassment, distress or anxiety to the individual concerned. Organisations should only use their information in a way they are likely to expect and to be comfortable with. Our advice is that if in doubt about this, and you are unable to ask permission, you should not collect their information in the first place.

Conclusion

Based on our observations above, we have reached the conclusion that the Radar application did risk causing distress to individuals and was unlikely to be compliant with the Data Protection Act.

We acknowledge that the Samaritans did take responsibility for dealing with the many concerns raised about the application very quickly. The application was suspended on 7 November and we welcomed [redacted] assurances on 14 November that not only was the application suspended but it would not be coming back in anything like its previous form. We also understand that there have been no complaints that indicate that anyone had suffered damage and distress in the very short period the application was in operation.

We do not want to discourage innovation but it is important that organisations should consider privacy throughout the development and implementation of new projects. Failing to do so risks undermining people’s trust in an organisation. We strongly recommend that if you are considering further projects involving the use of online information and technologies you should carry out and publish a privacy impact assessment. This will help you to build trust and engage with the wider public. Guidance on this can be found in our PIA Code of Practice. We also recommend that you look at our paper on Big Data and Data Protection and our Personal Information Online Code of Practice. Building trust and adopting an ethical approach to such projects can also help to ensure you handle people’s information in a fair and transparent way. We would be very happy to advise the Samaritans on data protection compliance in relation to any future projects.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Information Tribunal increases monetary penalty for company which made spam calls

The trouble with asking for a second opinion is that it might be worse than the first one. Reactiv Media got an increased penalty after appealing to the tribunal.

In 2013 the First-tier Tribunal (Information Rights) (“FTT”) heard the first appeal against a monetary penalty notice (“MPN”) imposed by the Information Commissioner’s Office (“ICO”). One of the first things to be considered in that appeal (brought by the Central London Community Healthcare NHS Trust) was the extent of the FTT’s jurisdiction when hearing such appeals. Was it, as the ICO suggested, limited effectively to allowing challenges on public law principles (e.g. that the original decision was irrational, failed to take relevant factors into account, or took irrelevant factors into account)? Or was it entitled to approach the hearing de novo, with the power to determine that the ICO’s discretion to serve an MPN had been exercised wrongly on the facts? The FTT held that the latter approach (similar to the FTT’s jurisdiction in appeals brought under the Freedom of Information Act 2000 (FOIA)) was the correct one, and, notably, it added the observation (at para. 39) that it was open to the FTT to increase, as well as decrease, the amount of penalty imposed.

So, although an appeal to the FTT is generally a low-risk, low-cost way of having the ICO’s decision reviewed, it does, in the context of MPNs served either under the Data Protection Act 1998 (DPA) or the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), potentially carry the risk of an increased penalty. And this is precisely what happened when a direct marketing company called Reactiv Media recently appealed an ICO MPN. Reactiv Media had been held to have made a large number of unsolicited telephone calls to people who had subscribed to the Telephone Preference Service (“TPS”) – the calls were thus in contravention of Reactiv Media’s obligations under regulation 21 of PECR. The ICO determined that this constituted a serious contravention of those obligations, and as at least some of those calls were of a kind likely to cause (or indeed had caused) substantial damage or substantial distress, an MPN of £50,000 was served, under the mechanisms of section 55A of the DPA, as adopted by PECR.

Upon appeal to the FTT, Reactiv Media argued that some of the infringing calls had not been made by them, and disputed that any of them had caused substantial damage or distress. However, the FTT, noting the ICO’s submission that not only had the MPN been properly served, but that it was also lenient for a company with a turnover of £5.8m (a figure higher than the one the ICO had initially been given to understand), held not only that the MPN was “fully justified” – the company had “carried on its business in conscious disregard of its obligations” – but also that the amount should be increased by 50%, to £75,000. One presumes, also, that the company will not be given a further opportunity (as it was in the first instance) to take advantage of an early payment reduction.

One is tempted to assume that Reactiv Media thought an appeal to the FTT was a cheap way of getting a second opinion about the original MPN. I don’t know if this is true, but if it is, it is a lesson to other data controllers and marketers that, after an appeal, they might find themselves worse off.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.



The Lib Dems’ digital rights bill – an empty promise?

On the 11th of April the Liberal Democrats announced that they would introduce a “Digital Rights Bill” if they were to form part of a coalition government in the next parliament. Among the measures the bill would contain would be, they said:

Beefed up powers for the Information Commissioner to fine and enforce disciplinary action on government bodies if they breach data protection laws

Legal rights to compensation for consumers when companies make people sign up online to deliberately misleading and illegible terms & conditions

I found this interesting because the Lib Dems have recently shown themselves particularly unconcerned with digital rights contained in ePrivacy laws. Specifically, they have shown a lack of compliance with the requirement at regulation 22 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR). This regulation forbids the sending of direct marketing by email unless the recipient has notified the sender that she consents to the email being sent. The European directive to which PECR give effect specifies that “consent” should be taken to have been given only by use of

any appropriate method enabling a freely given specific and informed indication of the user’s wishes, including by ticking a box when visiting an Internet website

And the Information Commissioner’s Office (ICO), which regulates PECR, explains in guidance [pdf] that

the person must understand what they are consenting to. Organisations must make sure they clearly and prominently explain exactly what the person is agreeing to, if this is not obvious. Including information in a dense privacy policy or hidden in ‘small print’ which is hard to find, difficult to understand, or rarely read will not be enough to establish informed consent…consent must be a positive expression of choice. It does not necessarily have to be a proactive declaration of consent – for example, consent might sometimes be given by submitting an online form, if there was a clear and prominent statement that this would be taken as agreement and there was the option to opt out. But organisations cannot assume consent from a failure to opt out
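To put that guidance in concrete terms, here is a purely hypothetical sketch of mine (the field names are invented; this is not taken from the ICO guidance) of the test a compliant sign-up form would have to pass:

```python
# Hypothetical sketch of the consent test described in the ICO guidance
# above. Field names are invented for illustration.

def valid_marketing_consent(form: dict) -> bool:
    """Treat consent as valid only if the marketing purpose was clearly
    stated and the person positively opted in - no pre-ticked boxes, and
    no inferring consent from a bare email submission."""
    return (
        form.get("marketing_purpose_clearly_stated", False)
        and form.get("opted_in_positively", False)
        and not form.get("box_was_pre_ticked", False)
    )

# A bare email address submitted to "back our campaign" fails the test:
print(valid_marketing_consent({"email": "someone@example.com"}))  # False
```

On that test, an email address typed into a campaign box, with no statement that marketing will follow, plainly fails.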

But in July last year I began conducting an experiment. I put my name (actually, typed my email address) to a statement on the Lib Dem website saying

Girls should never be cut. We must end FGM

I gave no consent to the sending of direct email marketing from the Lib Dems, and, indeed, the Lib Dems didn’t even say they would send direct email marketing as a result of my submitting the email address (and, to be clear, the ICO takes the – correct – view [pdf] that promotion of a political party meets the PECR, and Data Protection Act, definition of “marketing”). Yet since October last year they have sent me 23 unsolicited emails constituting direct marketing. I complained directly to the Lib Dems, who told me

we have followed the policies we have set out ion [sic] our privacy policy which follow the guidance we have been given by the ICO

which hardly explains how they consider they have complied with their legal obligations, and I will be raising this as a complaint with the ICO. I could take the route of making a claim under regulation 30 of PECR, but this requires that I must have suffered “damage”. By way of comparison, around the same time I also submitted my email address, in circumstances in which I was not consenting to future receipt of email marketing, to other major parties. To their credit, none of the Conservatives, the SNP and the Greens have sent any unsolicited marketing. However, Labour have sent 8 emails, Plaid Cymru 10 and UKIP, the worst offenders, 37 (there is little more nauseating, by the way, than receiving an unsolicited email from Nigel Farage addressing one as “Friend”). I rather suspect that, consciously or not, some political parties have decided that the risk of legal or enforcement action (and possibly the apparent ambiguity – although really there is none – about the meaning of “consent”) is so low that it is worth adopting a marketing strategy like this. Maybe that’s a sensible act of political pragmatism. But it stinks, and the Lib Dems’ cavalier approach to ePrivacy compliance makes me completely doubt the validity and sincerity of Nick Clegg’s commitment to

enshrine into law our rights as citizens of this country to privacy, to stop information about us being abused online

And, as Pat Walshe noticed the other day, even the Lib Dems’ own website advert inviting support for their proposed Digital Rights Bill has a pre-ticked box (in non-compliance with ICO guidance) for email updates. One final point: I note that clicking on the link in the first paragraph of this post, to the Lib Dems’ announcement of the proposed Bill, opens up, or attempts to open up, a pdf file of a consultation paper. This might just be a coding error, but it’s an odd, and dodgy, piece of script.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Vidal-Hall v Google, and the rise of data protection ambulance-chasing

Everyone knows the concept of ambulance chasers – personal injury lawyers who seek out victims of accidents or negligence to help/persuade the latter to make compensation claims. With today’s judgment in the Court of Appeal in the case of Vidal-Hall & Ors v Google [2015] EWCA Civ 311 one wonders if we will start to see data protection ambulance chasers, arriving at the scene of serious “data breaches” with their business cards.

This is because the Court has made a definitive ruling on the issue, discussed several times previously on this blog, of whether compensation can be claimed under the Data Protection Act 1998 (DPA) in circumstances where a data subject has suffered distress but no tangible, pecuniary damage. Section 13 of the DPA provides that

(1) An individual who suffers damage by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that damage.

(2) An individual who suffers distress by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that distress if—

(a) the individual also suffers damage by reason of the contravention

This differs from the wording of the European Data Protection Directive 95/46/EC, which, at Article 23(1), says

Member States shall provide that any person who has suffered damage as a result of an unlawful processing operation or of any act incompatible with the national provisions adopted pursuant to this Directive is entitled to receive compensation from the controller for the damage suffered

It can be seen that in the domestic statutory scheme “distress” is distinct from “damage”, whereas in the Directive there is just a single category of “damage”. The position until relatively recently, following Johnson v Medical Defence Union [2007] EWCA Civ 262, had been that “damage” meant pecuniary damage, and this in turn meant, as Buxton LJ said in that case, that “section 13 distress damages are only available if damage in the sense of pecuniary loss has been suffered”. So, absent pecuniary damage, no compensation for distress was available (except in certain specific circumstances involving processing of personal data for journalistic, literary or artistic purposes). But this, said Lord Dyson and Lady Justice Sharp, in a joint judgment, was wrong and, in any case, they were not bound by Johnson because the relevant remarks in that case were in fact obiter. Section 13(2) DPA, they held, was incompatible with Article 23 of the Directive:

What is required in order to make section 13(2) compatible with EU law is the disapplication of section 13(2), no more and no less. The consequence of this would be that compensation would be recoverable under section 13(1) for any damage suffered as a result of a contravention by a data controller of any of the requirements of the DPA

As Christopher Knight says, in a characteristically fine and exuberant piece on the Panopticon blog, “And thus, section 13(2) was no more”.

And this means a few things. It certainly means that it will be much easier for an aggrieved data subject to bring a claim for compensation against a data controller which has contravened its obligations under the DPA in circumstances where there is little, or no, tangible or pecuniary damage, but only distress. It also means that we may well start to see the rise of data protection ambulance chasers – the DPA may not give rise to massive settlements, but it is a relatively easy claim to make – a contravention is often effectively a matter of fact, or is found to be such by the Information Commissioner, or is conceded/admitted by the data controller – and there is the prospect of group litigation (in 2013 Islington Council settled claims brought jointly by fourteen claimants following disclosure of their personal data to unauthorised third parties – the settlement totalled £43,000).

I mentioned in that last paragraph that data controllers sometimes concede or admit contraventions of their obligations under the DPA. Indeed, the Information Commissioner expects them to, and the draft European General Data Protection Regulation proposes to make it mandatory to do so, and to inform data subjects. And this is where I wonder if we might see another effect of the Vidal-Hall case – if data controllers know that by owning up to contraventions they may be exposing themselves to multiple legal claims for distress compensation, they (or their shareholders, or insurers) may start to question why they should do so. Breach notification may come to be seen as even more of a risky exercise than it is now.

There are other interesting aspects to the Vidal-Hall case – misuse of private information is, indeed, a tort, allowing service of the claims against Google out of the jurisdiction, and there are profound issues regarding the definition of personal data which remain undecided and, if they go to trial, will be extremely important – but the disapplying of section 13(2) DPA looks likely to have profound effects for data controllers, for data subjects, for lawyers and for the landscape of data protection litigation in this country.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
