ICO: Samaritans Radar failed to comply with Data Protection Act

I’ve written so much previously on this blog about Samaritans Radar, the misguided Twitter app launched last year by Samaritans, that I’d thought I wouldn’t have to do so again. However, this is just a brief update on the outcome of the investigation by the Information Commissioner’s Office (ICO) into whether Samaritans were obliged to comply with data protection law when running the app, and, if so, the extent to which they did comply.

To recap, the app monitored the timelines of those the user followed on Twitter, and, if certain trigger words or phrases were tweeted, would send an email alert to the user. This was intended to be a “safety net” so that potential suicidal cries for help were not missed. But what was missed by Samaritans was the fact that those whose tweets were being monitored in this way would have no knowledge of it, and that this could lead to a feeling of uncertainty and unease in some of the very target groups they sought to protect. People with mental health issues raised concerns that the app could actually drive people off Twitter, where there were helpful and supportive networks of users, often tweeting the phrases and words the app was designed to pick up.
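
The mechanics, as described in Samaritans’ own FAQs and in the ICO letter below, were simple keyword matching over the public tweets of accounts the subscriber followed. Purely by way of illustration – the trigger phrases, data structures and alert wording below are my own assumptions, since Samaritans never published their code – a sketch of that kind of logic might look like this:

```python
# Hypothetical sketch of keyword-based tweet flagging of the kind described
# above. Trigger phrases, field names and alert wording are illustrative
# assumptions, not Samaritans' actual implementation.

TRIGGER_PHRASES = [
    "tired of being alone",
    "hate myself",
    "need someone to talk to",
]

def tweet_triggers_alert(tweet_text: str) -> bool:
    """Return True if the tweet contains any trigger phrase."""
    text = tweet_text.lower()
    return any(phrase in text for phrase in TRIGGER_PHRASES)

def scan_followed_timelines(subscriber_email: str, followed_tweets: list[dict]) -> list[dict]:
    """Build email alerts for flagged tweets from accounts the subscriber follows."""
    alerts = []
    for tweet in followed_tweets:
        if tweet_triggers_alert(tweet["text"]):
            alerts.append({
                "to": subscriber_email,
                "subject": "Someone you follow may be going through a tough time",
                "tweet_url": tweet["url"],
            })
    # Note: nothing here ever notifies the person whose tweet was flagged --
    # the transparency gap at the heart of the objections to the app.
    return alerts
```

Even a sketch this crude shows the two features that caused the trouble: the matching is purely mechanical (hence false positives), and the monitored account is never told its tweets have been flagged.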

Furthermore, questions were raised, by me and many others, about the legality of the app under data protection law. So I made a request to the ICO under the Freedom of Information Act for

any information – such as an assessment of legality, correspondence etc. – which you hold about the “Samaritans Radar” app which Samaritans recently launched, then withdrew in light of serious legal and ethical concerns being raised

After an initial refusal because their investigation was ongoing, the ICO have now disclosed a considerable amount of information. Within it is the substantive assessment I sought, in the form of a letter from the Group Manager for Government and Society to Samaritans. I think it is important to post it in full, and I do so below. I don’t have much to add, other than that it vindicates the legal position put forward at the time by me and others (notably Susan Hall and Tim Turner).

19 December 2014

Samaritans Radar app

Many thanks for coming to our office and explaining the background to the development of the Radar application and describing how it worked. We have now had an opportunity to consider the points made at the meeting, as well as study the information provided in earlier teleconferences and on the Samaritans’ website. I am writing to let you know our conclusions on how the Data Protection Act applies to the Radar application.

We recognise that the Radar app was developed with the best of intentions and was withdrawn shortly after its launch but, as you know, during its operation we received a number of queries and concerns about the application. We have been asked for our view on whether personal data was processed in compliance with data protection principles and whether the Samaritans are data controllers. You continue to believe that you are not data controllers or that no personal data has been processed, so I am writing to explain in detail our conclusions on these points.

Personal data

Personal data is data that relates to an identifiable living individual. It is  our well-established position that data which identifies an individual, even without a name associated with it, may be personal data where it is processed to learn or record something about that individual, or where the processing of that information has an impact upon that individual. According to the information you have provided, the Radar app was a web-based application that used a specially designed algorithm that searched for specific keywords within the Twitter feeds of subscribers to the Radar app. When words indicating distress were detected within a Tweet, an email alert was automatically sent from the Samaritans to the subscriber saying Radar had detected someone they followed who may be going through a tough time and provided a link to that individual’s Tweet. The email asked the subscriber whether they were worried about the Tweet and if yes, they were re-directed to the Samaritans’ website for guidance on the best way of providing support to a follower who may be distressed. According to your FAQs, you also stored Twitter User IDs, Twitter User friends’ IDs, all tagged Tweets including the raw data associated with it and a count of flags against an individual Twitter user’s friends’ ID. These unique identifiers are personal data, in that they can easily be linked back to identifiable individuals.

Based on our understanding of how the application worked, we have reached the conclusion that the Radar service did involve processing of personal data. It used an algorithm to search for words that triggered an automated decision about an individual, at which point it sent an email alert to a Radar subscriber. It singled out an individual’s data with the purpose of differentiating them and treating them differently. In addition, you also stored information about all the Tweets that were tagged.

Data controller

We are aware of your view that you “are neither the data controller nor data processor of the information passing through the app”.

The concept of a data controller is defined in section 1 of the Data Protection Act 1998 (the DPA) as

“a person who (either alone or jointly or in common with other persons) determines the purposes for which and the manner in which any personal data are, or are to be, processed”

We have concluded that the Radar service has involved processing of personal data. We understand that you used the agency [redacted] to develop and host the application. We are not fully aware of the role of [redacted] but given your central role in setting up and promoting the Radar application, we consider that the Samaritans have determined the manner and purpose of the processing of this personal data and as such you are data controllers. If you wish to be reminded of the approach we take in this area you may find it helpful to consult our guidance on data controllers and data processors. Here’s the link: https://ico.org.uk/media/about-the-ico/documents/1042555/data-controllers-and-data-processors-dp-guidance.pdf

Sensitive personal data

We also discussed whether you had processed sensitive personal data. You explained that the charity did deal with people seeking help for many different reasons and the service was not aimed at people with possible mental health issues. However the mission of the Samaritans is to alleviate emotional distress and reduce the incidence of suicidal feelings and suicidal behaviours. In addition, the stated aims of the Radar project, the research behind it and the information provided in the FAQs all emphasise the aim of helping vulnerable people online and using the app to detect someone who is suicidal. For example, you say “research has shown there is a strong correlation between “suicidal tweets” and actual suicides and with Samaritans Radar we can turn a social net into a safety net”. Given the aims of the project, it is highly likely that some of the tweets identified to subscribers included information about an individual’s mental health or other medical information and therefore would have been sensitive personal data.

At our meetings you said that even if you were processing sensitive personal data then Schedule 3 condition 5 (“The information contained in the personal data has been made public as a result of steps deliberately taken by the data subject”) was sufficient to legitimise this processing. Our guidance in our Personal Information Online Code of Practice makes it clear that although people post personal information in a way that becomes publicly visible, organisations still have an overarching duty to handle it fairly and to comply with the rules of data protection. The Samaritans are well respected in this field and receiving an email from your organisation carries a lot of weight. Linking an individual’s tweet to an email alert from the Samaritans is unlikely to be perceived in the same light as the information received in the original Tweet — not least because of the risk that people’s tweets were flagged when they were not in any distress at all.

Fair processing

Any processing of personal data must be fair and organisations must consider the effect of the processing on the individuals concerned and whether the processing would be within their reasonable expectations. You indicated that although you had undertaken some elements of an impact assessment, you had not carried out a full privacy impact assessment. You appear to have reached the conclusion that since the Tweets were publicly visible, you did not need to fully consider the privacy risks. For example, on your website you say that “all the data is public, so user privacy is not an issue. Samaritans Radar analyses the Tweets of people you follow, which are public Tweets. It does not look at private Tweets.”

It is our view that if organisations collect information from the internet and use it in a way that’s unfair, they could still breach the data protection principles even though the information was obtained from a publicly available source. It is particularly important that organisations should consider the data protection implications if they are planning to use analytics to make automated decisions that could have a direct effect on individuals. Under section 12 of the Data Protection Act, individuals have certain rights to prevent decisions being taken about them that are solely based on automated processing of their personal data. The quality of the data being used as a basis for these decisions may also be an issue.

We note that the application was a year in development and that you used leading academics in linguistics to develop your word search algorithm. You also tested the application on a large number of people, although, as we discussed, most if not all of these were connected to the project in some way and many were enthusiastic to see the project succeed. As our recent paper on Big Data explains, it is not so much a question of whether the data accurately records what someone says but rather to what extent that information provides a reliable basis for drawing conclusions. Commentators expressed concern at the apparent high level of false positives involving the Radar app (figures in the media suggest only 4% of email alerts were genuine). This raises questions about whether a system operating with such a low success rate could represent fair processing and indicates that many Tweets were being flagged up unnecessarily.

Since you did not consider yourselves to be data controllers, you have not sought the consent of, or provided fair processing notices to, the individuals whose Tweets you flagged to subscribers. It seems unlikely that it would be within people’s reasonable expectations that certain words and phrases from their Tweets would trigger an automatic email alert from the Samaritans saying Radar had detected someone who may be going through a tough time. Our Personal Information Online Code of Practice says it is good practice to only use publicly available information in a way that is unlikely to cause embarrassment, distress or anxiety to the individual concerned. Organisations should only use their information in a way they are likely to expect and to be comfortable with. Our advice is that if in doubt about this, and you are unable to ask permission, you should not collect their information in the first place.

Conclusion

Based on our observations above, we have reached the conclusion that the Radar application did risk causing distress to individuals and was unlikely to be compliant with the Data Protection Act.

We acknowledge that the Samaritans did take responsibility for dealing with the many concerns raised about the application very quickly. The application was suspended on 7 November and we welcomed [redacted] assurances on 14 November that not only was the application suspended but it would not be coming back in anything like its previous form. We also understand that there have been no complaints that indicate that anyone had suffered damage and distress in the very short period the application was in operation.

We do not want to discourage innovation but it is important that organisations should consider privacy throughout the development and implementation of new projects. Failing to do so risks undermining people’s trust in an organisation. We strongly recommend that if you are considering further projects involving the use of online information and technologies you should carry out and publish a privacy impact assessment. This will help you to build trust and engage with the wider public. Guidance on this can be found in our PIA Code of Practice. We also recommend that you look at our paper on Big Data and Data Protection and our Personal Information Online Code of Practice. Building trust and adopting an ethical approach to such projects can also help to ensure you handle people’s information in a fair and transparent way. We would be very happy to advise the Samaritans on data protection compliance in relation to any future projects.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Information Commissioner, social media

ICO finds Lib Dems in breach of ePrivacy law

A few months ago, when I entered my email address on the Liberal Democrats’ website to say that I agreed with the statement 

Girls should never be cut. We must end FGM

I hoped I wouldn’t subsequently receive spam emails promoting the party. However, I had no way of knowing, because there was no obvious statement explaining what would happen. What was clear was that I had not given specific consent to receive such emails.

Nonetheless, I did get them, and continue to do so – emails purportedly from Nick Clegg, from Paddy Ashdown and from others, promoting their party and sometimes soliciting donations.

I happen to think the compiling of a marketing database by use of serious and emotive subjects such as female genital mutilation is extraordinarily tasteless. It’s also manifestly unlawful in terms of the Lib Dems’ obligations under the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), which require specific consent to have been given before marketing emails can be sent to individuals.

On the lawfulness point I am pleased to say the Information Commissioner’s Office (ICO) agrees with me. Having considered my complaint they have said:

I have reviewed your correspondence and the organisation’s website, and it appears that their current practices would fail to comply with the requirements of the PECR. This is because consent is not knowingly given, clear and specific….As such, we have written to the organisation to remind them of their obligations under the PECR and ensure that valid consent is obtained from individuals.

Great. I’m glad they agree – casual disregard of PECR seems to be rife throughout politics. As I’ve written recently, the Labour Party, UKIP and Plaid Cymru have also spammed my dedicated email account. But I also asked the ICO to consider taking enforcement action (as is my right under regulation 32 of PECR). Disappointingly, they have declined to do so, saying:

enforcement action is not taken routinely and it is our decision whether to take it. We cannot take enforcement action in every case that is reported to us

It’s also disappointing that they don’t say why this is their decision. I know they cannot take enforcement action in every case reported to them, which is why I requested it in this specific case.

However, I will be interested to see whether the outcome of this case changes the Lib Dems’ approach. Maybe it will, but, as I say, they are by no means the only offenders, and enforcement action by the ICO might just have helped to address this wider problem.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under consent, enforcement, Information Commissioner, marketing, PECR, spam, Uncategorized

Information Tribunal increases monetary penalty for company which made spam calls

The trouble with asking for a second opinion is it might be worse than the first one. Reactiv Media get an increased penalty after appealing to the tribunal.

In 2013 the First-tier Tribunal (Information Rights) (“FTT”) heard the first appeal against a monetary penalty notice (“MPN”) imposed by the Information Commissioner’s Office (“ICO”). One of the first things to be considered in that appeal (brought by the Central London Community Healthcare NHS Trust) was the extent of the FTT’s jurisdiction when hearing such appeals – was it, as the ICO suggested, effectively limited to allowing challenges on public law principles (e.g. that the original decision was irrational, failed to take relevant factors into account, or took irrelevant factors into account), or was it entitled to approach the hearing de novo, with the power to determine on the facts that the ICO’s discretion to serve an MPN had been exercised wrongly? The FTT held that the latter approach (similar to its jurisdiction in appeals brought under the Freedom of Information Act 2000 (FOIA)) was the correct one, and, notably, it added the observation (at para. 39) that it was open to the FTT to increase, as well as decrease, the amount of the penalty imposed.

So, although an appeal to the FTT is generally a low-risk, low-cost way of having the ICO’s decision reviewed, it does, in the context of MPNs served either under the Data Protection Act 1998 (DPA) or the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), potentially carry the risk of an increased penalty. And this is precisely what happened when a direct marketing company called Reactiv Media recently appealed an ICO MPN. Reactiv Media had been held to have made a large number of unsolicited telephone calls to people who had subscribed to the Telephone Preference Service (“TPS”) – the calls were thus in contravention of Reactiv Media’s obligations under regulation 21 of PECR. The ICO determined that this constituted a serious contravention of those obligations, and as some at least of those calls were of a kind likely to cause (or indeed had caused) substantial damage or substantial distress, an MPN of £50,000 was served, under the mechanisms of section 55A of the DPA, as adopted by PECR.

Upon appeal to the FTT, Reactiv Media argued that some of the infringing calls had not been made by them, and disputed that any of them had caused substantial damage or distress. However, the FTT, noting the ICO’s submission that not only had the MPN been properly served, but also that it was lenient for a company with a turnover of £5.8m (a figure higher than the one the ICO had initially been given to understand), held that not only was the MPN “fully justified” – the company had “carried on its business in conscious disregard of its obligations” – but also that the amount should be increased by 50%, to £75,000. One presumes, also, that the company will not be given a further opportunity (as they were in the first instance) to take advantage of an early payment reduction.

One is tempted to assume that Reactiv Media thought that an appeal to the FTT was a cheap way of having a second opinion about the original MPN. I don’t know if this is true, but if it is, it is a lesson to other data controllers and marketers that, after an appeal, they might find themselves worse off.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

 


Filed under Data Protection, Information Commissioner, Information Tribunal, marketing, monetary penalty notice, nuisance calls, PECR

The Lib Dems’ digital rights bill – an empty promise?

On the 11th of April the Liberal Democrats announced that they would introduce a “Digital Rights Bill” if they were to form part of a coalition government in the next parliament. Among the measures the bill would contain would be, they said

Beefed up powers for the Information Commissioner to fine and enforce disciplinary action on government bodies if they breach data protection laws

Legal rights to compensation for consumers when companies make people sign up online to deliberately misleading and illegible terms & conditions

I found this interesting because the Lib Dems have recently shown themselves particularly unconcerned with digital rights contained in ePrivacy laws. Specifically, they have shown a lack of compliance with the requirement at regulation 22 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR). This regulation forbids the sending of direct marketing by email unless the recipient has notified the sender that she consents to the email being sent. The European directive to which PECR give effect specifies that “consent” should be taken to have been given only by use of

any appropriate method enabling a freely given specific and informed indication of the user’s wishes, including by ticking a box when visiting an Internet website

And the Information Commissioner’s Office (ICO), which regulates PECR, explains in guidance [pdf] that

the person must understand what they are consenting to. Organisations must make sure they clearly and prominently explain exactly what the person is agreeing to, if this is not obvious. Including information in a dense privacy policy or hidden in ‘small print’ which is hard to find, difficult to understand, or rarely read will not be enough to establish informed consent…consent must be a positive expression of choice. It does not necessarily have to be a proactive declaration of consent – for example, consent might sometimes be given by submitting an online form, if there was a clear and prominent statement that this would be taken as agreement and there was the option to opt out. But organisations cannot assume consent from a failure to opt out
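
To make concrete what this guidance requires in practice, here is a minimal sketch of a sign-up handler that records marketing consent only on a positive, explicit opt-in. It assumes a simple Flask app; the route, field names and storage are hypothetical, not any party’s actual code:

```python
# Illustrative opt-in handling under PECR-style consent rules.
# The Flask app, route and field names are assumptions for the sketch.
from flask import Flask, request

app = Flask(__name__)
marketing_list = []  # stand-in for a real database

@app.route("/support-statement", methods=["POST"])
def support_statement():
    email = request.form.get("email", "").strip()

    # An unticked checkbox is simply absent from the submitted form data,
    # so consent is recorded only on an explicit, positive indication.
    # A pre-ticked box, or mere submission of an email address, is not consent.
    consented = request.form.get("marketing_opt_in") == "yes"

    if email and consented:
        marketing_list.append(email)

    return "Thank you for your support", 200
```

The point of the sketch is the single line deciding `consented`: silence, or a box ticked by default, never satisfies it.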

But in July last year I began conducting an experiment. I put my name (actually, typed my email address) to a statement on the Lib Dem website saying

Girls should never be cut. We must end FGM

I gave no consent to the sending of direct email marketing from the Lib Dems, and, indeed, the Lib Dems didn’t even say they would send direct email marketing as a result of my submitting the email address (and, to be clear, the ICO takes the, correct, view [pdf] that promotion of a political party meets the PECR, and Data Protection Act, definition of “marketing”). Yet since October last year they have sent me 23 unsolicited emails constituting direct marketing. I complained directly to the Lib Dems, who told me

we have followed the policies we have set out ion [sic] our privacy policy which follow the guidance we have been given by the ICO

which hardly explains how they feel they have complied with their legal obligations, and I will be raising this as a complaint with the ICO. I could take the route of making a claim under regulation 30 of PECR, but this requires that I must have suffered “damage”. By way of comparison, around the same time I also submitted my email address, in circumstances in which I was not consenting to future receipt of email marketing, to other major parties. To their credit, none of the Conservatives, the SNP or the Greens has sent any unsolicited marketing. However, Labour have sent 8 emails, Plaid Cymru 10 and UKIP, the worst offenders, 37 (there is little that is more nauseating, by the way, than receiving an unsolicited email from Nigel Farage addressing one as “Friend”). I rather suspect that, consciously or not, some political parties have decided that the risk of legal or enforcement action (and possibly the apparent ambiguity – although really there is none – about the meaning of “consent”) is so low that it is worth adopting a marketing strategy like this. Maybe that’s a sensible act of political pragmatism. But it stinks, and the Lib Dems’ cavalier approach to ePrivacy compliance makes me completely doubt the validity and sincerity of Nick Clegg’s commitment to

enshrine into law our rights as citizens of this country to privacy, to stop information about us being abused online

And, as Pat Walshe noticed the other day, even the Lib Dems’ own website advert inviting support for their proposed Digital Rights Bill has a pre-ticked box (in non-compliance with ICO guidance) for email updates. One final point: I note that clicking on the link in the first paragraph of this post, to the Lib Dems’ announcement of the proposed Bill, opens up, or attempts to open up, a pdf file of a consultation paper. This might just be a coding error, but it’s an odd, and dodgy, piece of script.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under consent, Data Protection, Information Commissioner, marketing, PECR, spam

Vidal-Hall v Google, and the rise of data protection ambulance-chasing

Everyone knows the concept of ambulance chasers – personal injury lawyers who seek out victims of accidents or negligence to help/persuade the latter to make compensation claims. With today’s judgment in the Court of Appeal in the case of Vidal-Hall & Ors v Google [2015] EWCA Civ 311 one wonders if we will start to see data protection ambulance chasers, arriving at the scene of serious “data breaches” with their business cards.

This is because the Court has made a definitive ruling on the issue, discussed several times previously on this blog, of whether compensation can be claimed under the Data Protection Act 1998 (DPA) in circumstances where a data subject has suffered distress but no tangible, pecuniary damage. Section 13 of the DPA provides that

(1) An individual who suffers damage by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that damage.

(2) An individual who suffers distress by reason of any contravention by a data controller of any of the requirements of this Act is entitled to compensation from the data controller for that distress if—

(a) the individual also suffers damage by reason of the contravention

This differs from the wording of the European Data Protection Directive 95/46/EC, which, at Article 23(1), says

Member States shall provide that any person who has suffered damage as a result of an unlawful processing operation or of any act incompatible with the national provisions adopted pursuant to this Directive is entitled to receive compensation from the controller for the damage suffered

It can be seen that, in the domestic statutory scheme, “distress” is distinct from “damage”, but in the Directive there is just a single category of “damage”. The position until relatively recently, following Johnson v Medical Defence Union [2007] EWCA Civ 262, had been that “damage” meant pecuniary damage, and this in turn meant, as Buxton LJ said in that case, that “section 13 distress damages are only available if damage in the sense of pecuniary loss has been suffered”. So, absent pecuniary damage, no compensation for distress was available (except in certain specific circumstances involving processing of personal data for journalistic, literary or artistic purposes). But this, said Lord Dyson and Lady Justice Sharp, in a joint judgment, was wrong, and, in any case, they were not bound by Johnson because the relevant remarks in that case were in fact obiter. In fact, they said, section 13(2) DPA was incompatible with Article 23 of the Directive:

What is required in order to make section 13(2) compatible with EU law is the disapplication of section 13(2), no more and no less. The consequence of this would be that compensation would be recoverable under section 13(1) for any damage suffered as a result of a contravention by a data controller of any of the requirements of the DPA

As Christopher Knight says, in a characteristically fine and exuberant piece on the Panopticon blog, “And thus, section 13(2) was no more”.

And this means a few things. It certainly means that it will be much easier for an aggrieved data subject to bring a claim for compensation against a data controller which has contravened its obligations under the DPA in circumstances where there is little, or no, tangible or pecuniary damage, but only distress. It also means that we may well start to see the rise of data protection ambulance chasers – the DPA may not give rise to massive settlements, but it is a relatively easy claim to make – a contravention is often effectively a matter of fact, or is found to be such by the Information Commissioner, or is conceded/admitted by the data controller – and there is the prospect of group litigation (in 2013 Islington Council settled claims brought jointly by fourteen claimants following disclosure of their personal data to unauthorised third parties – the settlement totalled £43,000).

I mentioned in that last paragraph that data controllers sometimes concede or admit to contraventions of their obligations under the DPA. Indeed, they are expected to by the Information Commissioner, and the draft European General Data Protection Regulation proposes to make it mandatory to do so, and to inform data subjects. And this is where I wonder if we might see another effect of the Vidal-Hall case – if data controllers know that by owning up to contraventions they may be exposing themselves to multiple legal claims for distress compensation, they (or their shareholders, or insurers) may start to question why they should do this. Breach notification may be seen as even more of a risky exercise than it is now.

There are other interesting aspects to the Vidal-Hall case – misuse of private information is, indeed, a tort, allowing service of the claims against Google outside the jurisdiction, and there are profound issues regarding the definition of personal data which are undecided and, if they go to trial, will be extremely important – but the disapplying of section 13(2) DPA looks likely to have profound effects for data controllers, for data subjects, for lawyers and for the landscape of data protection litigation in this country.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Breach Notification, damages, Data Protection, Directive 95/46/EC, GDPR, Information Commissioner

A data protection justice gap?

On the 4th March the Supreme Court handed down judgment in the conjoined cases of Catt and T v Commissioner of Police of the Metropolis ([2015] UKSC 9). Almost unanimously (there was one dissenting opinion in Catt) the appeals by the Met were allowed. In brief, the judgments held that the retention of historical criminal conviction data was proportionate. But what I thought was particularly interesting was the suggestion (at paragraph 45) by Lord Sumption (described to me recently as “by far the cleverest man in England”) that T‘s claim at least had been unnecessary:

[this] was a straightforward dispute about retention which could have been more appropriately resolved by applying to the Information Commissioner. As it is, the parties have gone through three levels of judicial decision, at a cost out of all proportion to the questions at stake

and as this blog post suggests, there was certainly a hint that costs might flow in future towards those who choose to litigate rather than apply to the Information Commissioner’s Office (ICO).

But I think there’s a potential justice gap here. Last year the ICO consulted on changing how it handled concerns from data subjects about handling of their personal data. During the consultation period Dr David Erdos wrote a guest post for this blog, arguing that

The ICO’s suggested approach is hugely problematic from a rule of law point of view. Section 42 of the Data Protection Act [DPA] is crystal clear that “any person who is, or believes himself to be, directly affected by any processing of personal data” may make a request for assessment to the ICO “as to whether it is likely or unlikely that the processing has been or is being carried out in compliance with the provisions” of the Act. On receiving such a request the Commissioner “shall make an assessment” (s. 42 (1)) (emphasis added). This duty is an absolute one

but the ICO’s response to the consultation suggested that

We are…planning to make much greater use of the discretion afforded to us under section 42 of the legislation…so long as a data controller has provided an individual with a clear explanation of their processing of personal information, they are unlikely to need to describe their actions again to us if the matter in question does not appear to us to represent a serious issue or we don’t believe there is an opportunity for the data controller to improve their information rights practice

which is problematic, as section 42 confers a discretion on the ICO only as to the manner in which an assessment shall be made. Section 42(3) describes some matters to which he may have regard in determining the manner, and these include (so are not exhaustive) “the extent to which the request appears to him to raise a matter of substance”. I don’t think “a matter of substance” gets close to being the same as “a serious issue”: a matter can surely be non-serious yet still of substance. So if the discretion afforded to the ICO under section 42 as to the manner of the assessment includes a discretion to rely solely on prior correspondence between the data controller and the data subject, this is not specified in (and can only be inferred from) section 42.

Moreover, and interestingly, Article 28(4) of the European Data Protection Directive, which is transposed in section 42 DPA, confers no such discretion as to the manner of assessment, and this may well have been one of the reasons the European Commission began protracted infraction proceedings against the UK (see Chris Pounder blog posts passim).

Nonetheless, the outcome of the ICO consultation was indeed a new procedure for dealing with data subjects’ concerns. Their website now says

Should I raise my concern with the ICO?

If the organisation has been unable, or unwilling, to resolve your information rights concern, you can raise the matter with us.  We will use the information you have provided, including the organisation’s response to your concerns, to decide if your concern provides an opportunity to improve information rights practice.

If we think it does provide that opportunity, we will take appropriate action

“Improving information rights practice” refers to the ICO’s general duties under section 51 DPA. What is notable by its absence there, though, is any statement of the ICO’s duty, under section 42, to make an assessment as to whether it is likely or unlikely that the processing has been or is being carried out in compliance with the provisions of the DPA.

Lord Sumption in Catt (at 34) also said that “Mr Catt could have complained about the retention of his personal data to the Information Commissioner”. This is true, but would the ICO have actually done anything? Would it have represented a “serious issue”? Possibly not – Lord Sumption describes the background to Mrs T’s complaints as a “minor incident” and the retention of her data as a “straightforward dispute”. But if there are hints from the highest court of the land that bringing judicial review proceedings on data protection matters might result in adverse costs, because a complaint to the ICO is available, and if the ICO nonetheless shows reluctance to consider complaints and concerns from aggrieved data subjects, is there an issue with access to data protection justice? Is there a privacy justice gap?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Directive 95/46/EC, Information Commissioner

Parties, party leaders and data protection registration

UPDATE 24.03.15 The ICO has confirmed to me that none of George Galloway, the Respect Party and Nigel Farage has an entry on the statutory register of data controllers (section 19 of the Data Protection Act 1998 refers). Might they, therefore, be committing a criminal offence? Natalie Bennett, not being an elected representative, does not necessarily need to register. END UPDATE

George Galloway, the Respect Party, Nigel Farage and Natalie Bennett all appear not to have an entry in the ICO’s online register of data controllers. Failure to have an entry in the actual register constitutes a criminal offence if no exemption can be claimed.

I’ve written before on the subject of politicians and notification under the Data Protection Act 1998 (DPA). To recap:

Section 17 of the DPA states in broad terms that a data controller (a person who solely or jointly “determines the purposes for which and the manner in which any personal data are, or are to be, processed”) must not process personal data unless “an entry in respect of the data controller is included in the register maintained by the [Information] Commissioner” (IC) or unless a relevant exemption to registration applies. Accordingly (under section 18) a relevant data controller must make a notification to the IC stating (again in broad terms) what data it is processing and for what purposes, and must pay a fee of either £35 or £500 (depending on the size of the organisation which is the controller). Section 19 describes the register itself and also provides that registration lasts for twelve months, after which a renewed notification must be made, with payment of a further fee.

Section 21 creates an offence the elements of which will be made out if a data controller who cannot claim an exemption processes personal data without an entry being made in the register. Thus, if a data controller processes personal data and has not notified the IC either initially or at the point of renewal, that controller will be likely to have committed a criminal offence (there is a defence if the controller can show that he exercised all due diligence to comply with the duty).

Political parties, and members of parliaments, process personal data (for instance of their constituents) in the role of data controller, and cannot avail themselves of an exemption. Thus, they have an obligation to register, and thus it is, for example, that the Prime Minister has this entry in the register

[screenshot of register entry]

and so it is that Stuart Agnew, UKIP Member of the European Parliament, has this entry

[screenshot of register entry]

and so it is that the Liberal Democrats have this entry

[screenshot of register entry]

(all the entries have more information in them than those screenshots show).

But, as I have written before, not all politicians appear to comply with these legal obligations under the DPA. And this morning I noticed lawyer Adam Rose tweeting about the fact that neither George Galloway MP nor his Respect Party appeared to have an entry on the IC register. This certainly seems to be the case, and I took the opportunity to ask Mr Galloway whether it was correct (no response as yet). It is also worth noting that back in 2012 the IC stated that

it appears that the Respect Party has not notified under the DPA at any time since its formation in November 2004….[this has] been brought to the attention of our Non-Notification Team within our Enforcement Department. They will therefore consider what further action is appropriate in the circumstances

It must be borne in mind, however, that non-appearance on the online searchable register is not proof of non-appearance on the actual register. The IC says

It is updated daily. However, due to peaks of work it may be some time before new notifications, renewals and amendments appear in the public register. Please note data controllers are deemed notified from the date we receive a valid form and fee. Therefore the fact that an entry does not appear on the public register does not mean that the data controller is committing a criminal offence

Nonetheless, the online register is there for a purpose – it enables data subjects to get reassurance that those who process their personal data do so lawfully. Non-appearance on the online register is at least cause for concern and the need for clarification from the IC and/or the data controller.

And it is not just Mr Galloway and the Respect Party who don’t appear on the online register. I checked for registrations for some of the other main party leaders: David Cameron, Ed(ward) Miliband and Nick Clegg all have registrations, as do Nicola Sturgeon and Peter Robinson, but Nigel Farage, Leader of UKIP, and Natalie Bennett, Leader of the Green Party, appear not to.

At all times, but especially in the run up to the general election, voters and constituents have a right to have their personal information handled lawfully, and a right to reassurances from politicians that they will do so. For this reason, it would be good to have clarification from Mr Galloway, the Respect Party, Mr Farage and Ms Bennett, as to why they have no entry showing in the IC’s online register. And if they do not have an entry in the register itself, it would be good to have clarification from the IC as to what action might be taken.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Information Commissioner

A cookie for your health problems

Imagine this. You enter a shop (let’s call it Shop A) to browse, and you look at an item of interest (let’s call it Item Q). While you do so, and unbeknown to you, a shop assistant places a sticker on your back, revealing that you looked at this item, and when and where. You leave and a few days later enter another shop, where a shop assistant says “I understand a few days ago you were interested in Item Q, here are some similar items you might be interested in”.

You might initially think “how helpful”, but afterwards you might start to wonder how the second shop knew about your interest, and to think that it’s a bit off that they seemed to have been able to track your movements and interests.

But try this as well. You go to your doctor, because you’re concerned about a medical condition – let’s say you fear you may have a sexually transmitted disease. As you leave, the doctor secretly puts a sticker on your back saying when and where you visited and what you were concerned about. You later visit a pharmacy to buy your lunch. While you queue to pay, an assistant approaches you and says openly “I understand you’ve been making enquiries recently about STDs – here are some ointments we sell”.

The perceptive reader may by now have realised I am clunkily trying to illustrate by analogy how cookies, and particularly tracking cookies, work. We have all come to curse the cookie warning banners we encounter on web sites based in Europe, but the law mandating them (or at least mandating the gaining of some sort of consent to receive cookies) was introduced for a reason. As the Article 29 Working Party of European Data Protection Authorities noted in 2011

Many public surveys showed, and continue to show, that the average internet user is not aware that his/her behaviour is being tracked with the help of cookies or other unique identifiers, by whom or for what purpose. This lack of awareness contrasts sharply with the increasing dependence of many European citizens on access to internet for ordinary everyday activities
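
The mechanics behind that lack of awareness are straightforward. Purely as an illustration – the domains, cookie name and endpoint below are hypothetical, not any real ad network’s code – a third-party “tracking pixel” embedded on two unrelated sites can recognise the same browser on both, exactly like the sticker in the analogy above:

```python
# Hypothetical sketch of third-party tracking via a cookie. The tracker
# domain, cookie name and logging are illustrative assumptions only.
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)

# Imagine this endpoint served from tracker.example and embedded as a 1x1
# image on both shop-a.example and shop-b.example:
#   <img src="https://tracker.example/pixel?page=item-q">
@app.route("/pixel")
def pixel():
    visitor_id = request.cookies.get("uid")
    page = request.args.get("page", "unknown")

    if visitor_id is None:
        # First visit to any site embedding the pixel: issue an identifier.
        visitor_id = uuid.uuid4().hex

    # The browser returns the same "uid" cookie from every site that embeds
    # the pixel, so separate visits can be joined into a single profile.
    print(f"visitor {visitor_id} viewed {page}")

    response = make_response(b"", 204)
    response.set_cookie("uid", visitor_id, max_age=60 * 60 * 24 * 365)
    return response
```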

The amendments to the 2002 EC Directive, implemented in domestic law by amendment regulations to the Privacy and Electronic Communications (EC Directive) Regulations 2003, aimed to ensure that there was “an adequate level of privacy protection and security of personal data transmitted or processed in connection with the use of electronic communications networks” (recital 63). And Article 5 of the Directive specified that

Member States shall ensure that the storing of information, or the gaining of access to information already stored, in the terminal equipment of a subscriber or user is only allowed on condition that the subscriber or user concerned has given his or her consent, having been provided with clear and comprehensive information, in accordance with Directive 95/46/EC [the 1995 Data Protection Directive], inter alia, about the purposes of the processing

Of course, the requirement that users of electronic communications networks should give consent to the storing of or gaining access to information stored in their terminal equipment (i.e. that they should consent to the serving of cookies) has not been an easy one to implement, and even the Information Commissioner’s Office in 2013 rowed back on attempts to gather explicit consent, claiming that there was now no need because people were more aware of the existence of cookies. But I made what to me was an interesting observation recently when I was asked to advise on a cookie notice for a private company: it appeared to me, as I compared competitors’ sites, that those which had a prominent cookie banner warning actually looked more professional than those that didn’t. So despite my client’s wariness about having a banner, it seemed to me that, ironically, it would actually be of some professional benefit.

I digress.

Just what cookies are and can achieve is brought sharply home in a piece on the Fast Company website, drawing on the findings of a doctoral research student at the University of Pennsylvania. The paper, and the article, describe the use of web analytics, often in the form of information gathered from tracking cookies, for marketing in the health arena in the US. Tim Libert, the paper’s author, discovered that

over 90% of the 80,000 health-related pages he looked at on the Internet exposed user information to third parties. These pages included health information from commercial, nonprofit, educational, and government websites…Although personal data is anonymized from these visits, they still lead to targeted advertisements showing up on user’s computers for health issues, as well as giving advertisers leads (which can be deciphered without too much trouble) that a user has certain health issues and what issues those are

The US lacks, of course, federal laws like PECR and the DPA which seek – if imperfectly – to regulate the use of tracking and other cookies. But given that enforcement of the cookie provisions of PECR is largely non-existent, are there similar risks to the privacy of web users’ health information in the UK?

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under consent, cookies, Data Protection, PECR

ACPO: contractor’s error, or data controller’s liability?

I blogged a week or so ago about the worrying fact that the Association of Chief Police Officers (ACPO) were encouraging people to send sensitive personal data over an insecure HTTP connection.

a tweet…by Information Security consultant Paul Moore alerted that ACPO’s criminal records office has a website which invites data subjects to make an online request but, extraordinarily, provides this by an unencrypted http rather than encrypted https connection. This is such a basic data security measure that it’s difficult to understand how it has happened…

Well now, thanks to Dan Raywood of ITSecurity Guru, we have a bit more information about how it did happen. Dan had to chase ACPO several times for a comment, and eventually, after he had run the story, they came back to him with the following comment:

The ACPO Criminal Records Office (ACRO) became aware of the situation concerning the provision of personal data over a HTTP rather than a encrypted HTTPS connection on Tuesday February 24. This was caused by a contractual oversight. The Information Commissioner was immediately advised. The secure HTTPS connection was restored on February 25. We apologise for this matter.

It’s good to know that they acted relatively quickly to secure the connection, although one is rather led to wonder whether or when – had not Paul Moore raised the alert – ACPO would have otherwise noticed the problem.
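
Spotting this sort of slip does not require anything sophisticated. A minimal sketch of an automated check – the URL below is a placeholder rather than ACRO’s actual address, and it assumes the requests and beautifulsoup4 packages are installed – might look like this:

```python
# Illustrative check that a page, and any forms on it, submit over HTTPS.
# The URL below is a placeholder; requests and beautifulsoup4 are assumed.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def insecure_submission_targets(page_url: str) -> list[str]:
    """Return the page URL and/or form action URLs not served over HTTPS."""
    resp = requests.get(page_url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    problems = []
    if urlparse(resp.url).scheme != "https":
        problems.append(resp.url)  # the page itself is served over plain HTTP

    for form in soup.find_all("form"):
        action = urljoin(resp.url, form.get("action", ""))
        if urlparse(action).scheme != "https":
            problems.append(action)  # form data would be posted unencrypted
    return problems

if __name__ == "__main__":
    for url in insecure_submission_targets("http://www.example.org/subject-access"):
        print("Personal data would travel unencrypted to:", url)
```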

But there is potentially a lot of significance in the words “caused by a contractual oversight”. If ACPO are saying that a contractor is responsible for the website, and that it was the contractor’s error which caused the situation, they should also consider the seventh data protection principle in the Data Protection Act 1998 (DPA), which requires a data controller (which ACPO is, in this instance) to take

Appropriate technical and organisational measures…against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data

but also

Where processing of personal data is carried out by a data processor on behalf of a data controller, the data controller must in order to comply with the seventh principle—

(a) choose a data processor providing sufficient guarantees in respect of the technical and organisational security measures governing the processing to be carried out, and

(b) take reasonable steps to ensure compliance with those measures

What this means is that a failure to choose a data processor with appropriate security guarantees, and a failure to make sure the processor complies with those guarantees, can mean that the data controller itself is liable for those failings. If the failings are of a kind likely to cause substantial damage or substantial distress, then there is potential liability to a monetary penalty notice, to a maximum of £500,000, from the Information Commissioner’s Office (ICO).

In truth, the ICO is unlikely to serve a monetary penalty notice solely because of the likelihood of substantial damage or substantial distress – it is much easier to take enforcement action when actual damage or distress has occurred. Nonetheless, one imagines the ICO will be asking searching questions about compliance with the contract provisions of the seventh principle.

Thanks to IT Security Guru for permission to use the ACPO quote. Their story can be seen here.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under 7th principle, Data Protection, data security, Information Commissioner, police

Attend ICO DP conference, get unsolicited marketing from a hotel…

I greatly enjoyed yesterday’s (2 March 2015) Data Protection Practitioner Conference run by the Information Commissioner’s Office. I was representing NADPO on our stand, and the amount of interest was both gratifying and illustrative of the importance of having a truly representative body for professionals working in the field of information rights. NADPO were at pains – in running our prize draw (winners picked at random on stage by Information Commissioner Christopher Graham) – to make sure we let participants know what would or would not happen with their details. Feedback from delegates about this was also positive, and I’m pleased that at least one privacy professional picked up on it. The irony of the following events is therefore not lost on me.

I’d stayed overnight on Sunday, in a Macdonald hotel I booked through the agency Expedia. Naturally, I’m not one to encourage the sending to me of direct electronic marketing, and as the unsolicited sending of such marketing is contrary to regulation 22 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 I didn’t expect to receive any, either from the agent or the hotel. Yet yesterday I did receive some, from the hotel group. So I’ve sent them this complaint:

I booked the hotel through your agent, Expedia.co.uk. As a professional working in the field of privacy and data protection I always make sure I opt out of any electronic marketing. Hence, when making my booking, I checked the Expedia box which said

“Check the box if you do not want to receive emails from Expedia with travel deals, special offers, and other information”.

However, I also consulted their privacy policy, which says:

“Expedia.co.uk may share your information with [suppliers] such as hotel, airline, car rental, and activity providers, who fulfill your travel reservations. Throughout Expedia.co.uk, all services provided by a third-party supplier are described as such. We encourage you to review the privacy policies of any third-party travel supplier whose products you purchase through Expedia.co.uk. Please note that these suppliers also may contact you as necessary to obtain additional information about you, facilitate your travel reservation, or respond to a review you may submit.”

I then consulted Macdonald Hotels’ privacy policy, but this seems to relate only to your website, and is silent on the use of clients’ data passed on by an agent.

Accordingly, I cannot be said to have consented to the sending by you to me of electronic marketing. Yet yesterday at 13.07 I received an email saying “Thank you for registering with Macdonald Hotels and Resorts…As a member of our mailing list you will shortly start to receive [further unsolicited electronic marketing].”

Ironically enough, I was in Manchester to attend the annual Data Protection Practitioners’ Conference run by the Information Commissioner’s Office (ICO). As you will be aware, the ICO regulates compliance with the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR). Before I raise a complaint with the ICO I would appreciate a) your removing me from any marketing database b) not receiving any further unsolicited marketing, and c) receiving your comments regarding your apparent breach of your legal obligations.

Each instance of unsolicited marketing is one of life’s minor irritants, but I have concerns that, because of this, some companies treat compliance with their legal obligations at best as a game in which they try to trick customers into agreeing to receive marketing, and at worst as unnecessary. It may be that I received this particular unsolicited marketing from Macdonald Hotels by mistake (although that in itself might raise data protection concerns about the handling and accuracy of customer data) but it happens too often. The media have rightly picked up on the forthcoming changes to PECR which will make it easier for the ICO to take enforcement action over serious contraventions, but, sadly, I don’t see the lower-level, less serious contraventions decreasing.

The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under consent, Data Protection, Information Commissioner, marketing, PECR