Around this time last year I wrote two blog posts about two separate police forces’ decisions to tweet the names of drivers charged with (but not – yet, at least – convicted of) drink driving offences. In the latter example Staffordshire police were actually using a hashtag #drinkdriversnamedontwitter, and I argued that
If someone has merely been charged with an offence, it is contrary to the ancient and fundamental presumption of innocence to shame them for that fact. Indeed, I struggle to understand how it doesn’t constitute contempt of court to do so, or to suggest that someone who has not been convicted of drink-driving is a drink driver. Being charged with an offence does not inevitably lead to conviction. I haven’t been able to find statistics relating to drink-driving acquittals, but in 2010 16% of all defendants dealt with by magistrates’ courts were either acquitted or not proceeded against
The Information Commissioner’s Office investigated whether there had been a breach of the first principle of Schedule One of the Data Protection Act 1998 (DPA), which requires that processing of personal data be “fair and lawful”, but decided to take no action after Staffs police agreed not to use the hashtag again, saying
Our concern was that naming people who have only been charged alongside the label ‘drink-driver’ strongly implies a presumption of guilt for the offence. We have received reassurances from Staffordshire Police the hashtag will no longer be used in this way and are happy with the procedures they have in place. As a result, we will be taking no further action.
But my first blog post had raised questions about whether the mere naming of those charged was in accordance with the same DPA principle. Newspaper articles talked of naming and “shaming”, but where is the shame in being charged with an offence? I wondered why Sussex police didn’t correct those newspapers who attributed the phrase to them.
And this year Sussex police, as well as neighbouring Surrey, and Avon and Somerset, are doing the same thing: naming drivers charged with drink driving offences on twitter or elsewhere online. The media happily describe this as a “naming and shaming” tactic, and I have not seen the police disabusing them, although Sussex police did at least enter into a dialogue with me and others on twitter, in which they assured us that their actions were in pursuit of open justice and that they were not intending to shame people. However, this doesn’t appear to tally with the understanding of the Sussex Police and Crime Commissioner, who said earlier this year
I am keen to find out if the naming and shaming tactic that Sussex Police has adopted is actually working
But I also continue to question whether the practice is in accordance with police forces’ obligations under the DPA. Information relating to the commission or alleged commission by a person of an offence is that person’s sensitive personal data, and for processing to be fair and lawful a condition in both Schedule Two and, particularly, Schedule Three must be met. And I struggle to see which Schedule Three condition applies – the closest is probably
The processing is necessary…for the administration of justice
while the adjective “necessary”, within the meaning of article 10(2) [of the European Convention on Human Rights] is not synonymous with “indispensable”, neither has it the flexibility of such expressions as “admissible”, “ordinary”, “useful”, “reasonable” or “desirable” and that it implies the existence of a “pressing social need.”
should reflect the meaning attributed to it by the European Court of Human Rights when justifying an interference with a recognised right, namely that there should be a pressing social need and that the interference was both proportionate as to means and fairly balanced as to ends
This is another tool in our campaign to stop people driving while under the influence of drink or drugs. If just one person is persuaded not to take to the road as a result, then it is worthwhile as far as we are concerned.
and Sussex police’s Chief Inspector Natalie Moloney says
I hope identifying all those who are to appear in court because of drink or drug driving will act as a deterrent and make Sussex safer for all road users
which firstly fails to use the word “alleged” before “drink or drug driving”, and secondly – like Supt Corrigan’s statement – suggests that the purpose of naming is not to promote open justice, but rather to deter drink drivers.
Deterring drink driving is certainly a worthy public aim (and I stress that I have no sympathy whatsoever with those convicted of such offences), but should the sensitive personal data of those who have not been convicted of any offence be used to their detriment in pursuance of that aim?
I worry that unless such naming practices are scrutinised, and challenged when they are unlawful and unfair, the practice will spread, and social “shame” will increasingly be visited on the innocent. I hope the Information Commissioner investigates.
The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
Filed under Data Protection, human rights, Information Commissioner, Open Justice, police, social media
ICO confirm they are considering enforcement action over #samaritansradar app
FOI response from ICO refuses disclosure of correspondence with Samaritans because it could prejudice ongoing investigations
On 12 November I asked the Information Commissioner’s Office to disclose to me, under the Freedom of Information Act (FOIA) information relating to their assessment of the legality of the “Samaritans Radar” app (see blog posts passim).
The ICO have now responded to me, refusing to disclose because of the FOIA exemption for “law enforcement”. As the ICO say
The exemption at section 31(1)(g) of the FOIA refers to circumstances where the disclosure of information “would, or would be likely to, prejudice – … the exercise by any public authority of its functions for any of the purposes specified in subsection (2).”

The purposes referred to in sections 31(2)(a) and (c) are –

“(a) the purpose of ascertaining whether any person has failed to comply with the law” and

“(c) the purpose of ascertaining whether circumstances which would justify regulatory action in pursuance of any enactment exist or may arise…”

Clearly, these purposes apply when the Information Commissioner is considering whether or not an organisation has breached the Data Protection Act
But the exemption is subject to a public interest test, and the ICO acknowledge that there is public interest in the matter, particularly in how Samaritans have responded to their enquiries. Nonetheless, as the investigation is ongoing, and as no decision has apparently been made about whether enforcement action should be taken, the balance in the public interest test falls on the side of non-disclosure.
The question of potential enforcement action is an interesting one. Although the ICO have power to serve monetary penalty notices (to a maximum of £500,000), they can also issue enforcement notices, requiring organisations (who are data controllers, as I maintain Samaritans were for the app) to cease or not begin processing personal data for specific purposes. They can also ask data controllers to sign undertakings to take or not take specific action. This is of interest because Samaritans have indicated that they might want to launch a reworked version of the app.
It is by no means certain that enforcement action will result – the ICO are likely to be reluctant to enforce against a generally admirable charity – but the fact that it is being considered is in itself of interest.
The ICO acknowledge that the public interest in maintaining this particular exemption wanes once the specific investigation has been completed. Consequently I have asked them, outwith FOIA, to commit to disclosing this information proactively once the investigation has finished. They have no obligation to do so, but it would be to the benefit of public transparency, which their office promotes, if they did.
The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
PARKLIFE! (and a £70k monetary penalty)
In August this year I reported that the Information Commissioner’s Office (ICO) had effectively conceded it had no current powers to issue monetary penalties on spam texters. This was after the Upper Tribunal had indicated that in most cases the sending of such texts was not likely to cause substantial damage or substantial distress (this being part of the statutory test for serving a monetary penalty notice (MPN) for a serious contravention of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR)).
What I’d forgotten were the reports of highly distasteful and in some cases highly distressing texts sent in May to festival-goers by the organisers of the Parklife festival in Manchester’s Heaton Park. The texts didn’t disclose that they were from the event organisers, but instead purported to come from “Mum” and were advertising extra events at the festival.
Regulation 23 of PECR outlaws the sending of direct marketing texts (and other direct marketing electronic communications) where the sender’s identity has been disguised or concealed.
As the Manchester Evening News reported at the time, receiving the texts in question left many recipients who had lost their mothers distressed and upset.
And so it came to pass that, as the same newspaper reveals today, the ICO investigated complaints about the marketing, and appears to have determined that the sending of the texts was a serious contravention of PECR regulation 23, and one of a kind likely to cause substantial distress. The paper reveals that an MPN of £70,000 has been served on the organisers, and the ICO has confirmed this on its website; the MPN itself lists a number of the complaints made by affected recipients.
So, I, and the ICO’s Steve Eckersley, were wrong – powers to serve MPNs for spam texts do still exist, although it must be said that this was an exceptional case: most spam texts are irritating, rather than as callous and potentially distressing as these. And this is why the Ministry of Justice is, as I have previously discussed, consulting on lowering, or dropping altogether, the “harm threshold” for serving MPNs for serious PECR contraventions.
The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
Does Simon Hughes really want to receive FOI complaints?
At an event on the evening of 26 November, held to celebrate (slightly early) the ten-year anniversary of the Freedom of Information Act 2000 (FOIA), the Minister of State for Justice and Civil Liberties, Simon Hughes, appeared to offer to take on part of the Information Commissioner’s regulatory role.
The event, hosted at the RSA by the Commissioner himself, brought together a panel of FOIA luminaries consisting of Deputy Information Commissioner Graham Smith, the BBC’s Martin Rosenbaum, Scottish Information Commissioner Rosemary Agnew and Hughes himself. In response to a question from the floor about the considerable delays and obstructiveness by certain public authorities in dealing with FOIA requests, Hughes invited people to send him examples, so that he could start to compile data on compliance (of the sort already being compiled by Agnew’s office).
Eyebrows were raised by astute attendees at the event (and possibly on the panel): dealing with miscreant public authorities is a role clearly assigned to the Information Commissioner. For the Minister to invite complaints seems to risk usurping that role. One wonders if he knows what he’s let himself in for.
Watching the detective
The ICO might be mostly powerless to take action against the operators of the Russian web site streaming unsecured web cams, but the non-domestic users of the web cams could be vulnerable to enforcement action
The Information Commissioner’s Office (ICO) warned yesterday of the dangers of failing to secure web cams which are connected to the internet. This was on the back of stories about a Russian-based web site which aggregates feeds from thousands of compromised cameras worldwide.
This site was drawn to my attention a few weeks ago, and, although I tweeted obliquely about it, I thought it best not to identify it because of the harm it could potentially cause. However, although most news outlets didn’t identify the site, the cat is now, as they say, out of the bag. No doubt this is why the ICO chose to issue sensible guidance on network security in its blog post.
I also noticed that the Information Commissioner himself, Christopher Graham, rightly pointed to the difficulties in shutting down the site, and the fact that it is users’ responsibility to secure their web cams:
It is not within my jurisdiction, it is not within the European Union, it is Russia.
I will do what I can but don’t wait for me to have sorted this out.
This is, of course, true, and domestic users of web cams would do well to note the advice. Moreover, this is just the latest of these aggregator sites to appear. But news reports suggested that some of the 500-odd (or was it 2000-odd?) feeds on the site from the UK were from cameras of businesses or other non-domestic users (I saw a screenshot, for instance, of a feed from a pizza takeaway). Those users, if their web cams are capturing images of identifiable individuals, are processing personal data in the role of a data controller. And they can’t claim the exemption in the Data Protection Act 1998 (DPA) that applies to processing for purely domestic purposes. They must, therefore, comply with the seventh data protection principle, which requires them to take appropriate measures to safeguard against unauthorised or unlawful processing of personal data. Allowing one’s web cam to be compromised and its feed streamed on a Russian website is a pretty good indication that one is not complying with the seventh principle. Serious contraventions of the obligation to comply with the data protection principles can, of course, lead to ICO enforcement action, such as monetary penalty notices, to a maximum of £500,000.
The ICO is not, therefore, completely powerless here. Arguably it should be (maybe it is?) looking at the feeds on the site to determine which are from non-domestic premises, and looking to take appropriate enforcement action against them. So to that extent, one is rather watching Mr Graham, to see if he can sort this out.
Filed under Data Protection, Information Commissioner, Privacy
The voluntary data controller
One last post on #samaritansradar. I hope.
I am given to understand that Samaritans, having pulled their benighted app, have begun responding to the various legal notices people served on them under the Data Protection Act 1998 (specifically, these were notices under section 7 (subject access), section 10 (right to prevent processing likely to cause damage or distress) and section 12 (rights in relation to automated processing)). I tweeted my section 12 notice, but I doubt I’ll get a response to that, because they’ve never engaged with me once on twitter or elsewhere.
However, I have been shown a response to a section 7 request (which I have permission to blog about) and it continues to raise questions about Samaritans’ handling of this matter (and indeed, their legal advice – which hasn’t been disclosed, or even really hinted at). The response, in relevant part, says
We are writing to acknowledge the subject access request that you sent to Samaritans via DM on 6 November 2014. Samaritans has taken advice on this matter and believe that we are not a data controller of information passing through the Samaritans Radar app. However, in response to concerns that have been raised, we have agreed to voluntarily take on the obligations of a data controller in an attempt to facilitate requests made as far as we can. To this end, whilst a Subject Access Request made under the Data Protection Act can attract a £10 fee, we do not intend to charge any amount to provide information on this occasion.
So, Samaritans continue to deny being data controller for #samaritansradar, although they also continue merely to offer assertions, not any legal analysis. But, notwithstanding their belief that they are not a controller, they are taking on the obligations of a data controller.
I think they need to be careful. A person who knowingly discloses personal data without the consent of the data controller potentially commits a criminal offence under section 55 DPA. One can’t just step in, grab personal data and start processing it, without acting in breach of the law. Unless one is a data controller.
And, in seriousness, this purported adoption of the role of “voluntary data controller” just bolsters the view that Samaritans have been data controllers from the start, for reasons laid out repeatedly on this blog and others. They may have acted as joint data controller with users of the app, but I simply cannot understand how they can claim not to have been determining the purposes for which and the manner in which personal data were processed. And if they were, they were a data controller.
Filed under Data Protection, social media
Do your research. Properly
Campaigning group Big Brother Watch have released a report entitled “NHS Data Breaches”. It purports to show the extent of such “breaches” within the NHS. However it fails properly to define its terms, and uses very questionable methodology. I think, most worryingly, this sort of flawed research could lead to a reluctance on the part of public sector data controllers to monitor and record data security incidents.
As I checked my news alerts over a mug of contemplative coffee last Friday morning, the first thing I noticed was an odd story from a Bedfordshire news outlet:
Bedford Hospital gets clean bill of health in new data protection breach report, unlike neighbouring counties…From 2011 to 2014 the hospital did not breach the data protection act once, unlike neighbours Northampton where the mental health facility recorded 346 breaches, and Cambridge University Hospitals which registered 535 (the third worst in the country).
Elsewhere I saw that one NHS Trust had apparently breached data protection law 869 times in the same period, but many others, like Bedford Hospital, had not done so once. What was going on – are some NHS Trusts so much worse in terms of legal compliance than others? Are some staffed by people unaware of and unconcerned about patient confidentiality? No. What was going on was that campaigning group Big Brother Watch had released a report with flawed methodology, a misrepresentation of the law and flawed conclusions, which I fear could actually lead to poorer data protection compliance in the future.
I have written before about the need for clear terminology when discussing data protection compliance, and of the confusion which can be caused by sloppiness. The data protection world is very fond of the word “breach”, or “data breach”, and it can be a useful term to describe a data security incident involving compromise or potential compromise of personal data, but confusion arises because it can also be used to describe, or assumed to apply to, a breach of the law – a breach of the Data Protection Act 1998 (DPA). But a data security incident is not necessarily a breach of a legal obligation in the DPA: the seventh data protection principle in Schedule One requires that
Appropriate technical and organisational measures shall be taken [by a data controller] against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data
And section 4(4) of the DPA obliges a data controller to comply with the Schedule One data protection principles. This means that when appropriate technical and organisational measures are taken but unauthorised or unlawful processing, or accidental loss or destruction of, or damage to, personal data nonetheless occurs, the data controller is not in breach of its obligations (at least under the seventh principle). This distinction between a data security incident, and a breach, or contravention, of legal obligations, is one that the Information Commissioner’s Office (ICO) itself has sometimes failed to appreciate (as the First-tier Tribunal found in the Scottish Borders Council case EA/2012/0212). Confusion only increases when one takes into account that under The Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR) which are closely related to the DPA, and which deal with data security in – broadly – the telecoms arena, there is an actual legislative provision (regulation 2, as amended) which talks in terms of a “personal data breach”, which is
a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed in connection with the provision of a public electronic communications service
and regulation 5A obliges a relevant data controller to inform the ICO when there has been a “personal data breach”. It is important to note, however, that a “personal data breach” under PECR will not be a breach, or contravention, of the seventh DPA data protection principle, provided the data controller took appropriate technical and organisational measures to safeguard the data.
Things get even more complex when one bears in mind that the draft European General Data Protection Regulation proposes a similar approach as PECR, and defines a “personal data breach” in similar terms as above (simply removing the words “in connection with the provision of a public electronic communications service“).
Notwithstanding this, the Big Brother Watch report is entitled “NHS Data Breaches”, so one would hope that it would have been clear about its own terms. It has led to a lot of coverage, with media outlets picking up on headline-grabbing claims of “7225 breaches” in the NHS between 2011 and 2014, which is equivalent to “6 breaches a day”. But when one looks at the methodology used, serious questions are raised about the research. It used Freedom of Information requests to all NHS Trusts and Bodies, and the actual request was in the following terms
1. The number of a) medical personnel and b) non-medical personnel that have been convicted for breaches of the Data Protection Act.
2. The number of a) medical personnel and b) non-medical personnel that have had their employment terminated for breaches of the Data Protection Act.
3. The number of a) medical personnel and b) non-medical personnel that have been disciplined internally but have not been prosecuted for breaches of the Data Protection Act.
4. The number of a) medical personnel and b) non-medical personnel that have resigned during disciplinary procedures.
5. The number of instances where a breach has not led to any disciplinary action.
The first thing to note is that, in broad terms, the only way an individual NHS employee can “breach the Data Protection Act” is by committing the criminal offence under section 55 of unlawfully obtaining personal data without the consent of the (employer) data controller. All the other relevant legal obligations under the DPA are ones attaching to the NHS body itself, as data controller. Thus, by section 4(4), it is the NHS body which has an obligation to comply with the data protection principles in Schedule One of the DPA, not individual employees. And so, except in the most serious of cases, where an employee acts without the consent of the employer to unlawfully obtain personal data, individual employees, whether medical or non-medical personnel, cannot as a matter of law “breach the Data Protection Act”.
One might argue that it is easy to infer that what Big Brother Watch meant to ask for was information about the number of times when actions of individual employees meant that their employer NHS body had breached its obligations under the DPA, and, yes, that is probably what was meant, but the incorrect terms and lack of clarity vitiated the purported research from the start. This is because NHS bodies have to comply with the NHS/Department of Health Information Governance Toolkit. This toolkit actually requires NHS bodies to record serious data security incidents even where those incidents did not, in fact, constitute a breach of the body’s obligations under the DPA (i.e. incidents might be recorded which were “near misses”, or which did not constitute a failure of the obligation to comply with the seventh, data security, principle).
The results Big Brother Watch got in response to their ambiguous and inaccurately termed FOI request show that some NHS bodies clearly interpreted it expansively, to encompass all data security incidents, while others – those with zero returns in any of the fields, for instance – clearly interpreted it restrictively. In fact, in at least one case an NHS Trust highlighted that its return included “near misses”, but these were still categorised by Big Brother Watch as “breaches”.
And this is not unimportant: data security and data protection are of immense importance in the NHS, which has to handle huge amounts of highly sensitive personal data, often under challenging circumstances. Awful contraventions of the DPA do occur, but so too do individual and unavoidable instances of human error. The best data controllers will record and act on the latter, even though they don’t give rise to liability under the DPA, and they should be applauded for doing so. Naming and shaming NHS bodies on the basis of such flawed research methodology might well achieve Big Brother Watch’s aim of publicising its call for greater sanctions for criminal offences, but I worry that it might lead to some data controllers being wary of recording incidents, for fear that they will be disclosed and misinterpreted in the pursuit of questionable research.
Filed under Data Protection, Freedom of Information, Information Commissioner, NHS
A strict test for compliance with access to information laws
The High Court has quashed planning permission for a wind turbine because the Council involved failed to make information available beforehand, in breach of its legal obligations
The statutory rights to information held by public authorities which commenced in January 2005 – when the Freedom of Information Act 2000 and the Environmental Information Regulations 2004 came into effect – are not the only legal mechanism whereby people can or must have public information imparted to them. For instance, sections 100A-E of the Local Government Act 1972 (as inserted by the Local Government (Access to Information) Act 1985) deal with access to meetings of and information relating to meetings of specified local authorities (broadly, County, Borough, District, City or Unitary Councils). Section 100B deals with access to agendas and reports and section 100D with access to background papers. In both cases these must be “open to inspection by members of the public at the offices of the council” at least five clear days before the meeting (“clear days” refers to weekday working days and does not include the day of publication or the day of the meeting (R v Swansea City Council, ex p Elitestone Ltd (1993) 66 P. & C.R. 422)).
But what happens if these obligations are not complied with? What, for example, happens if background papers are not available for inspection for five clear days before a meeting? Often, nothing happens at all, but sometimes such a failure can be significant and costly. In a recent case (Joicey, R (on the Application of) v Northumberland County Council [2014] EWHC 3657) this is exactly what transpired. A planning application for a wind turbine was at issue[1], with a meeting scheduled for 5 November 2013 to consider it. The judgment informs us that “the officer’s report recommending approval…subject to conditions, was made available on 23 October” (it is not clear whether this means made available only for inspection, or whether it was also available on the Council’s website, although nothing turns on this). A Dr Ferguson, opposing the application (and a friend of the applicant Mr Joicey), noticed from the officer’s report that an external noise assessment report had been commissioned and produced. He emailed the Council on 30 October asking about the noise assessment report, getting no immediate reply, and attended the Council offices on 1 November to inspect the files, but no noise assessment report was included. On 4 November, the day before the committee meeting, he received a reply to his 30 October email, with a copy of the noise assessment report attached. The same day a copy of the report was uploaded to the Council website.
The committee approved the application, despite Mr Joicey addressing the meeting in the following terms
Noise impact assessment has been carried out again, in full, for this application, but I don’t suppose any of you have seen it, because this highly relevant document (74 pages of it) appeared only yesterday, and that was after requests to see it. If you study it, and you are properly armed with the knowledge of previous planning history connected with this site, you will find that it is actually fundamentally flawed, again, and that it shows that this application must actually be refused on noise grounds.
Mr Joicey brought judicial review proceedings on six grounds, but the one which concerns us here is the first: the non-availability of the noise assessment. As the noise assessment report was not included in a list of the background papers for the report to the committee, and was not available for inspection five clear days before the meeting, there was, said Mr Justice Cranston
no doubt that there were a number of breaches of the public’s right to know under the Local Government Act 1972
Furthermore, the fact that the report was not available on the Council’s website was a breach of its undertakings in its Statement of Community Involvement (SCI) prepared pursuant to its obligations under section 18(1) of the Planning and Compulsory Purchase Act 2004. The Council’s SCI stated that “Once a valid planning application has been received we will…Publish details of the application with supporting documentation on the council website.” The Council even conceded that, although the report had been uploaded on 4 November, it had been described as published on 9 September, and the judge took a “dim view of any public authority backdating a document in a manner which could give a false impression to the public”. The undertaking in the SCI went further, said the judge, than the statutory obligations in the 1972 Act, and constituted a continuing promise giving rise to a legitimate expectation on the part of the public, and “otherwise the public’s right to know what is being proposed regarding a planning application would be frustrated”.
But what was the effect of these failings? The Council submitted that no prejudice had been caused to the claimant, because the planning committee’s decision had been inevitable and, adopting the test in Bolton MBC v Secretary of State for the Environment (1990) 61 P. & C.R. 343, if the court was uncertain whether, absent the failings, there would have been a real possibility of a different decision being made, there was no basis for concluding that it was invalid. However, Mr Justice Cranston held that the correct test was different: drawing on the authorities of Simplex GE Holdings Ltd v Secretary of State for the Environment (1988) 3 PLR 25 and R (on the application of Holder) v Gedling Borough Council [2014] EWCA Civ 599 he said that
the claimant will be entitled to relief unless the decision-maker can demonstrate that the decision it took would inevitably have been the same had it complied with its statutory obligation to disclose information in a timely fashion [emphasis not in original]
And in this case the Council failed to persuade him that the decision would inevitably have been the same if the noise assessment report had been made available earlier: the issue of noise had been a key one in earlier challenges to the developments and remained so now, and Mr Joicey could have made further representations and sought further expert opinion which might have persuaded the planning committee.
Some of Mr Joicey’s other grounds of challenge succeeded, and some failed, but the merits of the successful challenges led to the planning permission being quashed.
Local authorities would do well to note the strictness of the test here: breaches of the access-to-information provisions of the Local Government Act 1972, or of the undertakings in a Statement of Community Involvement, will mean that decisions taken are liable to be quashed upon challenge, unless the decision would inevitably have been the same without the breaches. Inevitability is a hard thing to prove.
1Northumberland County Council, despite its name, is a unitary authority, and, therefore, a local planning authority
The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
Filed under local government, transparency
So farewell then #samaritansradar…
…or should that be au revoir?
With an interestingly timed announcement (18:00 on a Friday evening) Samaritans conceded that they were pulling their much-heralded-then-much-criticised app “Samaritans Radar”, and, as if some of us didn’t feel conflicted enough criticising such a normally laudable charity, their Director of Policy Joe Ferns managed to get a dig in, hidden in what was purportedly an apology:
We are very aware that the range of information and opinion, which is circulating about Samaritans Radar, has created concern and worry for some people and would like to apologise to anyone who has inadvertently been caused any distress
So, you see, it wasn’t the app, and the creepy feeling of having all one’s tweets closely monitored for potentially suicidal expressions, which caused concern and worry and distress – it was all those nasty people expressing a range of information and opinion. Maybe if we’d all kept quiet the app could have continued on its unlawful and unethical merry way.
However, although the app has been pulled, it doesn’t appear to have gone away:
We will…be testing a number of potential changes and adaptations to the app to make it as safe and effective as possible for both subscribers and their followers
There is a survey at the foot of this page which seeks feedback and comment. I’ve completed it, and would urge others to do so. I’ve also given my name and contact details, because one of my main criticisms of the launch of the app was that there was no evidence that Samaritans had taken advice from anyone on its data protection implications – and I’m happy to do so for no fee. As Paul Bernal says, “[Samaritans] need to talk to the very people who brought down the app: the campaigners, the Twitter activists and so on”.
Data protection law’s place in our digital lives is of profound importance, and of profound interest to me. Let’s not forget that its genesis in the 1960s and 1970s lay in the concerns raised by the extraordinary advances that computing brought to data analysis. For me, some of the most irritating counter-criticism during the recent online debates about Samaritans Radar came from people who equated what the app did to mere searching of tweets, or searching for keywords. As I said before, the sting of this app lay in the overall picture – it was developed, launched and promoted by Samaritans – and in the overall processing of data which went on – it monitored tweets, identified potentially worrying ones and pushed this information to a third party, all without the knowledge of the data subject.
But also irritating were comments from people who told us that other organisations do similar analytics for commercial reasons, so why, the implication went, shouldn’t Samaritans do it for virtuous ones? It is no secret that an enormous amount of analysis of information on social media takes place, and people should certainly be aware of this (see Adrian Short’s excellent piece here for some explanation), but the fact that it can and does take place a) doesn’t mean that it is necessarily lawful, nor that the law is impotent within the digital arena, and b) doesn’t mean that it is necessarily ethical. And for both those reasons Samaritans Radar was an ill-judged experiment that should never have taken place as it did. If any replacement is to be both ethical and lawful, a lot of work, and a lot of listening, needs to be done.
The views in this post (and indeed all posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.
Filed under Data Protection, social media
