Tag Archives: directive 95/46/EC

Do bloggers need to register with the ICO?

A strict reading of data protection law suggests many (if not all) bloggers should register with the ICO, even though the ICO itself disagrees. And, I argue, the proposal for an Information Rights Levy risks being notification under a different name.

Part III of the Data Protection Act 1998 (DPA) gives domestic effect to Article 18 of the European Data Protection Directive (the Directive). It describes the requirement that data controllers notify the fact that they are processing personal data, and the details of that processing, to the Information Commissioner’s Office (ICO). It is, on one view, a rather quaint throwback to the days when processing of personal data was seen as an activity undertaken by computer bureaux (a term found in the predecessor Data Protection Act 1984). However, it is law which is very much in force, and processing personal data without a valid notification, in circumstances where the data controller had an obligation to notify, is a criminal offence (section 21(1) DPA). Moreover, it is an offence which is regularly prosecuted by the ICO (eleven such prosecutions so far this year).

These days, it is remarkably easy to find oneself in the position of being a data controller (“a person who (either alone or jointly or in common with other persons) determines the purposes for which and the manner in which any personal data are, or are to be, processed”). There are, according to the ICO, more than 370,000 data controllers registered. Certainly, if you are a commercial enterprise which in any way electronically handles personal data of customers or clients, it is almost inevitable that you will be a data controller with an obligation to register. The exemptions from registering are laid out in regulations, and are quite restrictive – they are, in the main, the following (wording taken from the ICO Notification Handbook):

Data controllers who only process personal information for: staff administration (including payroll); advertising, marketing and public relations (in connection with their own business activity); and accounts and records.
Some not-for-profit organisations.
Maintenance of a public register.
Processing personal information for judicial functions.
Processing personal information without an automated system such as a computer.
But there is one other, key exemption. This is not within the notification regulations, but at section 36 of the DPA itself, and it exempts personal data from the whole of the Act if it is
processed by an individual only for the purposes of that individual’s personal, family or household affairs (including recreational purposes)
Thus, if you, for instance, keep a record of your children’s medical histories on your home computer, you are not caught by any of the DPA (and not required to notify the ICO).

Where this becomes interesting (it does become interesting, honestly) is when the very expansive interpretation the ICO gives to this “domestic purposes exemption” is considered in view of the extent to which people’s domestic affairs – including recreational purposes – now take place in a more public sphere, with large amounts of information happily published by individuals on social media. As I have written elsewhere, the Court of Justice of the European Union (CJEU) held in 2003, in the Lindqvist case, that the publishing of information on the internet could not be covered by the relevant domestic purposes exemption in the Directive. The ICO and the UK have, ever since, been in conflict with this CJEU authority, a point illustrated by the trenchant criticism delivered by Tugendhat J in the High Court in The Law Society v Kordowski.

But I think there is an even starker illustration of the implications of an expansive interpretation of the section 36 exemption, and I provide it. On this blog I habitually name and discuss identifiable individuals – this is processing of personal data, and I determine the purposes for which, and the manner in which, that personal data is processed. Accordingly, I am a data controller, according to the definitions at section 1(1) of the DPA. So, do I need to notify my processing to the ICO? The answer, according to the ICO, is “no”. They tell me

from the information you have provided it would be unlikely that you would be required to register in respect of your blogs and tweets
But I don’t understand this. I cannot see any exemption which applies to my processing – unless it is section 36. But in what way can I seriously claim that I am processing personal data only for my domestic (including recreational) purposes? Yes, blogging about information rights is partly a recreation for me (some might say that makes me odd), but I cannot pretend that I have no professional aims and purposes in doing so. Accordingly, the processing cannot be only for domestic purposes.

I have asked the ICO to confirm what, in their view, exempts me from notification. I hope they can point me to something I have overlooked because, firstly, anything that avoids my having to pay an annual notification fee of £35 would be welcome, and secondly, I find it rather uncomfortable to be on the receiving end of my own analysis that I’m potentially committing a criminal offence, even if the lead prosecutor assures me I’m not.

The point about the notification fee leads me on to a further issue. As I say above, notification is in some ways rather quaint – it harks back to days when processing of personal data was a specific, discrete activity, and looks odd in a world where, with modern technology, millions of activities every day meet the definition of “processing personal data”. No doubt for these reasons, the concept of notification to a data protection authority is missing from the draft General Data Protection Regulation (GDPR) currently slouching its way through the European legislative process. However, a proposal by the ICO suggests that, at least in the domestic sphere, notification, in another guise, might remain under the new law.

The ICO, faced with the fact that its main funding stream (the annual notification fees from those 370,000-plus data controllers) would disappear if the GDPR were passed in its proposed form, is lobbying for an “information rights levy”. Christopher Graham said earlier this year

I would have thought an information rights levy, paid for by public authorities and data controllers [is needed]. We would be fully accountable to Parliament for our spending.

and the fact that this proposal made its way into the ICO’s Annual Report, with Graham saying that Parliament needs to “get on with the task” of establishing the levy, suggests that it might well be something the Ministry of Justice agrees with. As the MoJ would be first in line to have to make up the funding shortfall if a levy wasn’t introduced, it is not difficult to imagine it becoming a reality.

On one view, a levy makes perfect sense – a “tax” on those who process personal data. But looked at another way, it will potentially become another outmoded means of defining what a data controller is. One cannot imagine that, for instance, bloggers and other social media users will be expected to pay it, so it is likely that, in effect, those data controllers whom the ICO currently expects to notify will be those who are required to pay the levy. One imagines, also, that pour encourager les autres, it might be made a criminal offence not to pay the levy in circumstances where a data controller should pay it but fails to do so. In reality, will it just be a mirror-image of the current notification regime?

And will I still be analysing my own blogging as being processing that belongs to that regime, but with the ICO, for pragmatic, if not legally sound, reasons, deciding the opposite?


Filed under Data Protection, Directive 95/46/EC, Europe, GDPR, parliament

DVLA, disability and personal data

Is the DVLA’s online vehicle-checker risking the exposure of sensitive personal data of registered keepers of vehicles?

The concept of “personal data”, in the Data Protection Act 1998 (DPA) (and, beyond, in the European Data Protection Directive 95/46/EC) can be a slippery one. In some cases, as the Court of Appeal recognised in Edem v The Information Commissioner & Anor [2014] EWCA Civ 92, where it had to untangle a mess that the First-tier Tribunal had unnecessarily got itself into, it is straightforward: someone’s name is their personal data. In other cases, especially those which engage the second limb of the definition in section 1(1) of the DPA (“[can be identified] from those data and other information which is in the possession of, or is likely to come into the possession of, the data controller”), it can be profoundly complex (see the House of Lords in Common Services Agency v Scottish Information Commissioner (Scotland) [2008] UKHL 47, a judgment which, six years on, still makes data protection practitioners wake up in the night screaming).

When I first looked at the reports that the DVLA’s Vehicle Tax Check service enabled people to see whether the registered owner of a car was disabled, I thought this might fall into the complex category of data protection issues. On reflection, I think it’s relatively straightforward.

I adopt the excellent analysis by the benefitsandwork.co.uk site:

A new vehicle check service on the DVLA website allows visitors to find out whether their neighbours are receiving the higher rate of the mobility component of disability living allowance (DLA) or either rate of the mobility component of personal independence payment (PIP)…The information that DVLA are making available is not about the vehicle itself. Instead they are publishing personal information about the benefits received by the individual who currently owns the car or for whom the car is solely used.

It’s difficult to argue against this, although it appears the DVLA are trying, because they responded to the initial post by saying

The Vehicle Enquiry Service does not include any personal data. It allows people to check online what information DVLA holds about a vehicle, including details of the vehicle’s tax class to make sure that local authorities and parking companies do not inadvertently issue parking penalties where parking concessions apply. There is no data breach – the information on a vehicle’s tax class that is displayed on the Vehicle Enquiry Service does not constitute personal data. It is merely a descriptive word for a tax class

but, as benefitsandwork say, that is only true insofar as the DVLA are publishing the tax band of the car; when they publish that the car belongs to a tax-exempt category for reasons of the owner’s disability, they are publishing something about the registered keeper (or someone they care for, or regularly drive), and that is sensitive personal data.

What DVLA is doing is not publishing the car’s tax class – that remains the same whoever the owner is – they are publishing details of the exempt status of the individual who currently owns it. That is personal data about the individual, not data about the vehicle

As the Information Commissioner’s guidance (commended by Moses LJ in Edem) says

Is the data being processed, or could it easily be processed, to learn, record or decide something about an identifiable individual; or, as an incidental consequence of the processing, could you learn or record something about an identifiable individual, or could the processing have an impact on, or affect, an identifiable individual?

Ultimately benefitsandwork’s example (where someone was identified from this information) unavoidably shows that the information can be personal data: if someone can search the registration number of a neighbour’s car, and find out that the registered keeper is exempt from paying the road fund licence for reasons of disability, that information will be the neighbour’s personal data, and it will have been disclosed to them unfairly, and in breach of the DPA (because no condition for the disclosure in Schedule 3 exists).

I hope the DVLA will rethink.

 


Filed under Confidentiality, Data Protection, Directive 95/46/EC, disability, Information Commissioner, Privacy

We’re looking into it

The news is awash with reports that the UK Information Commissioner’s Office (ICO) is “opening an investigation” into Facebook’s rather creepy research experiment, in conjunction with US universities, in which it apparently altered the users’ news feeds to elicit either positive or negative emotional responses. Thus, the BBC says “Facebook faces UK probe over emotion study”, SC Magazine says “ICO probes Facebook data privacy” and the Financial Times says “UK data regulator probes Facebook over psychological experiment”.

As well as prompting one to question some journalists’ obsession with probes, this also leads one to look at the basis for these stories. It appears to lie in a quote from an ICO spokesman, given, I think, originally to the online IT news outlet The Register

The Register asked the office of the UK’s Information Commissioner if it planned to probe Facebook following widespread criticism of its motives.

“We’re aware of this issue, and will be speaking to Facebook, as well as liaising with the Irish data protection authority, to learn more about the circumstances,” a spokesman told us.
So, the ICO is aware of the issue and will be speaking to Facebook and to the Irish Data Protection Commissioner’s office. This doesn’t quite match up to the rather hyperbolic news headlines. And there’s a good reason for this – the ICO is highly unlikely to have any power to investigate, let alone take action. Facebook, along with many other tech/social media companies, has its non-US headquarters in Ireland. This is partly for taxation reasons and partly because of access to highly skilled, relatively low-cost labour. However, some companies – Facebook is one, LinkedIn another – have another reason, evidenced by the legal agreements that users enter into: because the agreement is with “Facebook Ireland”, Ireland is deemed to be the relevant jurisdiction for data protection purposes. And, fairly or not, the Irish data protection regime is generally perceived to be relatively “friendly” towards business.
 
These jurisdictional issues are by no means clear-cut – in 2013 a German data protection authority tried to exercise powers to stop Facebook imposing a “real name only” policy.
 
Furthermore, as the Court of Justice of the European Union recognised in the recent Google Spain case, the issue of territorial responsibilities and jurisdiction can be highly complex. The Court held there that, as Google had
 
[set] up in a Member State a branch or subsidiary which is intended to promote and sell advertising space offered by that engine and which orientates its activity towards the inhabitants of that Member State
 
it was processing personal data in that Member State (Spain). Facebook does have a large UK corporate office with some responsibility for sales. It is just possible that this could give the ICO, as domestic data protection authority, some power to investigate. And if or when the draft European General Data Protection Regulation gets passed, fundamental shifts could take place, extending even, under Article 3(2), to bringing data controllers outside the EU within jurisdiction where they are offering goods or services to (or monitoring) data subjects in the EU.
 
But the question here is really whether the ICO will assert any purported power to investigate, when the Irish DPC is much more clearly placed to do so (albeit with terribly limited resources). I think it’s highly unlikely, despite all the media reports. In fact, if the ICO does investigate, and it leads to any sort of enforcement action, I will eat my hat*.
 
*I reserve the right to specify what sort of hat


Filed under Data Protection, Directive 95/46/EC, enforcement, facebook, journalism, social media, Uncategorized

Google is not a library, Dr Cavoukian

The outgoing Ontario Information and Privacy Commissioner Ann Cavoukian, whose time in office has been hugely, and globally, influential (see in particular Privacy by Design) has co-written (with Christopher Wolf) an article strongly criticising the judgment of the Court of Justice of the European Union (CJEU) in the Google Spain case.

For anyone who has been in the wilderness for the last few weeks, in Google Spain the CJEU ruled that Google Spain, as a subsidiary of Google Inc. operating on Spanish territory, was covered by the obligations of the European Data Protection Directive 95/46/EC, that it was operating as an entity that processed personal data in the capacity of a data controller, and that it was accordingly required to consider applications from data subjects for removal of search returns. Thus, what is loosely called a “right to be forgotten” is seen already to exist in the current data protection regime.

Many have written on this landmark CJEU ruling (I commend in particular Dr David Erdos’s take, on the UK Constitutional Law Blog) and I am not here going to go into any great detail, but what I did take issue with in the Cavoukian and Wolf piece was the figurative comparison of Google with a public library:

A man walks into a library. He asks to see the librarian. He tells the librarian there is a book on the shelves of the library that contains truthful, historical information about his past conduct, but he says he is a changed man now and the book is no longer relevant. He insists that any reference in the library’s card catalog and electronic indexing system associating him with the book be removed, or he will go to the authorities…

…The government agent threatens to fine or jail the librarian if he does not comply with the man’s request to remove the reference to the unflattering book in the library’s indexing system.

Is this a scenario out of George Orwell’s Nineteen Eighty-Four? No, this is the logical extension of a recent ruling from Europe’s highest court

(I pause briefly to say that if I never see another reference to Orwell in the context of privacy debate I will die a happy man).

I’m fond of analogies but Cavoukian’s and Wolf’s one (or maybe it’s a metaphor?) is facile. I think it could more accurately say

A man walks into a library. He sees that, once again, the library has chosen, because of how it organises its profit-making activities, to give great prominence to a book which contains information about his past conduct, which is no longer relevant, and which it is unfair to highlight. He asks them to give less prominence to it.

Cavoukian and Wolf accept that there should be a right to remove “illegal defamatory” content if someone posts it online, but feel that the issue of links to “unflattering, but accurate” information should be explored using “other solutions”. (I pause again to note that “unflattering” is an odd and loaded word to use here: Mr Gonzalez, in the Google Spain case, was concerned about out-of-date information about bankruptcy, and other people who might want to exercise a right to removal of links might be concerned by much worse than “unflattering” information).

I don’t disagree that other solutions should be explored to the issue of the persistence or reemergence of old information which data subjects reasonably no longer wish to be known, but people are entitled to use the laws which exist to pursue their aims, and the application by the CJEU of data protection law to the issues pleaded was, to an extent, uncontroversial (is Google a data controller? if it is, what are its obligations to respect a request to desist from processing?)

Cavoukian and Wolf criticise the CJEU for failing to provide sufficient instruction on how “the right to be forgotten” should be applied, and for failing to consider whether “online actors other than search engines have a duty to ‘scrub’ the Internet of unflattering yet truthful facts”, but a court can only consider the issues pleaded before it, and these weren’t. Where I do agree with them is in their criticism of the apparent failure by the CJEU, when giving effect to the privacy rights in Article 8 of the European Convention on Human Rights, and Articles 7 and 8 of the Charter of Fundamental Rights of the European Union, to consider adequately, if at all, the countervailing rights to freedom of expression in Article 10 of the former and Article 11 of the latter. In this respect, the prior Opinion of the Advocate General was perhaps to be preferred.

The key word in my replacement library analogy above is “chosen”. Google is not a passive and inert indexing system. Rather, it is a dynamic and commercially driven system which uses complex algorithms to determine which results appear against which search terms. It already exercises editorial control over results, and will remove some which it is satisfied are clearly unlawful or which constitute civil wrongs such as breach of copyright. Is it so wrong that (provided it gives appropriate weight to the (sometimes) competing considerations of privacy and freedom of expression) it should be required to consider a request to remove unfair and outdated private information?

 

 


Filed under Data Protection, Directive 95/46/EC, Europe, human rights, Privacy