Tag Archives: data protection

Covert recordings in family law proceedings – some slightly flawed guidance

The issue of the legality of the making of, and subsequent use of, covert audio and/or visual recordings of individuals is a complex one – even more so when it comes to whether such recordings can be adduced as evidence in court proceedings.

I’m not going to try to give an answer here, but what I will do is note that the Family Justice Council has recently produced guidance on covert recordings in family law proceedings concerning children, and it contains some rather surprising sections dealing with data protection law.

Firstly, I should say what it gets right: I think it is correct when it indicates that processing consisting of the taking of and use of covert recordings for the purpose of proceedings will not normally be able to avail itself of the carve-out from the statutory scheme under Article 2(2)(a) UK GDPR (for purely personal or household purposes).

However, throughout, when addressing the issue of the processing of children’s data, it refers to the Information Commissioner’s Office’s Children’s Code, but doesn’t note (or notice?) that that Code is drafted specifically to guide online services on the subject of age appropriate design of such services. Although some of its general comments about children’s data protection rights will carry over to other circumstances, the Children’s Code is not directly relevant to the FJC’s topic.

It also goes into some detail about the need for an Article 6(1) UK GDPR lawful basis if footage is shared with another person. Although strictly true, this is hardly the most pressing point (there are a few potential bases available, or exemptions to the need to identify one). But it also goes on to say that a failure to identify a lawful basis will be a “breach of the DPA 2018” (as well as the UK GDPR): I would like its authors to say what specific provisions of the DPA it would breach (hint: none).

It further, and incorrectly, suggests that a person making a covert recording might commit the offence of unlawfully obtaining personal data at section 170 DPA 2018. However, it fails to recognise that the offence only occurs where the obtaining is done without the consent of the controller, and, here, the person making and using the recording will be the controller (as the “lawful basis” stuff above indicates).

Finally, when it deals with developing policies for overt recording, it suggests that consent of all the parties would be the appropriate basis, but gives no analysis of how that might be problematic in the context of contentious and fraught family law proceedings.

The data protection aspects of the guidance are only one small part of it, and it may be that it is otherwise sound and helpful. However, it says that the ICO were consulted during its drafting, and gave “helpful advice”. Did the ICO see the final version?

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Covert recording, Data Protection, Data Protection Act 2018, Family law, Information Commissioner, UK GDPR

Could the Data (Use and Access) Bill fall?

[EDIT: in this post I originally said I understood that the current parliamentary session would end when Parliament rises for summer recess. Prompted by Andrew Harvey, on the Jiscmail Data Protection list, I checked this point, and I was wrong: my MP (who, on the two occasions I’ve emailed him, has been impressively responsive) says “With the legislative programme from the King’s Speech barely a quarter of the way through, I would guess this will be at least an 18 month session”. So one of the pressing issues in the post is less pressing, but that still doesn’t get round the issue of the impasse.]

Westminster is at an impasse over the Data (Use and Access) Bill. The Lords have repeatedly introduced amendments, in the form of totally new clauses on AI and copyright which were never intended to be part of the Bill, and the Commons have repeatedly removed them. Yesterday’s reprise of the exercise suggests that ping pong is not stopping any time soon.

This must be of tremendous frustration to the government. In particular, it will be of significant concern to the ministers and civil servants who will be negotiating with the European Commission over the reciprocal data adequacy arrangements which allow free transfer of personal data between the EU and the UK. The Commission had introduced a sunset clause to the original agreement, which was due to expire this month, but this has been extended for a further six months, specifically to allow for the passage and enactment of the DUAB (the Commission wants to see what the revised UK data protection scheme will look like).

So what happens now? As the Bill was introduced in the Lords, the Commons cannot invoke its powers to force the Bill through to Royal Assent, under section 2 of the Parliament Act 1911.

The current parliamentary session may well run on for some time yet. Traditionally, all parliamentary business would cease at prorogation, so if a Bill hadn’t passed, it fell. In recent years, however, procedures in both Houses have been developed, whereby, by agreement, a Bill can “carry over” to the next session. This is very unusual, though, with a Bill introduced in the Lords. It is also difficult to see how, or why, there would be agreement to carry over a Bill like the DUAB, over which the two Houses are in actual disagreement.

Maybe the alternative would be to allow the Bill to fall (or withdraw it), and reintroduce it in the Commons, in the next session.

But there would be no winners in such a scenario. The government (and Parliament) would have to go to significant time and cost, and the opponents in the Lords, serried behind Baroness Kidron, would be no closer to getting the artists’ protections from AI models that they seek.

And in the meantime, the extended sunset clause for UK adequacy would be dropping below the horizon.

Is there still time for compromise? The simple answer is yes, but there have been few signs of much movement from either side.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under adequacy, Data (Use and Access) Bill, parliament

The Emperor has no clothes!

[reposted from my LinkedIn account]

When a public authority receives a Freedom of Information Act request and the requested information contains personal data (of someone other than the requester) it must first consider whether it can even confirm or deny that the information is held. For instance “Dear NHS Hospital Trust – please say whether you hold a list of embarrassing ailments suffered by Jon Baines, and if you do, disclose the list to me”. To confirm (or deny) even holding the information would tell the requester something private about me, and would contravene the data protection principles at Article 5(1) of the UK GDPR. Therefore, the exemption at s40 of FOIA kicks in – specifically, the exemption at s40(5B): the hospital can refuse to confirm or deny whether the information is held.

But suppose that the hospital had mistakenly confirmed it held the information, but refused to disclose it? The cork, surely, is for ever out of the bottle.

Upon appeal by the requester (this requester really has it in for me) to the ICO, I could understand the latter saying that the hospital should have applied s40(5B) and that the failure to do so was a failure to comply with FOIA. However, certainly of late, the ICO has engaged in what to me is a strange fiction: it says in these circumstances that it will “retrospectively apply s40(5B)” itself. It will pretend to put the cork back in the bottle, after the wine has been consumed.

And now, the Information Tribunal has upheld an ICO decision to do so, albeit with no argument or analysis as to whether it’s the correct approach. But even more bizarrely, it says

We are satisfied that the Commissioner was correct to apply section 40(5B) FOIA proactively, notwithstanding the information that has previously been provided by the Trust, to prevent the Trust from providing confirmation or denial that the information is held.

But the Trust had already done so! It can’t retrospectively be prevented from doing something it has already done. The cork is out, the wine all gone.

Am I missing something? Please excuse the sudden mix of metaphor, but can no one else see that the Emperor has no clothes?

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

13 Comments

Filed under Data Protection, FOIA, Freedom of Information, Information Commissioner, UK GDPR

Personal use of work devices – an Irish judgment

A frequent headache for data protection practitioners and lawyers is how to separate (conceptually and actually) professional and personal information on work devices and accounts. It is a rare employer (and an even rarer employee) who doesn’t encounter a mix of the two categories.

But, if I use, say, my work phone to send a couple of text messages (as I did on Saturday after the stupid SIM in my personal phone decided to stop working), who is the controller of the personal data involved in that activity? I’d be minded to say that I am (and that my employer becomes, at most, a processor).

That is also the view taken by the High Court in Ireland, in an interesting recent judgment.

The applicant was an employee of the Health Service Executive (HSE), and did not, in this case, have authority or permission to use his work phone for personal use. He nonetheless did so, and then claimed that a major data breach in 2021 at the HSE led to his personal email account and a cryptocurrency account being hacked, with a resultant loss of €1400. He complained to the Irish Data Protection Commissioner, who said that as his personal use was not authorised, the HSE was not the controller in respect of the personal data at issue.

The applicant sought judicial review of the DPC decision. This of course meant the application would only succeed if it met the high bar of showing that the DPC had acted unlawfully or irrationally. That bar was not met, with the judge holding that:

The DPC did not purport to adopt an unorthodox interpretation of the definition of data controller. Instead, against the backdrop of the factual matrix before it, it found that the HSE had not “determined the purposes and means of the processing” of the data relating to the Gmail, Yahoo, Fitbit and Binance accounts accessed by the applicant on his work phone. That finding appears to me to be self-evident, where that use of the phone clearly was not authorised by the HSE.

I think that has to be correct. But I’m not sure I quite accept the full premise, because I think that even if the HSE had authorised personal use, the legal position would be the same (although possibly not quite as unequivocally so).

I’m genuinely interested in others’ thoughts though.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under controller, Data Protection, employment, GDPR, Ireland, judgments, Uncategorized

Subject access, Leeds United, and ****

[reposted from my LinkedIn account]

You’d have thought most football fans would be keen to prove they’d not attended a Leeds United match [#bantz], but when Melvyn Flower was told by the club he couldn’t renew his season ticket for next season, because he’d not used his current one often enough, he resorted to data protection law to vindicate his support for the club.

The information disclosed to him showed that he attended matches on all the occasions the club had said he hadn’t.

I don’t quite understand how the club searched for and disclosed his personal data, without (when doing so) realising its mistake (maybe he asked for footage from a specific camera near his reserved seat). But in any case, it’s a nice little story, and topped off with an excellent point from Mr Flower:

Why would I buy a season ticket and not go this season, of all seasons, given the **** I’ve sat through since 1978?

1 Comment

Filed under Data Protection, not-entirely-serious, Sport, subject access

Retaining data for journalistic purposes?

This is a quite extraordinary data protection story, by Jamie Roberton and Amelia Jenne of Channel 4 News, involving the mother of a woman who died in suspicious circumstances.

It appears that a “Victims’ Right to Review” exercise was undertaken by Gloucestershire Police, at the request of the family of Danielle Charters-Christie, who was found dead inside the caravan that she shared with her partner – who had been accused of domestic abuse – in Gloucestershire on 26 February 2021.

Officers then physically handed a 74-page document to Danielle’s mother, and the contents of it were subsequently reported by Channel 4 News. But, now, the police say that the Review report was “inadvertently released”, are demanding that Danielle’s mother destroy it, and have referred her apparent refusal to do so to the Information Commissioner’s Office as a potential offence under s170(1)(c) of the Data Protection Act 2018.

That provision creates an offence of “knowingly … after obtaining personal data, [retaining] it without the consent of the person who was the controller in relation to the personal data when it was obtained”.

But here’s a thing: it is a defence, under s170(3), for a person charged with the offence to show that they acted (and here, the retention of the data would be the “action”) for the purposes of journalism, with a view to the publication by a person of any journalistic material, and in the reasonable belief that in the particular circumstances the retaining was justified as being in the public interest.

The ICO is tasked as a prosecutor for various data protection offences, including the one at s170 DPA. No doubt whoever at the ICO is handed this file will be having close regard to whether this statutory defence would apply, and will also, in line with the ICO’s duty as a prosecutor, consider both evidential factors and whether a prosecution would be in the public interest.

At the same time, of course, the ICO has civil enforcement powers, and might well be considering the circumstances in which the police, as a controller, wrongly disclosed personal data in such an apparently serious case.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection Act 2018, Information Commissioner, law enforcement, offences, police

Machine learning lawful basis on a case-by-case approach – really?

The Information Commissioner’s Office has published its response to the government’s consultation on Copyright and AI. It contains an interesting example of an “oh really?!” statement.

The government proposes that, when it comes to text and data mining (TDM) of datasets that contain copyright works, a broad exception to copyright protection should apply, under which “AI developers would be able to train on material to which they have lawful access, but only to the extent that right holders had not expressly reserved their rights”. Effectively, rights holders would have to opt out of “allowing” their works to be mined.

This is highly controversial, and may be the reason that the Data (Use and Access) Bill has stalled slightly in its passage through Parliament. When the Bill was in the Lords, Baroness Kidron successfully introduced a number of amendments in relation to the use of copyright material for training AI models, saying that she feared that the government’s proposals in its consultation “would transfer [rights holders’] hard-earned property from them to another sector without compensation, and with it their possibility of a creative life, or a creative life for the next generation”. Although the government managed to get the Baroness’s amendments removed at Commons committee stage, the debate rumbles on.

The ICO’s response to the consultation notes the government’s preferred option of a broad TDM exception, with opt-out, but says that, where personal data is contained in the training data, such an exception would not “in and of itself constitute a determination of the lawful basis for any personal data processing that may be involved under data protection law”. This must be correct: an Article 6(1) UK GDPR lawful basis will still be required. But it goes on to say “the lawfulness of processing would need to be evaluated on a case-by-case basis”. A straightforward reading of this is that for each instance of personal data processing when training a model on a dataset, a developer would have to identify a lawful basis. But this, inevitably, would negate the whole purpose of using machine learning on the data. What I imagine the ICO intended to mean was that a developer should identify a broad, general lawful basis for each dataset. But a) I don’t think that’s what the words used mean, and b) I struggle to reconcile that approach with the fact that a developer is very unlikely to know exactly what personal data is in a training dataset, before undertaking TDM – so how can they properly identify a lawful basis?

I should stress that these are complex and pressing issues. I don’t have answers. But opponents of the consultation will be likely to jump on anything they can.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

1 Comment

Filed under AI, Data Protection, datasets, DUAB, Information Commissioner, Lawful basis, parliament, Uncategorized

The legality of data processing in the course of litigation

There is very convoluted litigation taking place which has as its focus a witness statement, prepared by a solicitor acting for a number of insurance companies who are defending personal injury claims arising from road traffic accidents (RTAs). And part of the argument (and a satellite claim) has now become about compliance with data protection law.

Five original claims were made for damages arising from RTAs. The defendant insurance companies were represented by law firm DWF, and one of DWF’s solicitors prepared a witness statement which contained an analysis of claims data collected by DWF in relation to a number of claims submitted by claimants represented by the solicitors who acted on behalf of the five claimants. The statement sought to show that, in an unusually high number of the claims, claimants had been referred for further psychological assessment, by a doctor who in 100% of those cases diagnosed a psychiatric condition and in two thirds of those cases said that the recovery period would be over two years. In short, a large number of claimants in the relevant RTAs appeared to develop long-term psychiatric conditions.

The claimant sought unsuccessfully to debar the witness statement, although the judge (on appeal) noted that it would be “for the Judge at trial to make of this evidence what they will [although] there are questions as to the extent to which this evidence assists without more in proving fundamental dishonesty”.

Notwithstanding this, an initial 317 (now reduced to three) claims were then made by people whose personal data was accepted to have been processed by DWF for the purposes of preparing the witness statement above. The claims here are for various breaches of the UK GDPR (such as excessive processing, and lack of fairness, lawful basis and transparency).

In a judgment handed down on 1 April, on an application by the claimants for specific disclosure in the UK GDPR claim (and an application by the defendant to amend its defence and strike out a witness statement of the claimants’ solicitor) Mrs Justice Eady DBE dismissed the disclosure applications (made under various headings), on the basis that much of the information would clearly be privileged material, or not relevant, or that the application was a fishing expedition.

If this gets to trial it will be interesting though. This sort of processing of personal data takes place in the course of (non-data-protection) private litigation routinely. It is generally not assumed that any issues of illegality arise. Any ultimate findings would be notable for litigators, and those who need to advise them on data protection compliance.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, judgments, litigation, UK GDPR

A new data protection duty?

I’ve been looking in more detail at the recent subject access judgment in Ashley v HMRC. One key point of general application stands out for me, and that is that it states that in some cases (i.e. where it is necessary for intelligibility purposes) a controller has a duty to provide contextual information in addition to copies of personal data.

As the judge put it

Article 15(1) and 15(3), read with Article 12(1) and (2) of the UK GDPR, did require the Defendant to go beyond providing a copy of the Claimant’s personal data where contextual information was necessary for that personal data to be intelligible in the sense of enabling the data subject to exercise their rights conferred by the UK GDPR effectively. It follows that insofar as the Defendant did not adopt this approach, it was in breach of this duty.

And although she couched the following as “guidance” for HMRC when reconsidering the request, I feel it has general application:

…it is unlikely that providing an extract that simply comprises the Claimant’s name or his initials or other entirely decontextualised personal data of that sort, will amount to compliance with this obligation.

In arriving at this conclusion the judge drew in part on both pre- and post-Brexit case law of the Court of Justice of the European Union. Most notably she decided to have regard to case C-487/21. Even though this does not bind the domestic courts, the effect of section 6(2) of the European Union (Withdrawal) Act 2018 is that courts may have regard to EU case law where it is relevant to the matter before them.

Of course, there are also times when merely providing a snippet in the form of a name constitutes a failure to provide all of the personal data in scope (omitting the final five words of “Jon Baines works at Mishcon de Reya” would be to omit some of my personal data). But the “context duty” seems to me to go further, and creates, where it is necessary, an obligation to provide information beyond what is in the source documents.

Most of the other points in the judgment, as important as they were to the facts, and as interesting as they are, particularly on the concept of “relating to” in the definition of “personal data”, will not necessarily change things for most data subjects and controllers.

But this “context duty” feels to me to be an advancement of the law. And I suspect controllers can now expect to see data subjects and their lawyers, when making subject access requests (or when challenging responses), begin to argue that the “context duty” applies.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, GDPR, judgments, subject access, UK GDPR

O’Carroll v Meta – what now for targeted adverts on Facebook?

Following the news that claimant Tanya O’Carroll and defendant Meta have settled ahead of what was likely to be a landmark data protection case, what are the implications?

Ms O’Carroll argued that advertising served to her on Facebook, because it was targeted at her, met the definition of “direct marketing” under section 122(5) of the Data Protection Act 2018 (“the communication (by whatever means) of advertising or marketing material which is directed to particular individuals”) and thus the processing of her personal data for the purposes of serving that direct marketing was subject to the absolute right to object under Article 21(2) and (3) UK GDPR.

Meta had disputed that the advertising was direct marketing.

The “mutually agreed statement” from Ms O’Carroll says “In agreeing to conclude the case, Meta Platforms, Inc. has agreed that it will not display any direct marketing ads to me on Facebook, will not process my data for direct marketing purposes and will not undertake such processing (including any profiling) to the extent it is related to such direct marketing”.

One concludes from this that Meta will, at least insofar as the UK GDPR applies to its processing, now comply with any Article 21(2) objection, and, indeed, that is how it is being reported.

But will the upshot of this be that Meta will introduce ad-free services in the UK, but for a charge (because its advertising revenues will be likely to drop if people object to targeted ads)? It is indicating so, with a statement saying “Facebook and Instagram cost a significant amount of money to build and maintain, and these services are free for British consumers because of personalised advertising. Like many internet services, we are exploring the option of offering people based in the UK a subscription and will share further information in due course”.

The ICO intervened in the case, and has uploaded a summary of its arguments, which were supportive of Ms O’Carroll’s case; her lawyers, AWO Agency, have also posted an article on the news.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.


Filed under Data Protection, Data Protection Act 2018, facebook, Information Commissioner, marketing, Meta, Right to object, UK GDPR