Category Archives: Let’s Blame Data Protection

Data protection nonsense on gov.uk

It feels like a while since I randomly picked on some wild online disinformation about data protection, but when you get an itch, you gotta scratch, and this page of government guidance for businesses – “Get your business ready to employ staff: step by step” – specifically on “Personal data an employer can keep about an employee” certainly got me itching. It starts off sensibly enough by saying that

Employers must keep their employees’ personal data safe, secure and up to date.

This is true (Article 5(1)(f) and part of 5(1)(c) UK GDPR). And the page goes on to list some information that can be “kept” (for which I charitably read “processed”) without employees’ permission, such as: name, address, date of birth, sex, education and qualifications, work experience, National Insurance number, tax code, emergency contact details, employment history with the organisation, employment terms and conditions, any accidents connected with work, any training taken, any disciplinary action. All pretty inoffensive, although I’m not sure what it’s trying to achieve. But then…oh my. Then, it says

Employers need their employees’ permission to keep certain types of ’sensitive’ data

We could stop there really, and snigger cruelly. Consent (aka “permission”) as a condition for processing personal data is complicated and, quite frankly, to be avoided if possible. It comes laden with quite strict requirements. The Information Commissioner puts it quite well

Consent is appropriate if you can offer people real choice and control over how you use their data, and want to build their trust and engagement. But if you cannot offer a genuine choice, consent is not appropriate. If you would still process the personal data without consent, asking for consent is misleading and inherently unfair…employers and other organisations in a position of power over individuals should avoid relying on consent unless they are confident they can demonstrate it is freely given

And let’s consider the categories of personal data the government page thinks employers should get “permission” to “keep”: race and ethnicity, religion, political membership or opinions, trade union membership, genetics [sic], biometrics, health and medical conditions, sexual history or orientation.

But how quickly would an employer’s wheels grind to a halt if it couldn’t process personal data on an employee’s health “without her permission”? It would be unable to refer her to occupational health if she didn’t “permit” it. It would be unable to keep a record of her sickness absence if she withdrew her consent (consent should be as easy to withdraw as it is to give: see Article 7(3)). During the COVID pandemic, it would have been unable to keep a record of whether she had tested positive or not, if she said she didn’t want a record kept.

It’s nonsense, of course. There’s a whole range of gateways, plus a whole Schedule of the Data Protection Act 2018, which provide conditions for processing special categories of data without having to get someone’s consent. They include pressing social imperatives, like compliance with public health law, promotion of equality of treatment, and safeguarding of children or other vulnerable people. The conditions don’t apply across the board, but the point is that employees’ permission – their consent – is rarely, if ever, required when there is another compelling reason for processing their data.

I don’t really understand what need, what gap, the government page is trying to fill, but the guidance is pretty calamitous. It is only likely to lead to confusion for business owners and employers, and runs the risk of pitting them against each other – with disputes arising – amidst the confusion.

BAH!

Now, that felt better. Like I say, sometimes it’s good to scratch that itch.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under consent, Data Protection, Data Protection Act 2018, Let's Blame Data Protection, UK GDPR

Search and (don’t) destroy

Martin Lewis’s Money Saving Expert (MSE) site reports that over £1m is apparently held by Highways England (HE) in respect of Dartford Crossing pre-paid online accounts (Freedom of Information requests were used to establish the amount). It is of course by no means uncommon for money to lie dormant in accounts – for instance, banks across the world hold fantastic sums which never get claimed. MSE itself suggests elsewhere that the total amount in the UK alone might be around £15bn – but what these FOI requests to HE also revealed is an approach to retention of personal data which may not comply with HE’s legal obligations.

People appear to have received penalty charges after assuming that their pre-paid accounts – in credit when they were last used – would still cover the crossing charge (even where the drivers had been informed that their accounts had been closed for lack of use). MSE reports the case of Richard Riley, who

had been notified by email that his account would be closed, but he’d wrongly assumed it would be reactivated when he next made the crossing (this is only the case if you cross again within 90 days of being notified). On looking into it further, Richard also realised he had £16 in his closed account

However, HE apparently explained to MSE that

…it’s unable to reopen automatically closed accounts or automatically refund account-holders because it has to delete personal data to comply with data protection rules.

This cannot be right. Firstly, as the MSE article goes on to explain, if someone suspects or discovers that they have credit in a closed Dartford Crossing account, they can telephone HE and “any money will be paid back to the debit or credit card which was linked to the account. If this isn’t possible, a refund will be issued by cheque.”

So HE must retain some personal data which enables them to confirm whose money it is that they hold. But if it is true that HE feels that data protection law requires them to delete personal data which would otherwise enable them to refund account-holders when accounts are closed, then I fear that they are misreading two of the key principles of that law.

Article 5(1)(e) of the UK GDPR (the “storage limitation principle”) requires that personal data be “kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed” (emphasis added), and Article 5(1)(c) (the “data minimisation principle”) requires that personal data be “limited to what is necessary in relation to the purposes for which they are processed” (emphasis added). Both of these make clear that where personal data is still needed for the purposes for which it is processed, it can (and should) be retained. And when one adds the point, under Article 5(1)(c), that personal data should also be “adequate” for the purposes for which it is processed, it becomes evident that unnecessary deletion of personal data which causes a detriment or damage to the data subject can in itself be an infringement.

This matter is, of course, on a much lower level of seriousness than, for instance, the unnecessary destruction of landing cards of members of the Windrush Generation, or of recordings of witnesses in the Irish Mother and Baby Homes inquiry, but it strikes me that it is – in general – a subject that is crying out for guidance (and where necessary enforcement) by the Information Commissioner. Too many people feel, it seems, that “data protection” means they have to delete, erase or destroy personal data.

Sometimes, that is the worst thing to do.

The views in this post (and indeed most posts on this blog) are my personal ones, and do not represent the views of any organisation I am involved with.

Filed under accuracy, adequacy, Data Protection, Information Commissioner, Let's Blame Data Protection, UK GDPR

Virgin on the ridiculous

UPDATE 15.12.14: I think the comments on this piece take it further, and I do accept (as I did at the time, in fact) that the “password” in question was not likely to relate to customers’ accounts.
END UPDATE.

I got into a rather odd exchange over the weekend with the people running the Virgin Media twitter account. It began when, as is my wont, I was searching for tweets about “data protection” and noticed an exchange in which someone had asked Virgin Media whether their sales people rang customers and asked them to give their passwords. Virgin Media kindly appeared to confirm they did, and that

it’s for security as we can’t make any changes without data protection being passed

I asked for clarification, and this exchange ensued

[ME] Is it true your sales people call customers and ask for their account passwords? If so, are these unsolicited calls?

[VM] Yes this is true, our sales team would call and before entering your account, would need you to pass account security. I understand for your own security purposes why you wouldn’t feel great doing this, i’d be the same. If you give us a call on 150/03454541111 we can get this cleared up. Let me know how you get on

[ME] Thanks. Not a customer. Just interested in what seems like questionable practice being defended under guise of data protection

[VM] We contact our customers if there upgrade is due, or for a heath check on accounts, and a few other instances, but I get where your coming from [sic]

There’s nothing unlawful about this practice, and I assume that the accounts in question are service and not financial ones, but it doesn’t accord with normal industry practice. Moreover, one is warned often enough about the risks of phishing calls asking for account passwords. If a legitimate company requires or encourages its sales staff to do this, it adds to a culture of unnecessary risk. There are better ways of verifying identity, as their social media person seems to accept, when they say “I understand for your own security purposes why you wouldn’t feel great doing this, i’d be the same”.

One thing I’m certain about, though, is that it isn’t any part of “passing data protection” (unless they mean bypassing) to make outbound calls and ask for customer passwords.

On a final note, and in admiration of bare-faced cheek, I highlight the end of my exchange with Virgin Media

If you want, as your not a customer, you can check out our brill offers here [removed] maybe we could save you a few pounds?

That’s an offer I most certainly can refuse.

(By the way, as it’s an official Virgin Media account, I’ve taken what I was told on Twitter at face value. If I have misunderstood any of their policies on this I’d be happy to correct).

UPDATE:

Virgin Media’s Twitter account appears to have confirmed to me a) that they do ask for customers’ passwords on outbound sales calls, and b) that they see nothing wrong with it. And rather hilariously, they say that “we can discuss further” if I will “pop a few details” on their web form for social media enquiries. No thanks.

12 Comments

Filed under Data Protection, Let's Blame Data Protection, marketing, nuisance calls, PECR, social media

The Moanliness of the Long-distance Runner

Another in the Let’s Blame Data Protection series, in which I waste a lot of energy on something not really worth the effort

The Bournemouth Daily Echo reports that

Hundreds of disgruntled runners who took part in the inaugural Bournemouth Marathon Festival have accused event organisers of withholding information by failing to provide full race results.

and, with rather dull predictability, there’s a familiar apparent culprit

GSi Events Ltd, the team behind the BMF, has published the top ten runners in the various age categories, but is refusing to publish all the results on the grounds of data protection.

But does data protection law really prevent publication of this sort of information? The answer, I think, is “no”, and the reason for this is tied to issues of fairness and consent.

The first data protection principle, in Schedule One of the Data Protection Act 1998 (DPA), says that personal data (broadly, information relating to an identifiable individual) must be “processed” (publication is one form of processing) fairly and lawfully.

The concept of fairness is not an easy one to grasp or define, but helpfully the DPA provides a gloss on it, which, to paraphrase, is that if people are properly informed about how their data is going to be processed (who is doing the processing, and for what purpose) then a key element of “fairness” is met. The Information Commissioner’s Privacy Notices Code of Practice explains

A privacy notice should be genuinely informative. Properly and thoughtfully drawn up, it can make your organisation more transparent and should reassure people that they can trust you with their personal information

The first data protection principle goes on to say that (in particular) personal data shall not be processed unless at least one of the conditions in Schedule 2 of the Act is met (and one in Schedule 3, in the case of higher-category sensitive personal data). One of those conditions is

The data subject has given his consent to the processing.

“Consent” is not defined in the DPA, but it is given a definition in the EC Data Protection Directive, to which the DPA gives domestic effect. The Directive says that consent

shall mean any freely given specific and informed indication of his wishes by which the data subject signifies his agreement to personal data relating to him being processed

“Specific” and “signifies” are generally taken to mean that implied consent is not valid in this context (although the practice of implying consent to processing is widespread). Nonetheless, it seems clear that, with a sensibly drafted privacy notice, the organisers of the Bournemouth Marathon could easily have said to those registering to race “your race result/time will be published, unless you object”. When one looks at the actual privacy notice, however, such a term is absent.

I suppose that means one could argue that, under the current privacy notice, publishing the race details would be in breach of the DPA. I suppose I could also construct a counter-argument to that to the effect that publication is necessary in pursuance of legitimate interests of the race organisers (for instance to show that it was a real flipping race) when balanced against the legitimate interests of the racers.

But ultimately, come on, it’s just silly to blame data protection: the vast, vast majority of people take part in a marathon knowing that it’s a public event, where they’ll gather plaudits or attract ridicule. Any expectation of privacy of race results is effectively non-existent.

Publish the damn race results, take the infinitesimal risk of someone complaining (a complaint which no one, i.e. the Information Commissioner and the courts, will take seriously or be able to offer a remedy to) and sort your privacy notice out for next year.

Filed under Data Protection, Let's Blame Data Protection

Let’s Blame Data Protection – the Gove files

Thanks to Tim Turner, for letting me blog about the FOI request he made which gives rise to this piece

On the 12th September the Education Secretary, Michael Gove, in an op-ed piece in the Telegraph, sub-headed “No longer will the quality, policies and location of care homes be kept a secret” said

A year ago, when the first shocking cases of sexual exploitation in Rochdale were prosecuted, we set up expert groups to help us understand what we might do better…Was cost a factor? Did we need to spend more? There was a lack of clarity about costs. And – most worrying of all – there was a lack of the most basic information about where these homes existed, who was responsible for them, and how good they were….To my astonishment, when I tried to find out more, I was met with a wall of silence

And he was in no doubt about where the blame lay (no guesses…)

The only responsible body with the information we needed was Ofsted, which registers children’s homes – yet Ofsted was prevented by “data protection” rules, “child protection” concerns and other bewildering regulations from sharing that data with us, or even with the police. Local authorities could only access information via a complex and time-consuming application process – and some simply did not bother…[so] we changed the absurd rules that prevented information being shared

This seemed a bit odd. Why on earth would “data protection” rules prevent disclosure of location, ownership and standards of children’s homes? I could understand that there were potentially child protection concerns in the too-broad sharing of information about locations (and I don’t find that “bewildering”) but data protection rules, as laid out in the Data Protection Act 1998 (DPA), only apply to information relating to identifiable individuals. Tim Turner took it upon himself to delve deeper, and made a freedom of information request to the Department for Education, asking

1) Which ‘absurd’ rules was Mr. Gove referring to in the first
statement?

2) What changes were made that Mr. Gove referred to in the second
statement?

3) Mr Gove referred to ‘Data Protection’ rules. As part of the
process that he is describing, has any problem been identified with
the Data Protection Act?

Fair play to the DfE – they responded within the statutory timescales, explaining

Regulation 7(5) of the Care Standards Act 2000 (Registration) (England) Regulations 2010 …prohibited Ofsted from disclosing parts of its register of children’s homes to any body other than to a local authority where a home is located. Whatever the original intention behind this limitation, it represented a barrier preventing Ofsted from providing information about homes’ locations to local police forces, which have explicit responsibilities for safeguarding all children in their area…we introduced an amendment to Regulation 7 with effect from April 2013

But their response also revealed what had been very obvious all along: this had nothing to do with data protection rules:

the reference to “data protection” rules in Mr Gove’s article involved the Regulations discussed above, made under section 36 of the Care Standards Act 2000. His comments were not intended as a reference to the Data Protection Act 1998

This is disingenuous: “data protection” has a very clear statutory context, and to extend it to mean “information sharing” more broadly is misleading and pointless. One could perhaps understand it if Gove had said this in an oral interview, but his piece will have been checked carefully before publication, and personally I am in no doubt that blaming data protection has a political dimension. The government is determined, for some right reasons and some wrong ones, to make the sharing of public sector data easier, and data protection does sometimes – and rightly – present an obstacle to this, when the data in question is personal data and the sharing is potentially unfair or unlawful. Anything which associates “data protection” with a risk to child safety serves to represent it as bureaucratic and dangerous, and serves the government agenda.

And the rather delicious irony of all this – as pointed out on twitter by Rich Greenhill – is that the “absurd rules” (the Care Standards Act 2000 (Registration) (England) Regulations 2010) criticised by Gove were made on 24 August 2010. And the Secretary of State who made these absurd rules was, of course, the Right Honourable Michael Gove MP.

How absurd.

Filed under Data Protection, data sharing, Freedom of Information, Let's Blame Data Protection, transparency

Let’s blame Data Protection: Part Two

“The leader of the council wishes to make the names of the debtors public, but the Data Protection Act of 1998 prohibits their publication.”

So says an article from the Blackpool Gazette, when quoting a council report (which I haven’t yet been able to find) which appears to have indicated that

The council has been forced to write off £1.68m in owed business rates going back around the last six years

The council leader is reported to have said

Several names appear more than once, owing vast sums of money to the council…Several high-profile business owners, who always seemed to have a lot to say about how the town is run, seem to have no qualms about disappearing owing us tens of thousands of pounds…We are very dogged and tenacious when it comes to pursuing debtors, and clearly need to continue to be.

but

What I do find very frustrating is that I am not able to publish the names of these people

This puzzles me: names of businesses will not, as a general rule, constitute personal data under section 1(1) of the Data Protection Act 1998 (DPA). The definition of personal data is

data which relate to a living individual who can be identified—
(a) from those data, or
(b) from those data and other information which is in the possession of, or is likely to come into the possession of, the data controller

Even if individuals can be identified from disclosure of the names of defaulting businesses, it is perhaps the case that the information will not be considered to be personal data, especially following the precedent of the Court of Appeal in Durant, where it was held that, for information to be personal data, it

should have the putative data subject as its focus rather than…some transaction or event in which he may have figured or have had an interest

It is interesting to note that the Information Commissioner’s Office (ICO), in guidance which appears to have been withdrawn, said

Information about people who run businesses, and the businesses they run, will often be covered by the Act. This is because information about a person’s business, activities, possessions, and so on is generally personal information about that person

although, in a rather circular argument

Business information that does not identify individuals is not covered by the Act

What I think is being got at is that, for example, information consisting of “Richard Hannay is a fifty-year-old black man who runs Imaging Solutions Ltd, which made a £1.2m profit last year” is potentially Richard Hannay’s personal data throughout, whereas “Imaging Solutions Ltd made a £1.2m profit” is unlikely to be Hannay’s personal data when considered in isolation, even though one can easily find out that he is the sole director.

In another, more specific scenario, it might be more easily argued that the names of businesses are personal data. This is where someone is conducting business as a sole trader. The ICO’s withdrawn guidance said

Information about a sole trader’s business will be personal information about him

I’m not sure I would be so unequivocal, but as a general proposition it’s not objectionable.

However, even if business information is personal data, the DPA does not necessarily prevent disclosure of it. In fact, the DPA permits disclosure of any and all types of personal data, as long as it is in compliance with the Act. In short, if disclosure is fair and lawful and relevant provisions permit it, then it will be in compliance with the Act. And, helpfully for the council, there is a specific provision relating to personal data “processed for…the assessment or collection of any tax or duty”. This exemption permits disclosure where not disclosing would be likely to prejudice the collection of the tax in question. Additionally, the sixth condition of Schedule 2 of the DPA provides that, if it is “necessary for the purposes of legitimate interests pursued by the data controller”, personal data may be processed, provided it is not “unwarranted…by reason of prejudice to the rights and freedoms or legitimate interests of the data subject”.

This will not give carte blanche to disclosure of personal data (if personal data it is) of owners of defaulting businesses, but it is certainly arguable in this instance that disclosure would assist the collection of the tax (and, therefore, non-disclosure could prejudice it), and that the balancing exercise required by the sixth Schedule 2 condition would fall in favour of disclosure.

So, a) I doubt that the withheld information is personal data, and, even if it is, b) disclosure would be in compliance with the DPA.

One thing is certain: the DPA does not prohibit publication of this information, and, to the extent that it might be engaged, I would not see it as a barrier to disclosure. It might even help the council in its aim to be “dogged and tenacious when it comes to pursuing debtors”.

But it’s so much easier to blame Data Protection.

Filed under Data Protection, Let's Blame Data Protection

An unshared perspective

Paul Gibbons, FOI Man, has blogged about data-sharing, questioning whether an over-cautious approach to sharing of health data is damaging. Paul says

What I’m increasingly worried about is what appears to be a widely held and instinctive view that any sharing of personal data – and even data that has been anonymised – is necessarily a “bad thing”.

I’ve got to say, in all the time I’ve worked in the field of information rights I’ve never come across anyone who actually thinks that, let alone articulates it (in my experience the only people who say it are those who seek to misrepresent it). The Data Protection Act 1998 (DPA) and EC Directive 95/46/EC to which it gives effect do not act as a default bar to sharing of data. There may be circumstances under which compliance with the law means that sharing of personal data cannot happen, but the converse is true – there will be times when sharing is lawful, necessary and proportionate.

Paul’s prime example of what he sees as (to adopt the title of his piece) “a disproportionate fear of ‘Big Brother’” preventing us from seeing the big picture” is the “predictable outcry” about the care:data programme, whereby the Health and Social Care Information Centre will, through the exercise of certain provisions in the Health and Social Care Act 2012, extract enormous amounts of health and social care information from local systems to centralised ones. The first step in this is the GP Extraction Service (GPES) whereby information relating to medical conditions, treatments and diagnosis, with each patient’s NHS number, date of birth, postcode, gender, ethnicity and other information will be uploaded routinely. The information will then be made available to a range of organisations, sometimes including private companies, sometimes in ostensibly anonymised, sometimes in identifiable, form, for a variety of purposes. This will happen to your medical records unless you opt-out (and if you think you’ve already done so, you probably haven’t – those who objected to the creation of a summary care record will have to go through another opt-out process). And this week we were informed that there will be no national campaign to alert patients to the GPES – the responsibility (and liability) will lie with GP practices themselves. (Anyone wanting to understand this complex and less-than-transparent process must read and follow the superb MedConfidential).

I accept that, on one view, this amassing of health and social care data could be seen as a good thing: as Paul suggests, medical research, for instance is a hugely important area. And the NHS Commissioning Board identifies the following desired outcomes from care:data

– support patients’ choice of service provider and treatment by making comparative data publicly available
– advance customer services, with confidence that services are planned around the patient
– promote greater transparency, for instance in support of local service planning
– improve outcomes, by monitoring against the Outcomes Frameworks
– increase accountability in the health service by making data more widely available
– drive economic growth through the effective use of linked data

But how realistic are these? And what are the attendant risks or detriments? Paul says

central medical records for all NHS patients…would mean that when you turned up at a hospital far from home, as I have done myself, doctors would have access to your medical records and history. Believe me, when you are in pain and desperate to be treated, the last thing that you want to do is to answer questions about your medical history

With great respect, the ideal of a centralised system whereby medics can provide emergency treatment to patients by accessing electronic records is never going to be more than a myth. Put another way – would Paul be happy trusting his life to the accuracy of an electronic record that might or might not say, for instance, whether he is allergic to aspirin? Treatment of patients is a matter of diagnosis, and emergency diagnoses will never be made solely, if at all, on the basis of records.

Security of information, and the risk of identification of individuals, are other key concerns. Paul says Daniel Barth-Jones identifies “deficiencies in [reidentification] studies”, but I think what Barth-Jones is actually arguing is that the risks of reidentification are real but must be accurately reported and balanced against the likelihood of their happening.

But ultimately I have two major conceptual concerns about care:data and what it implies. The first is that, yes, I am instinctively distrusting of the agglomeration of sensitive personal data in identifiable form in mass processing systems: history has taught us to be this way, so I don’t see this, as Paul appears to, as a “fashionable” mistrust (and, for instance, the Joseph Rowntree Foundation’s exemplary Database State report is now over six years old). The second is that patient-medic confidentiality exists, and has existed for a very long time, for a reason: if patients are not certain that their intimate medical details are confidential, they might be reluctant to speak candidly to their doctor. In fact, they might not even visit their doctor at all.

3 Comments

Filed under Confidentiality, Data Protection, data sharing, human rights, Let's Blame Data Protection

Let’s blame Data Protection (a new series): Part One

Data Protection is to blame for many things (sleepless nights for Data Protection officers, hits to the public purse, a proportionate measure of respect and security for people’s sensitive private information, bulging wallets for lawyers) and many people like to criticise it. In this occasional series I want to come to its defence, by pointing out examples where data protection has been wrongly blamed for a failure elsewhere. The Information Commissioner used to do something similar but seems to have given up on that (and, after all, “data protection duck out” is a cringemaking phrase).

So here’s my first example: “Vague” Data Protection Act blights fraud detection, say insurers

The facts of the article itself are fine, as one would expect if the author is Pete Swabey, but it’s the message that grates. According to the Chartered Insurance Institute (CII), there is a problem with section 29 of the Data Protection Act 1998 (DPA), which permits the disclosure of personal data by a data controller, whereby the general presumption against non-disclosure is disapplied if applying it would be likely to prejudice any of the following purposes: the prevention or detection of crime, the apprehension or prosecution of offenders, or the assessment or collection of any tax or duty or of any imposition of a similar nature. Normally the question of whether to disclose will arise in response to a specific request from another person or body (usually one with crime detection or prosecution powers, or tax collection powers). This comes down to a matter of applying a balancing test to specific facts: if I don’t disclose this information, would it be likely to cause prejudice to those purposes?

This is often a difficult decision for a data controller (it’s about serious matters – why should it always be easy?). But the CII complain that

the vagueness of Section 29…has led to an extremely high volume of information requests, with little consistency or clarity. This, it says, is hindering investigations. 

“Certain companies, particularly the lawyers, are sending requests out without thinking about them,” [says] David Clements, motor investigations manager at Zurich

Bad Data Protection Act! Making people ask for disclosure of personal data without giving it much thought!

Also, the fact that requests and responses are made in a haphazard, non-standard fashion creates unnecessary work for fraud investigators.

Silly Data Protection Act! Making an industry incapable of standardizing procedures!

And, indeed, the article says that the industry is trying to sort itself out

The New Generation Claims Board is working on a voluntary code of best practice to help insurance providers both improve the efficacy of their fraud investigations and reduce their risk of non-compliance. 

“We’re going to provide the industry with a best practice protocol plus a template for sending and receiving requests,” Clements explains.

But the evil Data Protection Act is still lurking about causing trouble, because this is only a voluntary scheme

as insurance companies are not even obliged to respond to Section 29(3) requests

Come on Data Protection Act, sort yourself out!

Filed under Data Protection, Information Commissioner, Let's Blame Data Protection, Uncategorized