Court Slams Data Protection Commission for Siding Against Worker Unlawfully Spied Upon by CCTV

Ireland’s Data Protection Commission (DPC) probably has the worst reputation of all the data protection authorities in the world. They have tried to force complainants to sign non-disclosure agreements because the DPC does not want the public learning about investigations into abuses of personal data. A 2021 study by the Irish Council for Civil Liberties found the DPC had resolved only 2 percent of major GDPR cases. And even when the DPC issues a fine for a serious data protection violation, the fine may be revised upwards because other EU regulators agree to overrule the DPC. So it comes as little surprise that earlier this week the Irish Court of Appeal upheld a decision that effectively said the DPC did not apply the law correctly when they refused to assist an employee who was spied upon by CCTV cameras at work. Instead of simply acknowledging their original mistake, the DPC appealed a High Court decision in a vain attempt to spare their blushes, leading the Court of Appeal judge to comment:

…the costs involved in all these appeals are very substantial and entirely disproportionate to the issue concerned…

The case revolved around the use of CCTV to monitor the movements of Cormac Doolin whilst he was employed by Our Lady’s Hospice and Care Service (OLHCS) in Dublin. The hospice’s policy on CCTV stated:

The purpose of the system is to prevent crime and promote staff security and public safety.

A sign was placed beside each camera, stating:

Images are recorded for the purposes of health and safety and crime prevention.

In 2015, not long after the Bataclan terrorist attack in Paris, graffiti was carved into a table in a staff tea room which read: “Kill all whites, ISIS is my life”. There is no suggestion that Doolin was responsible for this graffiti, but an investigation that sought to determine who was responsible led OLHCS managers to separately conclude that Doolin had entered the tea room for unauthorized breaks. Doolin was disciplined, so he complained to the DPC that data obtained from the hospice’s CCTV cameras had been used unlawfully. Doolin appealed after the DPC rejected his complaint. Both the High Court and Court of Appeal agreed with Doolin on the essentials of his argument. Whilst it was a coincidence that OLHCS had spotted Doolin’s unauthorized breaks during a legitimate investigation of a potentially serious security risk, that did not give them the right to use that data as they did. OLHCS only obtained legal consent to use the data for the reasons which had been publicly stated, and could not also use it to monitor the movements of staff for reasons unrelated to security.

This analysis of the law should have been straightforward. However, the DPC somehow found ways to torture their interpretation of the law and the facts of this case so they could repeatedly side with the hospice and against Doolin. The DPC claimed that there was only one instance of processing the data, that the instance was legal because it related to a security investigation, and that OLHCS was hence entitled to use the data as they did. During later legal proceedings the DPC effectively changed their argument by implausibly claiming that unauthorized staff breaks also represented a security risk. Arguments made by John O’Dwyer, Deputy Commissioner in the Office of the DPC, were singled out for criticism by the Court of Appeal:

…while the graffiti incident was mentioned, the primary focus of the [disciplinary] meeting appears to have revolved around the taking of unauthorised breaks by Mr. Doolin. How this is said to be a security issue or related to a security issue is not explained by Mr. O’Dwyer, nor has it ever been explained by OLHCS.

The court’s decision went on to observe how O’Dwyer had tried to justify the faulty decision-making of the DPC by bolstering his security-oriented arguments at a later stage in the proceedings.

Mr. O’Dwyer swore a third affidavit on the 21st March, 2019. In paragraph 5 of this affidavit, Mr. O’Dwyer avers as follows:

“On the basis of the evidence before the court, I say that it is clear beyond doubt that the processing of the CCTV footage by OLHCS was for security purposes, arising directly from and relating to the investigation of the graffiti incident. It is clear that, in the particular circumstances of this case, the taking of unauthorised breaks at an unauthorised location, the site of the graffiti incident, was a serious and bona fide security issue and that the investigation by OLHCS, and the disciplinary action which resulted therefrom, arose directly out of and was directly connected to this security issue, albeit that the sanction applied in the context of the disciplinary action relied on admissions made by the applicant himself.”

As is subsequently pointed out in the judgment of the High Court, this averment by Mr. O’Dwyer is, to say the least, somewhat surprising. He appears to go considerably further than in his previous affidavits, and indeed than Ms. Pierce, the notice party’s own Data Protection Officer, in suggesting that the taking of unauthorised breaks at an unauthorised location was a serious and bona fide security issue. What is more, Mr. O’Dwyer feels able to express this conclusion “on the basis of the evidence before the Court”. However, as in his previous affidavits, I cannot see any justification for this statement that is to be found in the evidence before the Circuit Court. A similar conclusion was reached by the High Court as will become apparent.

The Court of Appeal decision reiterated the reasons why the High Court rejected the DPC’s argument.

The Court said (at para. 42):

“The DPC has gone from finding no breach because there was no further processing of the CCTV footage to asserting in these proceedings no breach because any further processing was done for the purpose for which the material was collected i.e. security.”

She described this new argument as “remarkable” because there was no evidence at all to support that argument…

Instead of just upholding the privacy rights of a hospice employee, the DPC entangled itself in convoluted thinking about whether Doolin’s personal data had been processed one or more times. When they said it was only processed once, they argued the purpose was security, with the implication that it did not matter if the outcome for Doolin had nothing to do with improving security within the hospice. When the DPC recognized the courts would conclude the data had been processed more than once, the DPC then suggested that every instance of processing was motivated by the need for security. The Court of Appeal dissected the DPC’s broken logic whilst also showing how it was inconsistent with the GDPR advice given by an influential group of EU data protection regulators. The judge found Doolin’s data had, in fact, been processed several times. This is because filming somebody is not the same as watching the footage at a later stage, and both are distinct from the managers tabulating Doolin’s movements to calculate when he entered and left the tea room. But this is just a prelude to what the judge described as “the critical error in the DPC’s Decision”, which saw the DPC conclude the data was processed for security reasons when it was “manifestly for a different purpose”.

Data protection laws have been passed in many countries. The public are always told these laws are meant to protect ordinary people from potential abuses of data collected about them. So why would agencies like the DPC decide not to help an individual who has made a strong and straightforward complaint about the unlawful use of data collected about him? Why would the DPC then choose to pour resources into a pointless legal battle where they presented inconsistent explanations of their initial decision instead of admitting to mistakes that impartial lawyers had no difficulty in identifying? I believe the problem is two-fold:

  • When you look past the grand promises made by politicians, bodies like the DPC have nowhere near the resources required to review all the infringements of data protection law that a well-informed public can present to them. Ambitious career civil servants might have the confidence to argue for a doubling or even a trebling of budgets, but none would benefit by admitting they really need a ten-fold or hundred-fold increase in resources just to keep the promises that politicians have already made. So it is in their interests to find quick excuses to disregard many legitimate complaints about data protection violations in order to manage workloads and focus resources on a smaller number of high-profile cases that attract the attention of politicians and journalists.
  • However, once you create an incentive to save resources by rejecting legitimate complaints, you also create an incentive to waste resources by defending bad decisions. If data protection authorities treated cases like Doolin’s with the respect they deserve, they might soon be overwhelmed by all the other complaints that would no longer be dismissed prematurely. Rather than admitting you are failing – as suggested by the DPC only resolving 2 percent of those complaints which are so important that they have EU-wide significance – it is tempting to try to save face by grinding down the individual who lodged the original complaint. Over six years have passed since Doolin’s data was abused, yet the DPC persisted in arguing that his privacy had not been violated. There will not be many instances where an individual’s privacy complaint can be pursued at such length despite the bitter opposition of the taxpayer-funded organization that is supposed to protect his privacy rights.

The press and politicians like to talk about privacy violations by big corporations for understandable reasons, but the goals of data protection law should not be limited to addressing violations that affect millions of people at a time. An individual’s life can be turned upside down by the abuse of their data, and such abuses might be the fault of another individual, or a small business, or even a hospice. The law is not fit for purpose if it is only enforced in situations where a data protection agency can levy enormous fines on a multinational business. Priorities should be set according to the harm to the victim, not the size of the penalty that might be collected from the transgressor. I suspect the majority of data protection agencies see the world the other way around. They are run by career civil servants, many of whom are lawyers, and few of whom have any technological expertise, so they know their success will be measured by how they handle attention-grabbing interactions with big business and not by how they protect specific individuals like Doolin.

It appears to me that the enforcement of privacy laws is in a deeply unhealthy state worldwide. Promises keep being made to the public, but there is little interest in going back to check if they were kept in practice. The people trusted to enforce the law are often motivated to suppress its enforcement to make themselves appear more effective than they are. They are like a police force which claims to solve almost every murder, but is based in a city where an extraordinarily high number of people are recorded as missing without explanation. Any enforcement body which is trusted to measure its own performance will be able to find ways to manipulate results so it appears effective even when it is desperately failing. The easiest way to do that is to simply refuse to count many instances of illegality by pretending no law has been broken.

Our societies need to acknowledge the essential problems with data protection enforcement now, before we build yet more rules, objectives and targets upon an unsafe foundation. For example, the latest fashion amongst legislators is to worry that combining CCTV with facial recognition technology may lead to unfair outcomes for black people. I believe this concern has merit but is given too much prominence relative to other issues that must be dealt with first. Facial recognition technology can be flawed, but we must first recognize all the simpler ways that data can be collected and used unfairly against minorities. For example, US law enforcement is known to have prosecuted many criminals based on data that was originally collected to prevent terrorism. The subjects of these prosecutions were more likely to be black than white.

US law enforcement engaged in the same essential conjuring trick as the Dublin hospice mentioned above, but they did it on a much larger scale. They said data was needed for one purpose, in the expectation that the community would agree to its being used for that purpose, but then used it for a different purpose. Whilst some people might also agree with its use for that second purpose, the community as a whole was not given the opportunity to withhold their consent. Until we fix fundamentals like these, so that the use of data is always consistent with the limits agreed with the community, individuals and minorities will always be at greater risk because they do not have equal resources to fight the institutions that abuse them.

You can read the Court of Appeal decision in Doolin vs The Data Protection Commissioner by clicking here.

Eric Priezkalns
http://revenueprotect.com

Eric is the Editor of Commsrisk. Look here for more about the history of Commsrisk and the role played by Eric.

Eric is also the Chief Executive of the Risk & Assurance Group (RAG), an association of professionals working in risk management and business assurance for communications providers. RAG was founded in 2003 and Eric was appointed CEO in 2016.

Previously Eric was Director of Risk Management for Qatar Telecom and he has worked with Cable & Wireless, T‑Mobile, Sky, Worldcom and other telcos. He was lead author of Revenue Assurance: Expert Opinions for Communications Providers, published by CRC Press.
