The riots in Dublin city centre on November 23 precipitated a heavy fallout on multiple fronts, from the social to the legal to the political.
The most immediate political effect was the escalation of pressure on Minister for Justice Helen McEntee from mild to extreme in the space of scarcely 10 hours.
Ms McEntee, who has held her portfolio since June of 2020, had a relatively charmed ministerial life up until last summer, when a spate of assaults on foreign tourists in Dublin led to accusations that the city centre was an unsafe place to visit.
The minister has always maintained this is not the case, and continued to do so in the wake of the miserable scenes that befell Dublin in the aftermath of the stabbing of three children and their carer outside Gaelscoil Cholaiste Mhuire on that fateful afternoon.
Nevertheless, the minister is enough of a realist to know that her job, and likely her future political ambitions, hinge on what happens next.
In such a scenario, the political playbook is generally to find a cause — and that’s what brings us to facial recognition technology (FRT) and its use by gardaí. The minister has now made clear this is an immediate legislative priority for her.
On the face of it, the logic of that decision would seem sound — An Garda Síochána is currently trawling through some 6,000 hours’ worth of CCTV footage of Dublin from the night in question, when hundreds of masked rioters rampaged along O’Connell Street, leaving burning buses and looted shops in their wake.
The minister argues that anything that can make the onerous task of parsing that footage simpler should be utilised.
This perhaps ignores the fact that most of the night’s rioters had gone to some trouble to cover their faces, and that isn’t the only practical problem facing FRT in an Irish context.
Regardless, Helen McEntee has made clear that her primary approach to staunching the current political wound she’s carrying is to reignite a battle she first kickstarted in the summer of 2022, when she told the Garda Representative Association (GRA) that she would be introducing FRT in the future.
Details at the time were sketchy. In the heel of the hunt it emerged that the department’s plan was unworkable, given the intention was to append FRT to the then Garda Recording Devices Bill — the law, now passed, which will see bodycams included as standard on Garda uniforms from early next year.
That was always going to be a difficult sell given the bill in question had already passed through pre-legislative scrutiny.
In the end, the Green Party blocked the action (TD Patrick Costello described FRT at the time as “riddled with racial bias and privacy issues”) and the technology in an Irish context stalled.
And there it might have rested had not rioters brought chaos to Dublin city centre.
In the aftermath, Ms McEntee made clear that specific, standalone legislation to enable FRT for gardaí is expected to be in place by early next year.
Again, few specifics were given other than that the technology would be used to aid in investigating incidents like the unrest in Dublin.
To achieve this goal, Fine Gael is going to have to go toe to toe with the Greens once more.
This time, however, it appears it is up for the fight, since the alternative of a resigning justice minister, and the chaos or election that could bring, would probably not be especially palatable.
Different forms of FRT have been around for much of the last decade, but fundamentally the technology involves a software programme comparing images, trying to establish if any faces are present, and then trying to get a close match with an existing image or gallery of images.
There are three basic applications of FRT — face detection, face classification (gender, age, skin tone etc), and facial recognition/verification.
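The last of those, one-to-one verification, can be sketched in a few lines. The sketch below is purely illustrative and not any vendor’s algorithm: it assumes a separate model has already reduced each face to a numeric “embedding” vector, and the vectors and the 0.8 threshold here are invented. What it shows is why the output is a similarity score rather than a yes/no answer.

```python
import math

# Illustrative only: real systems derive these vectors ("embeddings") from a
# neural network; the numbers and the threshold below are invented.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(probe, reference, threshold=0.8):
    # Face verification: does the probe face match this one reference face?
    # The result is a similarity score, never a certainty; the yes/no part
    # is just a threshold applied to that score.
    score = cosine_similarity(probe, reference)
    return score, score >= threshold

probe = [0.12, 0.87, 0.45, 0.31]      # embedding from a CCTV still (invented)
reference = [0.10, 0.90, 0.40, 0.35]  # embedding from a held image (invented)
score, match = verify(probe, reference)
print(f"similarity={score:.3f} match={match}")
```

Everything downstream, including any prosecution, rests on where that threshold is set: lower it and false matches multiply, raise it and genuine matches are missed.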
Facial recognition is an example of a biometric technology, one dependent upon a person’s distinguishing physical characteristics.
Biometrics have proven troublesome in the past in terms of state-sponsored initiatives, as observers of the Public Services Card saga will attest. Such information is protected as special category data under the GDPR, meaning its processing is prohibited save for certain exceptions, and it has to be legislated for specifically.
FRT is used across society in various forms, from phones opening for their owners via selfie-verification to ports of entry verifying someone’s face against their biometric passport, to law enforcement.
In terms of how the gardaí will use the technology, software would be used to detect faces in video footage or other visual media, and then those faces would be compared to a gallery of images — in this case a database of facial images of the population.
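That one-to-many search can be sketched in the same spirit. Again, this is an illustration rather than the Garda system: the names, vectors, and threshold are invented, and the embeddings are assumed to come from some upstream face-detection step.

```python
import math

# Illustrative one-to-many identification: compare a probe face embedding
# against a gallery (e.g. legally held custody images) and return the
# best-scoring candidate, if any clears the threshold. All names, vectors
# and the threshold are invented.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(x * x for x in b)))

def identify(probe, gallery, threshold=0.85):
    best_id, best_score = None, 0.0
    for person_id, embedding in gallery.items():
        score = cosine(probe, embedding)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score < threshold:
        # No candidate clears the bar: most of the population
        # is not in any garda-held gallery to begin with.
        return None, best_score
    return best_id, best_score

gallery = {
    "suspect_A": [0.9, 0.1, 0.2],
    "suspect_B": [0.1, 0.8, 0.5],
}
print(identify([0.88, 0.12, 0.25], gallery))
```

The point the sketch makes concrete is that the search can only ever return someone already in the gallery, which is where the next problem arises.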
Here we hit our first stumbling block — the gardaí have access to a DNA database for those with criminal records, but what database of the country’s faces exists?
The only one that immediately springs to mind is that of the Public Services Card, and should that database be pressed into service, civil liberties organisations are likely to be less than impressed.
The answer, according to a department spokesperson, lies in images and video that have already been gathered and are legally held by the gardaí such as the mugshots of detained suspects.
How comprehensive a solution that will be remains to be seen, given the vast majority of the population has never come to the notice of gardaí.
There are other issues with the technology, unfortunately, ones which are not new and which were already trawled through when FRT was first considered by Government in 2022.
For starters, the tech in its current guise is by its very nature imprecise. It seeks to find matches, but all it can do is deliver a probability of a match.
Given the speed at which modern-day tech moves, FRT is something which some day may be absolutely perfected. But as things stand, it isn’t there yet.
Secondly, more problematically, it has a bad habit of identifying people incorrectly.
Worse again, it has a habit of misidentifying people of colour in particular.
In 2020, Detroit Police Department Chief James Craig admitted that “if we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify”, though he added that using the tech exclusively without any form of human oversight was “against our current policy”.
None of this has stopped other police forces from using FRT — it’s in place in at least 18 European countries, and across wide parts of the US.
However, its use is also banned in other places, including San Francisco, ordinarily considered the tech capital of America, on the back of privacy concerns.
A third issue is that the use of FRT is likely going to be a legal minefield.
Ireland has already fallen out of step with European law regarding tech/privacy projects on multiple occasions, including with the PSC.
In 2014, our data retention laws were found to be incompatible with European law and then not amended for the guts of a decade, leaving thousands of convictions, including that of architect-murderer Graham Dwyer, in limbo.
The European Parliament has been considering its own legislation on FRT for many months now.
When it will be finalised is anyone’s guess, though the odds are decent that it won’t be before next year’s European elections.
In the absence of that law, the European Data Protection Board (EDPB — the oversight body for the bloc’s various privacy regulators, including our own Data Protection Commission) provided a series of guidelines last April for prospective laws in EU countries enabling the use of FRT in law enforcement.
A key point in those guidelines is that the technology should only be employed in cases of “strict necessity”, that is where there is literally no other way to get a job done.
That definition does not apply, for example, to analysing CCTV footage of a riot using facial recognition software — a job which human gardaí have been doing for years — at least not in a way that could be used officially to get a conviction.
Interestingly, frontline gardaí themselves (notwithstanding Commissioner Drew Harris, who is also under a great deal of pressure at present) don’t seem overly pushed about FRT: rank-and-file representative body the GRA declined to comment on the tech, with one source saying “it’s not something we’ve discussed”.
Officers’ body the AGSI is more supportive, with general secretary Antoinette Cunningham saying it supported Commissioner Harris’s belief in the technology’s usefulness for operational policing, adding that there are no concerns “that can’t be addressed in sound robust legislation on its use”.
While a new law does now appear to be on the way, its passage isn’t likely to be smooth.
Labour’s Aodhán Ó Ríordáin was quick out of the blocks after Ms McEntee announced she would be prioritising FRT once more, stating there has been a “consistent pattern” of announcing legislation “to distract people from the key issues of a lack of resources for frontline workers like gardaí”.
Olga Cronin, senior policy officer with the Irish Council for Civil Liberties (ICCL), said that the new law’s mooted aim of widening the scope of cases where gardaí can make use of FRT equates to “mission creep before we even have a mission”.
“FRT is inherently flawed and dangerous. It has been trained on pale, male faces and so it has a racial element.
“People have been misidentified, detained, questioned wrongfully because of it,” she said.
“Its use will mean we’re entering a 100% surveillance state.
“You won’t be able to go where you want to go without being watched.
“And what’s worse, it won’t do what the minister wants it to do. It wouldn’t have stopped the riot that happened.
“Most of those guys had their faces covered. Identification wasn’t the issue, the ability to stop a riot was the issue. The proposed benefits do not outweigh the risks,” Ms Cronin added.
Privacy solicitor Simon McGarr meanwhile said that far from making life easier for the gardaí, FRT would simply serve to “multiply the burden on policing”.
“It is akin to running a hotline where 99% of the tips turn out to be false, but every one of them has to be investigated,” he said.
“We saw with ChatGPT that these kinds of technologies are quite capable of confidently asserting something that is not true.
“It’s one thing when such an assertion results in a citation in a college essay of a source that doesn’t exist.
“It’s a great deal more serious when it can lead to potential criminal prosecutions,” he added.
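Mr McGarr’s hotline analogy is, at bottom, the base-rate problem, and a rough calculation makes it concrete. All the figures below are invented for illustration; the point is that when almost everyone scanned is innocent, even a very accurate matcher generates mostly false leads, because the false positives are drawn from the huge innocent pool.

```python
# Back-of-the-envelope base-rate illustration; every number is invented.
def lead_quality(population, offenders, hit_rate, false_positive_rate):
    true_leads = offenders * hit_rate                         # real targets found
    false_leads = (population - offenders) * false_positive_rate  # innocents flagged
    precision = true_leads / (true_leads + false_leads)       # share of flags that are genuine
    return true_leads, false_leads, precision

# Say 500 rioters among 100,000 faces caught on footage, with a matcher
# that finds 99% of real targets but wrongly flags 1% of everyone else.
true_leads, false_leads, precision = lead_quality(100_000, 500, 0.99, 0.01)
print(f"{true_leads:.0f} genuine leads, {false_leads:.0f} false leads "
      f"({precision:.0%} of flags are genuine)")
```

On those invented figures, two out of every three flags would point at an innocent person, and each would still have to be investigated by a human garda.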