The UK’s equality watchdog has called for the use of live facial recognition technology (LFR) to be halted amid concerns over discrimination and free expression.
The Equality and Human Rights Commission (EHRC) has said the technology must have its impact independently verified.
LFR was first piloted by police in south Wales before being rolled out across London by the Metropolitan Police earlier this year.
In a report to the United Nations, the EHRC warns the technology may “infringe upon privacy rights and result in self-censorship, with a consequent chilling effect on freedom of expression and association”.
Evidence also indicates that the cameras “disproportionately misidentify Black people and women, and therefore operate in a potentially discriminatory manner”.
Rebecca Hilsenrath, the EHRC’s chief executive, said: “With new technology comes new opportunities and new, more effective ways of policing crimes and protecting UK citizens.”
She added: “But these also bring new challenges and new risks which we need to meet in order to use any such technology effectively for the good of the community as a whole.”
Met Commissioner Dame Cressida Dick defended the use of the technology during a speech last month.
She said: “In an age of Twitter and Instagram and Facebook, concern about my image and that of my fellow law-abiding citizens passing through live facial recognition and not being stored feels much, much smaller than the public’s vital expectation to be kept safe from a knife through the chest.”
“If, as seems likely, algorithms can assist in identifying patterns of behaviour by those under authorised surveillance … that indicate … a terrorist attack [or] a potential serial rapist or killer … then I think almost all citizens would want us to use it.”
Human rights group Liberty described her comments as “dangerous and misleading”.
The EHRC’s 95-page report covers a broad swathe of areas, ranging from immigration detention to the Grenfell Tower disaster.