An activist who was filmed using controversial facial recognition technology has lost the world’s first ever legal challenge against its use by police.
Campaigner Ed Bridges argued that his right to privacy had been breached when his face was scanned by South Wales Police cameras while he was out shopping in Cardiff.
But High Court judges on Wednesday (4 September) ruled that, while this mass surveillance tool interferes with privacy rights, it is not unlawful.
Bridges, represented by human rights charity Liberty, has said that he will appeal the court’s decision.
A judgment issued by Lord Justice Charles Haddon-Cave and Mr Justice Jonathan Swift at Cardiff High Court dismissed Bridges’ challenge “on all grounds”.
“At the heart of this case lies a dispute about the privacy and data protection implications of [Automated Facial Recognition] (AFR),” it reads.
The judges concluded that they “are satisfied both that the current legal regime is adequate to ensure the appropriate and non-arbitrary use of AFR Locate” and that “South Wales Police’s use to date of AFR Locate has been consistent with the requirements of the Human Rights Act, and the data protection legislation”.
More than 500,000 faces scanned
Image Credit: Pixabay.
The court’s key findings include:
- South Wales Police had complied with equality laws.
- The force’s use of AFR meets the requirements of the Human Rights Act.
- The current legal framework – the Data Protection Act 2018, the Surveillance Camera Code of Practice and the force’s local policies – is adequate to govern when and how AFR can be used.
- The force was processing personal data but in a lawful manner that met the conditions set out in the 1998 Data Protection Act.
- The court said it was “open to question” whether South Wales Police’s policies comply with the 2018 Data Protection Act – but declined to rule on the matter as further guidance is expected from the Information Commissioner’s Office.
Reacting to the decision, Bridges said: “South Wales Police has been using facial recognition indiscriminately against thousands of innocent people, without our knowledge or consent.
“This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance.”
Liberty has set up an online petition calling on Home Secretary Priti Patel to ban the technology outright – gaining more than 5,000 signatures so far.
AFR technology works by using special cameras to scan the structure of faces in a crowd of people.
It creates a digital image of a person which, using algorithms, is compared with a database of people on the police’s “watch list” – individuals who may have been arrested previously or reported as missing. When a match is found, officers in the area are alerted.
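In broad terms, systems like this reduce each face to a numeric feature vector (an “embedding”) and flag a match when a live scan falls within some distance threshold of a watch-list entry. The sketch below is a simplified illustration of that matching step only – the function names, toy vectors and threshold are invented for this example, and real deployments use high-dimensional embeddings produced by a neural network.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def find_matches(scan, watch_list, threshold=0.6):
    """Return names of watch-list entries whose stored vector lies
    within `threshold` of the live scan's vector."""
    return [name for name, vec in watch_list.items()
            if euclidean(scan, vec) < threshold]

# Illustrative three-dimensional vectors only -- real systems compare
# embeddings with hundreds of dimensions.
watch_list = {
    "person_a": [0.10, 0.90, 0.30],
    "person_b": [0.80, 0.20, 0.50],
}
live_scan = [0.12, 0.88, 0.31]

print(find_matches(live_scan, watch_list))  # close to person_a only
```

The choice of threshold is where false positives come from: set it too loosely and many innocent passers-by fall within range of someone on the list, which is the failure mode the misidentification figures below describe.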
South Wales Police is believed to have used facial recognition in public spaces more than 50 times since May 2017, backed by funding from the Home Office, according to Liberty.
More than 500,000 people are reported to have had their faces scanned without their knowledge or consent at events including the 2017 Champions League Final, the Six Nations rugby matches and Ed Sheeran concerts.
A Freedom of Information request, published by civil liberties campaign group Big Brother Watch earlier this year, revealed that 96 percent of facial recognition matches misidentified innocent members of the public when the Metropolitan Police deployed the same technology eight times between 2016 and 2018.
Not The End Of ‘Wider Debate’
Image Credit: Pixabay.
South Wales Police Chief Constable Matt Jukes told the BBC the decision was welcome but not the end of the “wider debate”.
“I recognise that the use of AI and face-matching technologies around the world is of great interest and, at times, concern.
“So, I’m pleased that the court has recognised the responsibility that South Wales Police has shown in our programme. With the benefit of this judgment, we will continue to explore how to ensure the ongoing fairness and transparency of our approach.”
Want to learn more on this topic?
- Read why campaigners say facial recognition tech needs regulation.
- Take a look at reports of gender and racial biases in facial recognition technology.
- Learn more about digital freedom and your rights.