Police use of facial recognition ruled unlawful in the UK
The groundbreaking judgment finds the surveillance tool breaches rights
Use of facial recognition technology for law enforcement purposes has been deemed unlawful because it violates the fundamental right to privacy, according to a landmark legal ruling.
The Court of Appeal has declared that the technology, which is capable of scanning 50 images a second and matching them with a ‘watchlist’ that can hold up to 2,000 images of individuals, violates human rights, data protection laws, and equality laws.
The long-running trials conducted by South Wales Police, and by other forces in the UK, must come to an end as a result.
The case was brought by Ed Bridges, backed by the human rights group Liberty, after he was scanned by the AFR Locate system deployed by South Wales Police between May 2017 and April 2019.
Bridges had initially argued that the system recorded his image, albeit deleting it almost immediately afterwards, and that this violated his right to respect for private life under Article 8 of the European Convention on Human Rights, among other crucial legislation.
The High Court, to which he initially applied for a judicial review of the use of the technology, ruled against him in September 2019. Following a challenge to that judgment, however, the Court of Appeal agreed with Bridges’ appeal on three of the five grounds, ruling the use of the system unlawful.
“This judgment is a major victory in the fight against discriminatory and oppressive facial recognition,” said Liberty lawyer Megan Goulding.
“The Court has agreed that this dystopian surveillance tool violates our rights and threatens our liberties. Facial recognition discriminates against people of colour, and it is absolutely right that the Court found that South Wales Police had failed in their duty to investigate and avoid discrimination.
“It is time for the government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it needs to be banned.”
South Wales Police had scanned approximately 500,000 faces, according to the Court of Appeal, attempting to match them against a watchlist of between 400 and 800 individuals at any one time. These included people wanted on warrants, those who had escaped from custody, and suspects in crimes, as well as people who may need protection.
The court said there were no clear safeguards governing who could be included on a watchlist or where the technology could be deployed, adding that too much discretion was left to individual police officers.
The court also noted in its judgment that South Wales Police had never sought to establish, either by itself or through independent verification, that the software used did not exhibit an unacceptable bias on grounds of race or sex.
The technology has come under intense scrutiny for its use in law enforcement, although the backlash has largely centred on the US rather than the UK, especially in light of the resurgence of the Black Lives Matter movement.
There have been concerns that the use of facial recognition by police forces leads to demonstrable and damaging racial discrimination that goes unchecked. This led a host of companies to suspend or stop selling the technology, at least until the fundamental issues at the heart of the race relations debate were addressed.
“I’m delighted that the Court has agreed that facial recognition clearly threatens our rights,” said Ed Bridges.
“This technology is an intrusive and discriminatory mass surveillance tool. For three years now South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance."
The use of facial recognition technology in the UK is at a much less advanced stage, with only a small handful of police forces conducting trials. Concerns, however, have previously prompted the Information Commissioner’s Office (ICO) to compel police forces to examine the data protection risks and eradicate racial bias from the software, or risk data protection violations and subsequent enforcement action.
Keumars Afifi-Sabet is a writer and editor who specialises in the public sector, cyber security, and cloud computing. He first joined ITPro as a staff writer in April 2018 and eventually became its Features Editor. Although a regular contributor to other tech sites in the past, these days you will find Keumars at LiveScience, where he runs its Technology section.