London's Metropolitan Police Service reports one arrest every 35 minutes during its six-month facial recognition camera trial, but civil liberties groups raise concerns about privacy and accuracy despite claims of minimal false positives.
London's Metropolitan Police Service (MPS) is touting its six-month trial of static live facial recognition (LFR) cameras as a major success, claiming the technology helped secure an arrest every 35 minutes. Between October 2025 and March 2026, the 24 operations conducted using the fixed cameras resulted in 173 arrests, including individuals suspected of serious crimes and those who had evaded law enforcement for decades.
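The headline "arrest every 35 minutes" figure can be sanity-checked against the arrest total. A minimal sketch, using only the numbers reported above (173 arrests, 35 minutes per arrest, 24 operations); the per-operation average is an inference, not a figure the Met has published:

```python
# Rough sanity check of the Met's headline figures, using the values
# quoted in the article: 173 arrests, one arrest every 35 minutes.
arrests = 173
minutes_per_arrest = 35
operations = 24

total_minutes = arrests * minutes_per_arrest
total_hours = total_minutes / 60
print(f"Implied camera operating time: {total_hours:.0f} hours")  # ~101 hours

# Spread evenly across the 24 reported operations (an assumption --
# actual deployment lengths were not disclosed):
print(f"Implied average per operation: {total_hours / operations:.1f} hours")
```

In other words, the "every 35 minutes" framing implies only around a hundred hours of active matching across the six months, a useful scale to keep in mind when weighing the trial's results.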
The trial, conducted in Croydon with two permanently installed cameras at either end of the town's High Street, targeted individuals wanted by the courts. Among those arrested was a 36-year-old woman who had been wanted since 2004 for failing to appear in court on an assault charge. A 31-year-old man wanted for voyeurism and a 41-year-old man suspected of rape in November 2025 were also apprehended.
"We will continue using static cameras in Croydon as part of our regular live facial recognition deployments, which play a vital part in keeping London safe," stated Lindsey Chiswick, national and Met lead for LFR. "These results show why live facial recognition is such a powerful tool when it's used carefully, openly, and in the right places."

The Legal Framework
The MPS's use of facial recognition technology operates within a complex legal landscape. In the UK, the primary legislation governing police powers and data protection includes the Police and Criminal Evidence Act 1984, the Data Protection Act 2018, and the Human Rights Act 1998. These laws set parameters for how police can collect and process biometric data.
Notably, the use of facial recognition by police has been subject to legal challenges. In August 2020, the Court of Appeal ruled in R (Bridges) v Chief Constable of South Wales Police that the force's use of automated facial recognition was unlawful, finding that it breached privacy rights under Article 8 of the European Convention on Human Rights and that the legal framework left officers too much discretion over who could be targeted and where the technology could be deployed. This decision established important precedents for the use of biometric surveillance in law enforcement.
The UK's Information Commissioner's Office (ICO) has also issued guidance on live facial recognition, emphasizing that organizations must comply with data protection principles and conduct a Data Protection Impact Assessment (DPIA) before deploying such technology.
Accuracy and Privacy Concerns
Despite the positive results cited by the MPS, civil liberties groups remain deeply concerned about the technology. Big Brother Watch, a prominent privacy advocacy organization, has labeled the permanent Croydon installations "chilling infrastructure" and regularly describes LFR as "dystopian."
The MPS reported that more than 470,000 individuals were scanned during the trial, with only one false positive registered. Critics, however, point to earlier research showing higher error rates, particularly for people of colour and women. An independent review of the Met's earlier LFR trials by University of Essex researchers found that a substantial share of the matches generated were inaccurate, and wider testing of facial recognition algorithms has documented higher false identification rates for Black individuals than for white individuals.
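The trial's accuracy claim can be restated as a rate. An illustrative calculation using only the figures quoted above (470,000 faces scanned, one false positive); note what these figures do and do not allow you to derive:

```python
# Illustrative false-alert rate implied by the trial figures quoted
# in the article: 470,000 faces scanned, 1 false positive.
scanned = 470_000
false_positives = 1

rate = false_positives / scanned
print(f"Implied false-positive rate: {rate:.7f}")          # ~0.0000021
print(f"Roughly {rate * 1_000_000:.1f} per million scans")

# Caveat: this is the error rate per person *scanned*, not the share
# of *alerts* that were wrong. Without the total number of alerts
# generated, the system's precision cannot be derived from these
# figures alone -- which is one reason critics ask for fuller data.
```

This distinction matters because a system scanning large crowds can post an impressively low per-scan error rate while still producing a meaningful proportion of wrongful alerts among the people it actually flags.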
The case of Shaun Thompson, represented by Big Brother Watch in a High Court case, illustrates the potential for misuse. Thompson, an anti-knife crime campaigner, was wrongly flagged by LFR technology and subjected to an intrusive stop and search despite providing officers with his passport and bank cards. While the group lost its legal challenge against the Met's LFR use, Thompson received a settlement from the police.
Impact on Civil Liberties
The deployment of fixed facial recognition cameras raises significant concerns about the erosion of civil liberties in public spaces. Unlike mobile LFR units that are deployed temporarily with public notice, the fixed cameras in Croydon represent a more permanent surveillance infrastructure that operates with minimal public awareness.
Privacy advocates argue that such technology creates a "chilling effect" on public behavior, as individuals may self-censor or avoid certain areas knowing they are being constantly monitored. There are also concerns about the potential for "function creep," where technology initially deployed to prevent serious crime gradually expands to cover minor offenses.
The collection of biometric data on such a scale also raises questions about data security and retention. While the MPS claims the biometric data is deleted after matches are processed, there is limited independent verification of these claims, and the potential for data breaches remains a significant concern.
Regulatory Response and Future Implications
The UK government has thus far taken a permissive approach to police use of facial recognition technology, though legal challenges continue to shape the regulatory landscape. The Home Office has funded the development of standards for police use of facial recognition, but these remain voluntary rather than mandatory.
In contrast, the European Union has moved toward stricter regulation through its Artificial Intelligence Act, which restricts real-time remote biometric identification in publicly accessible spaces for law enforcement to narrowly defined circumstances and classifies other uses of facial recognition as "high-risk," subject to transparency, data governance, and human oversight requirements.
For UK citizens, the expanding use of facial recognition by police represents a significant shift in the balance between security and privacy. As the technology becomes more prevalent, there is growing demand for stronger legal safeguards, including independent oversight, transparency requirements, and limits on data retention.
The MPS's announcement that it will continue using the fixed cameras in Croydon suggests that facial recognition is becoming a permanent feature of London's surveillance infrastructure. As this technology becomes more widespread, the debate over its appropriate use will undoubtedly intensify, with civil liberties groups continuing to advocate for stronger protections against potential abuse.
Privacy advocates argue that while the technology may help solve crimes, its long-term implications for democratic society and individual freedoms demand careful consideration and robust regulatory oversight, safeguards they say are currently lacking in the UK's approach to police use of facial recognition.
