London’s Met announced the first use of live facial‑recognition (LFR) cameras on a political rally. A look at the claimed safety benefits, the underlying technology, and why the rollout raises serious civil‑liberty questions.
## What the Met announced
Tomorrow the Metropolitan Police will activate live facial‑recognition (LFR) cameras on lampposts in Camden during the “Unite the Kingdom, Unite the West” rally. The deployment is described as a pilot of the same system that the Met ran on static street furniture in Croydon from October 2025 to March 2026. A deputy assistant commissioner said the move is justified by “intelligence indicating a likely threat to public safety.”
The announcement also notes that a separate pro‑Palestinian march on the same day will not be subject to LFR, prompting accusations of a two‑tier approach to public safety.
## The technology behind the claim
Live facial‑recognition systems consist of three parts:
- Image capture – high‑resolution cameras mounted on existing infrastructure (lampposts, traffic lights, etc.).
- Feature extraction – a neural network (often built on a ResNet‑50 or similar backbone) converts each face into a compact embedding, commonly 128‑ or 512‑dimensional.
- Matching – the embedding is compared against a watchlist stored in a police database, typically using cosine similarity with a threshold tuned to balance false positives against false negatives.
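The matching step can be sketched in a few lines of Python. This is an illustrative toy under stated assumptions, not the Met's or any vendor's implementation: the `0.6` threshold, the dictionary watchlist, and the function names are all hypothetical.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe: np.ndarray, watchlist: dict, threshold: float = 0.6):
    """Compare a probe embedding against every watchlist entry.

    Returns (person_id, score) for the best match if it clears the
    threshold, otherwise (None, best_score).
    """
    best_id, best_score = None, -1.0
    for person_id, reference in watchlist.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= threshold:
        return best_id, best_score
    return None, best_score
```

In a real deployment the threshold is the key policy lever: lowering it catches more watchlisted faces but flags more innocent passers‑by, which is why the false‑positive review burden discussed below matters.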
The Met’s Croydon pilot used IDVision, a Clearview‑style vendor. The system runs on on‑premises GPUs that process up to 30 fps per camera, sending only similarity scores to a central console; raw images are supposed to be deleted after matching.
## Reported performance vs. reality
| Metric (Croydon pilot) | Reported by Met | What the math shows |
|---|---|---|
| Faces scanned | 470,000 | ≈2,600 faces per day over the roughly six‑month pilot |
| Arrests | 173 | 1 arrest per ≈2,717 scans |
| Crime drop claim | 10.5 % overall, 21 % drop in violence against women and girls | No control group disclosed; the drop coincides with a broader seasonal decline in recorded offences |
The Met highlighted a “one‑arrest‑every‑35‑minutes” headline, but the underlying ratio (≈0.04 % of scans leading to an arrest) is more telling: roughly 9,996 of every 10,000 people whose faces are captured are not linked to any criminal activity. In practice, the system acts as a mass‑surveillance filter whose alerts include many false positives that officers must review manually.
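The ratios above follow directly from the two reported figures and can be checked with trivial arithmetic:

```python
# The two numbers reported by the Met for the Croydon pilot.
faces_scanned = 470_000
arrests = 173

scans_per_arrest = faces_scanned / arrests   # ≈ 2,717 scans per arrest
arrest_rate = arrests / faces_scanned        # ≈ 0.00037, i.e. ~0.04 % of scans
non_match_share = 1 - arrest_rate            # ≈ 99.96 % of scans with no arrest

print(f"{scans_per_arrest:.0f} scans per arrest")
print(f"{arrest_rate:.4%} of scans lead to an arrest")
```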
## Legal and procedural gaps
- No parliamentary authorisation – Live facial‑recognition has never been the subject of primary legislation in the UK. Police forces rely on internal policies, which are not subject to the same scrutiny as a statutory framework.
- Data‑retention concerns – The Met says images are deleted “moments later,” but audits of similar systems in the US have found logs of raw frames retained for weeks. Without an independent audit trail, compliance is hard to verify.
- Selective deployment – Applying LFR to one rally but not another raises equal‑protection questions. The current policy does not define clear criteria for when a protest is deemed a “threat.”
## Practical implications for demonstrators
- Chilling effect – Knowing a camera will compare your face to a watchlist may deter participation, even if you have no criminal record.
- Mistaken identity – False‑positives can lead to temporary detentions, questioning, or being placed on a watchlist without due process.
- Data permanence – Even if the system deletes images, the fact that a biometric record was created is itself a data point that could be subpoenaed later.
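The mistaken‑identity risk is fundamentally a base‑rate problem: when watchlisted individuals are a tiny fraction of a crowd, even an accurate matcher generates mostly false alarms. The numbers below are illustrative assumptions, not Met figures, chosen only to show the shape of the effect.

```python
# All four inputs are assumptions for illustration, not published Met data.
crowd = 100_000              # faces scanned at one event
on_watchlist = 10            # watchlisted people actually present
true_positive_rate = 0.90    # matcher flags 90% of watchlisted faces
false_positive_rate = 0.001  # 0.1% of innocent faces flag anyway

true_alerts = on_watchlist * true_positive_rate               # 9 genuine alerts
false_alerts = (crowd - on_watchlist) * false_positive_rate   # ≈ 100 false alerts
precision = true_alerts / (true_alerts + false_alerts)        # ≈ 8%

print(f"Roughly {precision:.0%} of alerts point at a watchlisted person")
```

Under these assumed numbers, more than nine in ten alerts concern someone who is on no watchlist at all, which is why manual officer review and due‑process safeguards carry so much weight.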
## How this fits into the broader trend
The shift from mobile vans to fixed infrastructure mirrors deployments in other cities with large fixed police camera networks, such as Chicago. Fixed cameras are harder for the public to spot, can operate continuously, and remove the logistical cost of a police‑run van fleet. However, they also embed surveillance into the urban fabric, making it harder for citizens to avoid capture.
## What to watch next
- Parliamentary inquiry – A cross‑party committee is expected to request evidence on the Croydon pilot in the coming weeks.
- Legal challenges – Civil‑rights groups have already filed a judicial review claim arguing that the deployment breaches Article 8 of the European Convention on Human Rights (right to privacy).
- Technical audits – Independent researchers have begun requesting the Met’s source code under the UK’s Freedom of Information Act; the response will indicate how transparent the system truly is.
## Bottom line
The Met’s claim that LFR “keeps London safe” rests on a modest arrest‑to‑scan ratio and an unverified crime‑reduction figure. The technology itself works as advertised—matching faces against a database in real time—but the policy framework, oversight mechanisms, and impact on democratic participation remain under‑developed. Until clear statutory limits and transparent audit processes are in place, each new deployment should be treated as a test of how far biometric surveillance can expand, not as evidence of a proven public‑safety tool.

For further reading, see the Met’s official report on the Croydon pilot.
