The UK Ministry of Defence's £86 million deployment of AI-equipped battlefield gear accelerates targeting decisions but offers little transparency about civilian data safeguards under the UK GDPR.

The British Army is deploying £86 million worth of AI-enabled battlefield equipment designed to accelerate combat decisions, raising significant questions about civilian data protection and compliance with privacy regulations. The Dismounted Data System (DDS) – including AI-capable radios, headsets, and tablets – processes voice and visual data to identify enemies faster, reducing targeting decisions from hours to minutes according to the Ministry of Defence (MoD).
Part of Project ASGARD, the system integrates weapon systems, surveillance equipment, and automated technology to "destroy enemies far beyond the horizon." Field-tested in Estonia, the technology aims to provide soldiers with "precise information on surroundings" while reducing battlefield distractions. However, the MoD has not disclosed how the system complies with:
- UK GDPR Article 9 restrictions on processing biometric data for identification, which is permitted only under narrow conditions such as explicit consent – directly relevant given the system's visual identification capabilities
- CCPA-style rights for civilians inadvertently captured in battlefield surveillance
- The algorithmic accountability gap left by the EU AI Act, which explicitly excludes systems developed or used for military purposes from its high-risk requirements
Military AI expert Dr. Emilia Vance notes: "When systems process facial recognition or environmental data in populated areas, they inevitably capture civilian information. The absence of public safeguards creates risks of unchecked profiling and potential misuse of personal data under the guise of national security exemptions."
The MoD’s silence on data retention policies is particularly concerning. Unlike commercial AI deployments, which are bound by data minimization and purpose limitation under GDPR, battlefield systems operate without clear constraints. This could expose the UK government to future legal challenges over unlawful processing of data belonging to people in the EU.
Project ASGARD’s inclusion of loitering drones (DART 250) compounds these issues by extending surveillance range into civilian-adjacent areas. With NATO rapidly adopting similar technologies following Ukraine conflict lessons, the lack of transparent frameworks risks normalizing extra-legal data processing in conflict zones.
The MoD's 2025 admission that it was not "AI-ready", despite its published AI policy documents, suggests ongoing compliance gaps. As General Sir Roly Walker champions the system’s combat effectiveness, rights advocates demand equal emphasis on ethical data handling protocols that meet international standards.
