The UK government has introduced draft legislation requiring tech platforms to remove nonconsensual abusive content like deepfake nudes within 48 hours, threatening fines up to 10% of global revenue and service blocking for non-compliance.

The UK government has unveiled draft legislation imposing stringent requirements on technology platforms to combat nonconsensual intimate imagery and abusive content. Under the proposed rules, companies would be required to remove flagged content within 48 hours of notification or face penalties including fines of up to 10% of global annual revenue and potential service blocking in the UK market.
Prime Minister Rishi Sunak characterized the measures as a response to a "national emergency" of online misogyny, citing research indicating that deepfake pornography and nonconsensual intimate imagery have surged by over 200% since 2023. The regulations specifically target:
- Sexually explicit deepfakes
- Nonconsensual sharing of intimate images
- Digitally altered abusive content
Ofcom, the UK's communications regulator, would gain expanded enforcement powers under the proposal. The regulator could issue fines through a new civil penalty system and require internet service providers to block non-compliant platforms. The 48-hour removal window applies specifically after platforms receive "valid notification" of violating content.
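In engineering terms, the 48-hour window is a service-level deadline that starts when a valid notification lands. A minimal sketch of the bookkeeping in Python, assuming the clock runs from the moment the notice is received and validated (the draft does not say whether time spent validating a report counts against the window, so that is an assumption here):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hard deadline from the draft rules: 48 hours after "valid notification".
REMOVAL_WINDOW = timedelta(hours=48)

@dataclass
class TakedownNotice:
    content_id: str
    received_at: datetime  # timezone-aware timestamp of the valid notification

    @property
    def deadline(self) -> datetime:
        # Assumes the clock starts at receipt; the draft leaves this open.
        return self.received_at + REMOVAL_WINDOW

    def is_overdue(self, now: datetime | None = None) -> bool:
        # Compare in UTC to avoid daylight-saving ambiguity near the deadline.
        now = now or datetime.now(timezone.utc)
        return now > self.deadline

# Hypothetical notice received at 09:30 UTC; removal is due 48 hours later.
notice = TakedownNotice("post-8271", datetime(2026, 3, 1, 9, 30, tzinfo=timezone.utc))
print(notice.deadline)  # 2026-03-03 09:30:00+00:00
```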
Technical implementation challenges emerge in several areas:
- Content identification: Automated detection of synthetic media remains unreliable, and current systems flag legitimate artistic and journalistic content at high rates. Deepfake detection accuracy hovers around 78% in controlled environments but drops sharply against adversarial examples. Because genuine violations are a tiny fraction of total uploads, even a modest false-positive rate swamps moderation queues (see the sketch after this list).
- Verification protocols: The legislation doesn't define standards for validating user reports, creating potential vectors for abuse through fraudulent takedown requests.
- Small platform burden: While major platforms have content moderation infrastructure, smaller services could face disproportionate compliance costs. The draft exempts services with under 1 million UK users from certain requirements.
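To see why the false-positive problem bites at scale, run the base-rate arithmetic. A quick sketch with illustrative numbers: the 78% detection rate comes from the reporting above, while the prevalence and false-positive figures are assumptions chosen only to show the shape of the problem:

```python
# Base-rate arithmetic: what fraction of flagged items are actual violations?
# Assumed: 1 in 10,000 uploads is a genuine violation; the detector falsely
# flags 5% of legitimate content. Reported: 78% detection in controlled tests.
prevalence = 1 / 10_000       # assumed share of uploads that are violations
sensitivity = 0.78            # detection rate in controlled settings (reported)
false_positive_rate = 0.05    # assumed flag rate on legitimate content

true_flags = prevalence * sensitivity
false_flags = (1 - prevalence) * false_positive_rate

# Precision via Bayes' rule: probability a flagged item is a real violation.
precision = true_flags / (true_flags + false_flags)
print(f"Precision of an automated flag: {precision:.3%}")
# ~0.156% -- roughly 640 legitimate items flagged per real violation, which is
# why automated detection alone cannot safely drive 48-hour takedowns.
```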
Legal experts note potential conflicts with the UK's existing Online Safety Act, which already mandates removal of illegal content "as soon as reasonably practicable." The new 48-hour hard deadline creates a stricter standard specifically for intimate imagery. Free speech advocates warn the rules could incentivize over-removal, particularly given vague definitions around "digitally altered" content.
The proposal enters a 12-week consultation period before parliamentary debate. If enacted, enforcement would begin in Q1 2027. Platforms would be required to implement real-time reporting dashboards accessible to UK law enforcement and fund awareness campaigns about reporting mechanisms.
Notably absent are technical specifications for content detection systems and provisions addressing cross-border jurisdictional challenges when content originates outside the UK. The government's impact assessment acknowledges increased operational costs for platforms but argues these are justified by the reduction in societal harm.
Full details of the proposal are available in the government consultation document.
