Meta has removed end‑to‑end encryption from Instagram direct messages, citing low adoption and steering users to WhatsApp. The change leaves message content readable on Meta’s servers, raising GDPR, CCPA and other privacy‑compliance questions and prompting warnings from digital‑rights groups about surveillance, data‑leak risk and potential regulatory penalties.

What happened
Meta announced today that the optional end‑to‑end encryption (E2EE) feature for Instagram Direct Messages will be discontinued. The company says only a small fraction of users opted in, and it will instead encourage those who need encryption to migrate to WhatsApp, where the feature remains active. Effective immediately, new Instagram DMs will be stored in plaintext on Meta’s servers, and the company has not clarified the fate of previously encrypted conversations.
Legal basis
Under the EU General Data Protection Regulation (GDPR), personal data must be processed "in a manner that ensures appropriate security of the data" (Article 5(1)(f)). Encryption is recognised as an appropriate technical measure (Recital 83). By removing a security layer that many users relied on, Meta risks breaching its GDPR obligations unless it can demonstrate an equivalent safeguard.
Under the California Consumer Privacy Act (CCPA), businesses must implement reasonable security procedures to protect personal information (Section 1798.150). The CCPA does not prescribe encryption, but courts have treated the lack of encryption for sensitive data as a failure to meet the “reasonable” standard, especially when the data is widely shared across platforms.
Both regimes also impose data‑subject rights: users can request deletion, access, or correction of their messages. If encrypted messages become readable by Meta, the company must be prepared to honor these rights without undue delay, or risk penalties of up to €20 million or 4 % of global annual turnover, whichever is higher, under the GDPR, and up to $7,500 per intentional violation under the CCPA.
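To make the exposure concrete, the two penalty regimes can be sketched as simple arithmetic. The figures passed in below are placeholders for illustration, not Meta’s actual turnover or any real violation count:

```python
# Hedged sketch of the statutory penalty ceilings described above.
# GDPR Article 83(5): the GREATER of EUR 20M or 4% of worldwide annual
# turnover. CCPA: civil penalties accrue per intentional violation.
# All inputs below are illustrative assumptions.

def gdpr_max_fine(annual_turnover_eur: float) -> float:
    """Upper GDPR fine ceiling: max(EUR 20M, 4% of global turnover)."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

def ccpa_max_penalty(intentional_violations: int,
                     per_violation_usd: float = 7_500.0) -> float:
    """CCPA penalties scale linearly with the number of violations."""
    return intentional_violations * per_violation_usd

# A hypothetical company with EUR 100B turnover faces a EUR 4B ceiling,
# which is why large platforms' fines far exceed the EUR 20M floor.
print(gdpr_max_fine(100_000_000_000))  # 4000000000.0
print(ccpa_max_penalty(1_000))         # 7500000.0
```

The `max(...)` is the key detail: for any large platform, the 4 % prong dominates, so the €20 million figure is effectively a floor for small controllers only.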
Impact on users and companies
- Users – Journalists, human‑rights defenders, survivors of abuse, and other high‑risk groups lose a rare privacy shield on a platform they already use for public outreach. Plaintext storage makes their conversations vulnerable to internal misuse, external hacking, or lawful government requests.
- Meta – The company now faces potential supervisory‑authority investigations in the EU and California. If regulators determine that the removal of E2EE constitutes a material downgrade of security, Meta could be hit with hefty fines and be forced to re‑introduce encryption or provide an alternative safeguard.
- Advertisers and third‑party developers – Meta has previously disclosed that AI‑generated insights from private chats may be used for ad targeting. With encryption gone, the line between private conversation and data‑driven advertising becomes blurrier, raising additional compliance questions under GDPR’s purpose‑limitation principle.
What changes are required
- Conduct a Data Protection Impact Assessment (DPIA) – Meta must document why the removal of E2EE is necessary, assess residual risks, and identify mitigations such as server‑side access controls, audit logs, and strict internal use policies.
- Offer an equivalent security measure – If encryption cannot be reinstated, Meta should at minimum provide strong transport‑layer encryption combined with server‑side key management that limits employee access, thereby satisfying the “appropriate technical and organisational measures” test.
- Transparent user communication – GDPR Article 12 requires clear, concise information about how data is processed. Meta should publish a detailed FAQ explaining what happens to existing encrypted chats, the retention schedule, and the rights users can exercise.
- Update contracts with processors – Any third‑party services that handle Instagram messages must now be bound by stricter clauses reflecting the loss of end‑to‑end protection.
- Prepare for regulator inquiries – Meta should designate a GDPR liaison, retain evidence of the low adoption rate claim, and be ready to demonstrate that the decision does not disproportionately affect EU or California residents.
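The access‑control and audit‑log mitigations above can be sketched in miniature. Everything here is an illustrative assumption (the `KeyVault` name, the role model, the log format), not Meta’s actual architecture:

```python
# Hedged sketch of server-side key management with role-based access
# control and an append-only audit log, as one possible "equivalent
# safeguard". All names and roles are hypothetical.
import time
from dataclasses import dataclass, field

@dataclass
class KeyVault:
    _keys: dict = field(default_factory=dict)
    audit_log: list = field(default_factory=list)
    # Only the messaging service itself may read keys.
    allowed_roles: frozenset = frozenset({"message-service"})

    def store(self, key_id: str, key_bytes: bytes) -> None:
        self._keys[key_id] = key_bytes

    def fetch(self, key_id: str, caller_role: str) -> bytes:
        # Every access attempt is recorded, whether granted or denied,
        # so regulators can audit internal use after the fact.
        granted = caller_role in self.allowed_roles
        self.audit_log.append((time.time(), caller_role, key_id, granted))
        if not granted:
            raise PermissionError(f"role {caller_role!r} may not read keys")
        return self._keys[key_id]

vault = KeyVault()
vault.store("dm-key-1", b"\x00" * 32)
vault.fetch("dm-key-1", "message-service")   # allowed, and logged
try:
    vault.fetch("dm-key-1", "ads-pipeline")  # denied, but still logged
except PermissionError:
    pass
print(len(vault.audit_log))  # 2
```

The design point is that denial is logged too: an audit trail that records only successful reads cannot demonstrate that, say, an advertising pipeline never touched message keys.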
Broader context
The decision mirrors earlier push‑backs from law‑enforcement agencies that argued wide‑scale encryption hampers child‑protection investigations. However, privacy advocates stress that a blanket removal of encryption is a disproportionate response that harms all users, not just the few involved in illicit activity. The Center for Democracy & Technology and the Global Encryption Coalition have already filed formal objections, citing the risk of mass surveillance and the chilling effect on free expression.
What to watch next
Regulators in the EU (e.g., the Irish Data Protection Commission, which acts as Meta’s lead supervisory authority) and the California Attorney General’s office are expected to open inquiries within the next 30 days. If fines are imposed, they could reach the upper limits of GDPR penalties, potentially exceeding €100 million given Meta’s global turnover.
For users who require strong privacy, the safest interim step is to migrate conversations to WhatsApp or another messaging platform that still offers verified end‑to‑end encryption, such as Signal (Proton Mail offers comparable protection for email). Companies that embed Instagram messaging into their customer‑service workflows should audit their data‑handling practices immediately to ensure continued compliance.
