Ireland's Data Protection Commission has imposed TikTok's largest-ever GDPR penalty for systemic failures in protecting minors' data, including default public accounts for under-18s and inadequate age verification.

The Irish Data Protection Commission (DPC) has levied a €345 million ($370 million) fine against TikTok for violating the EU's General Data Protection Regulation (GDPR), marking the platform's largest privacy penalty to date. The decision (DPC Case Reference IN-20-7-1) stems from an investigation, opened in September 2021, into TikTok's handling of children's accounts between 31 July and 31 December 2020.
What Happened
TikTok was found to have:
- Set accounts of users under 18 to public by default, exposing minors' videos and profile details to anyone (a sketch of the compliant, private-by-default behaviour follows this list)
- Relied on weak age verification, leaving children reachable by adult users
- Failed to provide transparent, child-appropriate privacy notices as required under GDPR Articles 12-14
- Allowed the "Family Pairing" feature to link any adult to a child's account without verifying a parental relationship
Legal Basis
This enforcement action invokes multiple GDPR provisions:
- Article 5(1)(c): Data minimization violations
- Article 5(1)(f): Insufficient security safeguards (integrity and confidentiality)
- Article 24(1): Failure to implement appropriate technical and organizational measures
- Article 25(1) and (2): Breaches of data protection by design and by default, through settings that exposed minors' data
The DPC also assessed TikTok's practices against GDPR Article 8, which sets the conditions for a child's consent to online services and is the EU's closest counterpart to the US Children's Online Privacy Protection Act (COPPA).
Impact on Users
- An estimated 13.5 million EU minors held accounts during the violation period
- Public-by-default settings heightened the risk of grooming and harassment of children
- Flawed age estimation let under-13s onto the platform despite its minimum-age policy
Corporate Consequences
- TikTok must implement improved age gating within three months
- It must reset privacy defaults so that all existing minor accounts are private by default (see the remediation sketch after this list)
- Its minor-protection systems are subject to mandatory third-party audits
- The fine brings TikTok's total GDPR penalties to €425 million since 2021
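Resetting defaults at that scale is essentially a backfill over existing accounts. Below is a hedged sketch of what such a remediation pass could look like, assuming a hypothetical account store represented here as plain dicts; none of this reflects TikTok's real systems:

```python
from datetime import date
from typing import Iterable

ADULT_AGE = 18  # hypothetical threshold


def is_minor(birth_date: date, today: date) -> bool:
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1) < ADULT_AGE


# The most protective configuration, applied wholesale to minors.
PROTECTIVE_DEFAULTS = {
    "public_profile": False,
    "direct_messages": False,
    "video_downloads": False,
}


def reset_minor_defaults(accounts: Iterable[dict], today: date | None = None) -> int:
    """Force every minor's account back to the most protective settings.

    Returns the number of accounts changed, for the audit trail."""
    today = today or date.today()
    changed = 0
    for account in accounts:
        if not is_minor(account["birth_date"], today):
            continue
        if any(account.get(k) != v for k, v in PROTECTIVE_DEFAULTS.items()):
            account.update(PROTECTIVE_DEFAULTS)
            changed += 1
    return changed
```

Returning the change count matters in practice: a regulator-mandated remediation is exactly the kind of job an auditor will later ask a platform to evidence.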
Broader Compliance Implications
This ruling establishes critical precedents:
- Default settings now recognized as high-risk processing under GDPR
- Platforms must use robust age assurance beyond self-declaration (a minimal sketch of that gating logic follows this list)
- "Dark patterns" in minor-facing interfaces face increased scrutiny
The decision aligns with tightening global regulations, including:
- UK's Age-Appropriate Design Code
- California's Age-Appropriate Design Code Act (AB 2273)
- Proposed updates to the US Children's Online Privacy Protection Act (COPPA)
TikTok has publicly committed to design changes but disputes the decision, including how the fine was calculated. Digital rights groups such as noyb argue the penalty should have been higher given TikTok's roughly 150 million European users.
This enforcement signals regulators' growing impatience with tech platforms' negligence toward minors' privacy, and is likely to trigger similar actions against Meta, YouTube, and Snapchat as child protection becomes a GDPR enforcement priority.
