Meta has deactivated more than 500,000 Instagram and Facebook accounts in Australia to comply with the country's new law barring users under 16 from social media platforms.

In a sweeping enforcement action, Meta has shut down more than 500,000 user accounts across Instagram and Facebook in Australia. The move is a direct response to Australia's recently implemented ban on social media access for users under 16. According to Meta's statement, the deactivated accounts comprise roughly 330,000 on Instagram and 173,000 on Facebook, making this one of the largest single compliance actions in the company's history.
The shutdowns stem from Australia's Online Safety Amendment (Social Media Minimum Age) Act 2024, which took effect in late 2025. The legislation requires platforms to take reasonable steps to keep under-16s off their services, including implementing age-verification systems and removing underage users, citing concerns over mental health risks, data privacy vulnerabilities, and exposure to harmful content. A handful of countries, including Germany and France, have pursued youth restrictions of their own, but Australia's under-16 threshold is among the strictest in the world.
Meta's technical approach combines automated detection with user reporting. The company analyzes behavioral patterns, registration details, and content interactions to flag potentially underage accounts. Flagged users are prompted to verify their age through a government ID or credit card check, and accounts that fail verification are deactivated. This method, while scalable, has drawn criticism from digital rights groups. Organizations such as Electronic Frontiers Australia argue the verification process compromises privacy and may disproportionately impact marginalized communities that lack formal identification documents.
The account removals signal broader shifts in global platform governance. Australia's regulator can impose substantial fines for non-compliance, reportedly up to roughly A$49.5 million per breach, creating a strong financial incentive for proactive enforcement, and Meta's disclosure suggests it is prioritizing avoiding those penalties. Meanwhile, competitors such as TikTok and Snapchat face similar scrutiny, with Australia's eSafety Commissioner confirming investigations into other platforms' compliance efforts.
For affected users, the impact extends beyond lost access. Deactivated accounts forfeit photos, messages, and connections, though Meta permits data downloads within a 30-day window. Parents report frustration over teens migrating to less-regulated platforms or using VPNs to circumvent restrictions. Industry analysts observe that Australia's policy could become a template for other nations; Canada and the UK are debating similar legislation. Meta's transparency report acknowledges imperfect detection systems but emphasizes ongoing refinement of its age-assessment algorithms.
This mass removal underscores the tension between child safety objectives and digital access rights. While Australian policymakers hail the action as a win for youth protection, the long-term solution likely requires multi-stakeholder collaboration on less intrusive age verification methods. As Meta navigates these regulations, the outcome may redefine how platforms balance growth with societal expectations globally.
