Australia's Social Media Ban for Kids: Big Tech's Compliance Falls Short
#Regulation

Regulation Reporter

Australia's eSafety Commission finds major social media platforms failing to enforce the country's ban on under-16 users, with millions of accounts blocked but significant gaps remaining in age verification and account removal processes.

Australia's eSafety Commission has delivered a damning first assessment of how major social media platforms are handling the country's groundbreaking ban on children under 16 using their services, finding widespread failures in compliance despite some progress.

The regulator revealed it is "moving into an enforcement stance" after finding that Meta, YouTube, TikTok, and Snapchat have collectively blocked around five million accounts that attempted to circumvent the minimum age requirement. However, the investigation also uncovered troubling patterns of non-compliance that have prompted formal investigations into five major platforms.

Poor Practices Exposed

The commission's report details several concerning behaviors by social media companies. Some platforms have been messaging children under 16, encouraging them to attempt age verification even when they've already declared themselves underage. In other cases, platforms allowed children to repeatedly try the same age assurance method until they achieved a 16+ outcome.

Perhaps most troubling is the finding that reporting mechanisms for age-restricted accounts are generally inaccessible and ineffective, particularly for parents trying to protect their children. The commission also noted that some platforms appear not to have done enough to prevent children under 16 from creating accounts in the first place.

A Case Study in Failure

One particularly revealing example involved a 12-year-old who signed up for a social media account two years ago, falsely claiming to be 14. The user is now 14, but the platform's records show an age of 16. When a parent attempted to have the account closed, the platform requested a legal letter to prove parental status—a costly and burdensome requirement that the parent chose not to pursue. The result: a 14-year-old continues to use an account they shouldn't have access to.

Formal Investigations Underway

Based on these findings, eSafety is investigating potential non-compliance by five platforms: Snap, TikTok, Facebook, Instagram, and YouTube. The regulator aims to complete these investigations and decide on enforcement action by mid-2026.

"These investigations will require giving further legally enforceable information-gathering notices to assess whether the steps taken by platforms are reasonable, identifying gaps, and assessing the totality of all steps taken by a platform," the report states. The commission has made clear it "will not hesitate to take enforcement action where it has sufficient evidence of non-compliance," including the possibility of civil penalty proceedings.

Global Implications

Australia's social media ban has become a template for other nations grappling with the impact of social media on young people. Indonesia, the world's fourth-most-populous country, recently enacted a similar ban, and several more nations are working on comparable regulations.

The eSafety report contains findings that may reassure other regulators considering similar measures. Contrary to concerns that bans might drive children to unregulated platforms, the commission observed only "short-term increases in downloads of some emerging apps" but "no significant migration to non-compliant platforms or other online services that are not required to comply with the SMMA [social media minimum age] obligation."

The report suggests this is because "the profusion of online services young people may be migrating to do not have a critical mass of their peers established on these smaller, less entrenched services."

The Enforcement Challenge

Australia's experience highlights the fundamental difficulty of enforcing age restrictions in the digital age. While blocking five million accounts demonstrates some effectiveness, the persistence of workarounds and the burden placed on parents to enforce restrictions reveal the limitations of current approaches.

The coming months will be crucial as eSafety decides whether to pursue civil penalties against non-compliant platforms. The outcome will likely influence how other nations approach similar regulations and whether social media companies can effectively implement age restrictions without more fundamental changes to their business models.

For now, Australia's experiment with social media regulation continues, with the eSafety Commission signaling it's ready to move from monitoring to enforcement—a shift that could have significant implications for how Big Tech operates in the country and potentially worldwide.
