Australia is expanding its age verification requirements to include AI apps, potentially forcing Apple to block non-compliant services from its App Store starting March 9.
Australia is expanding its digital safety regulations beyond social media, with new age verification requirements that could force Apple to block certain AI apps from the App Store. The move comes as the country tightens restrictions on services that may expose young users to harmful content.
Australia's Expanding Digital Safety Framework
The Australian government's latest regulatory push builds on last year's landmark decision to ban social media access for teenagers. That initial move made Australia the first nation to implement such restrictions, driven by growing concerns about social media's impact on youth mental health. The debate has intensified following works like Jonathan Haidt's "The Anxious Generation," which examines the correlation between smartphone use and rising anxiety rates among young people.
Now, starting March 9, AI services, including those from OpenAI and other major providers, must implement age verification systems to prevent users under 18 from accessing content related to pornography, extreme violence, self-harm, or eating disorders. The regulations also address concerns about excessive chatbot usage among teens, particularly regarding emotionally manipulative design features that could foster dependency.
The Scope of the New Requirements
According to Australia's eSafety Commissioner, there are mounting concerns about how AI companies design their products to engage young users. The regulator said it was "concerned that AI companies are leveraging emotional manipulation, anthropomorphism and other advanced techniques to entice, entrance and entrench young people into excessive chatbot usage," a spokesperson told Reuters.
The regulator has reported cases of children as young as 10 spending up to six hours daily interacting with AI-powered tools, raising alarms about potential psychological impacts during crucial developmental periods.
Apple's Position and Potential Compliance Challenges
When Reuters reached out for comment on the new requirements, Apple declined to comment. However, the company has been implementing age-related safeguards across its platforms to comply with various international age-restriction laws. These systems often rely on signals automatically detected by devices, though the responsibility for implementing specific compliance measures ultimately falls on individual developers.
The new rules could create a significant compliance challenge for Apple's App Store. Regulators may require app stores and search engines to block access to AI services that fail to verify user ages properly. This would place Apple in the position of potentially having to remove or restrict access to popular AI applications that don't meet Australia's requirements.
Widespread Non-Compliance Among AI Services
Reuters' investigation revealed that compliance with the new regulations remains limited. Among the 50 most popular text-based AI tools analyzed, the majority showed no clear steps toward implementing age verification or content filtering ahead of the March 9 deadline. This widespread non-compliance could result in a significant number of AI applications becoming unavailable to Australian users through official app distribution channels.
Global Implications for Tech Regulation
Australia's approach represents a growing trend of governments taking active roles in regulating AI access for minors. The country's willingness to potentially block entire categories of applications from major platforms like Apple's App Store signals a more aggressive regulatory stance that could influence similar efforts in other jurisdictions.
For Apple, this presents another layer of complexity in managing its global App Store operations. The company must balance compliance with local regulations while maintaining the availability of popular services for its users. As AI applications become increasingly integrated into daily digital life, tech companies face mounting pressure to implement robust age verification systems that satisfy both regulatory requirements and user privacy concerns.
The coming weeks will reveal how many AI service providers can meet Australia's compliance deadline and whether Apple will need to take action against non-compliant apps in its Australian App Store. The outcome could set precedents for how other countries approach AI regulation and how tech platforms respond to increasingly specific content access requirements.

For the complete Reuters report on Australia's new AI app regulations, visit their website.
