Meta Deletes 550,000 Accounts As Australia Enforces Under-16 Social Media Ban


Meta Platforms Inc. has removed nearly 550,000 user accounts in Australia as part of its enforcement of the country’s new law restricting children under 16 from using mainstream social media platforms. The deletions affect Meta-owned services, including Instagram, Facebook, and Threads.

The Australian legislation, which came into force on December 10, 2025, aims to shield minors from algorithm-driven content and other online risks. Meta began proactively enforcing the rules a week prior, removing approximately 330,000 under-16 users from Instagram, 173,000 from Facebook, and 39,000 from Threads.

BrandSpur Brand News reports that the Albanese government is expected to release nationwide data on underage account removals across all platforms in the coming days. Meta has publicly voiced concerns that the law may inadvertently isolate teenagers and push them toward less regulated digital spaces.

In a company blog post, Meta criticised the age-verification methods mandated under the legislation, describing them as inconsistent and questioning whether the policy achieves its stated goal of improving youth safety. The company highlighted that logged-out users on some platforms may still be exposed to algorithmic content, undermining the law’s underlying premise.


Despite these objections, Meta confirmed it will comply with the Australian law, which carries penalties of up to A$50 million for non-compliance. The rules cover a wide range of platforms, including Snapchat, TikTok, X, YouTube, Reddit, Twitch, Threads, and Kick, though exemptions exist for services focused primarily on gaming, health, or education.

Meta has urged the government to collaborate with tech companies on safer, more age-appropriate alternatives rather than enforcing blanket bans, warning that imprecise age-verification requirements risk wrongly blocking legitimate adult users.

The law's strict enforcement approach and short compliance windows have also raised concerns among tech advocacy groups and industry associations about free expression and digital rights, as other governments weigh similar age-restriction measures.

As Australian authorities continue to monitor compliance, the Meta action underscores the growing tension between tech companies and governments implementing stringent digital safety regulations for minors.