Meta begins early removals ahead of new law
Meta has begun removing Australian children under 16 from Instagram, Facebook and Threads, one week before the country's youth social media ban takes effect. The company said last month that it had alerted users aged 13 to 15 that account closures would begin on 4 December. Meta expects around 150,000 Facebook accounts and about 350,000 Instagram profiles to be affected; Threads users are also caught up because the app requires an Instagram login. Australia's new law takes effect on 10 December and requires platforms to block under-16s, with fines of up to A$49.5m for companies that fail to comply.
Meta asks for stronger age checks in app stores
A company spokesperson told a British news outlet that compliance would be a complex, long-term process. She said Meta would follow the law but wanted a better and more privacy-friendly system, and the company is urging governments to require age verification in app stores at the point of download, so parents could approve access for under-16s once rather than repeating age checks inside every app. Meta said flagged teenagers can save their posts, videos and messages before their accounts are removed. Young users who believe they have been wrongly flagged can request a review by uploading a short video selfie or providing a driver's licence or another form of official ID.
Debate grows as more platforms face the ban
The ban also covers YouTube, X, TikTok, Snapchat, Reddit, Kick and Twitch. The government says the law will protect children from online harm, but critics warn it could cut teenagers off from communities they rely on and push young users into poorly monitored corners of the internet. Communications Minister Anika Wells said she expected early difficulties but wanted to protect Generation Alpha, arguing that powerful algorithms pull teenagers into harmful content loops and describing children as hooked to a constant “dopamine drip” once they join social platforms. Wells is also watching apps such as Lemon8 and Yope to track any migration of young users after the ban.
Younger platforms face new scrutiny
Australia’s eSafety Commissioner has asked Lemon8 and Yope to assess whether the law applies to them. Yope’s chief executive said the company had not been contacted directly but had already completed an internal review, describing Yope as a private messenger with no public content and comparing it to WhatsApp because users share daily moments only with trusted contacts. Reports suggest Lemon8 will block under-16s next week even though the law does not name the app. YouTube, which was initially excluded from the ban and later included, has criticised the legislation as rushed, arguing that removing teen accounts that already have parental controls will make the platform less safe.
Global interest rises as Australia tests new rules
Governments around the world are watching Australia’s approach closely. A national study found that 96% of children aged 10 to 15 use social media. Seven in ten had seen harmful content, including violent material or posts linked to eating disorders or suicide, one in seven reported grooming behaviour by adults or older children, and more than half said they had experienced cyberbullying.
