Australia’s Under 16 Social Media Ban Questioned as Children Below 16 Bypass Age Restrictions

Australian children are able to bypass the minimum age limits set by social media platforms, Reuters has reported. A report by the online safety regulator, the eSafety Commission, revealed this just months before the government implements Australia's social media ban for children under 16.

Reality Check

Towards the end of last year, Australia approved a social media ban for kids below the age of 16. The ban, which is expected to take effect at the end of this year, set a high benchmark for countries across the globe.

Although social media platforms do not allow access to children under 13, the report shows that 80% of Australian kids aged between 8 and 12 used the platforms last year. YouTube, TikTok, Instagram, and Snapchat are the platforms most children use.

All social media platforms except Reddit require users to provide their date of birth when signing up. However, these platforms rely on self-declaration and have no other age assurance tools to verify the information provided.

The latest eSafety report highlights results from a national survey of social media usage among 8 to 15 year olds. It also includes responses from social media services including YouTube, Instagram, Snapchat, Facebook, TikTok, and Twitch, which are owned by Google, Meta, Snap, ByteDance, and Amazon.

Age Detection

The report showed that Snapchat and Instagram have over a million users aged between 13 and 17. TikTok had over 522,800, YouTube 643,600, Facebook 455,000, and Twitch 24,400. Reddit did not have a record of underage users.

These statistics raise questions about the effectiveness of Australia's social media regulations, which some politicians and industry experts have criticized as ill-considered and rushed. The country's under-16 social media ban relies heavily on truthful declaration of age to enable social platforms to block underage users.

“There is still significant work to be done by any social media platforms relying on truthful self-declaration to determine age with enforcement of the government’s minimum age legislation on the horizon,” said eSafety Commissioner Julie Inman Grant.

Social media platforms like Snapchat, Twitch, and TikTok use language analysis technology to identify users under 16. Twitch and TikTok also use AI-driven age estimation to detect users under 16. Instagram and Facebook have age-estimation models, while YouTube relies on classifiers.

YouTube is the only social platform that supports underage use with parental supervision, which it provides through family accounts. However, none of the surveyed minors with YouTube accounts reported having their accounts shut down for being underage.

Shared Responsibility

Meta, which owns Instagram and Facebook, said it supports age-appropriate experiences for children. However, the tech giant holds that the responsibility for enforcing age restrictions lies with app stores.

“Since the start of 2023, our proactive age detection tools have resulted in the removal of more than one million Australian users suspected of being under the age of 13,” a TikTok spokesperson said.

Commissioner Inman Grant highlighted the need for social media companies to comply with online safety laws, safety expectations, minimum age requirements, and emerging global standards.

“While social media services must do more to implement age assurance measures and prioritise the best interests of children, we cannot expect them to act alone,” she said.
However, she acknowledged that this responsibility does not rest with social media platforms alone.

“The responsibility for child safety, including appropriate age assurance, must be shared across the digital ecosystem, including devices and their operating systems, app stores, search engines, and other services. Parents and carers, educators, policymakers, and technology developers all have a role to play in fostering safer digital spaces,” she added.

The eSafety report acknowledged that most social media platforms had conducted research on improving their age assurance systems, and that some had provided ways for users to report instances of underage use.

James Hughes