
U.K. Regulators Demand Tighter Age Verification From Social Media Firms

In Focus

  • British regulators say tech companies do not enforce their own age requirements
  • Regulators have contacted Facebook, Snapchat, and YouTube over the matter
  • Meta and TikTok say they already apply age-verification tools on their platforms

U.K. watchdogs have pressed social media platforms to tighten age-verification measures to keep children safe online. According to CNA, the British regulators warned that many companies are failing to properly enforce their own minimum age requirements.

The request came from media regulator Ofcom and the Information Commissioner’s Office (ICO), which contacted platforms including Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox and X.

Rising Concerns Over Harmful Content

The British regulators said they are increasingly concerned that algorithm-driven feeds could expose underage users to harmful or addictive content.

“These online services are household names, but they’re failing to put children’s safety at the heart of their products. That must now change quickly, or Ofcom will act,” said Ofcom Chief Executive Melanie Dawes.

Social media platforms claim that they already have safeguards in place. Google, which owns YouTube, said it was surprised by the approach taken by Ofcom and urged the regulator to focus on higher-risk services instead.

Earlier this year, YouTube introduced new features to expand parental control over teen viewing habits, including limits on how much time children spend on the platform. However, Ofcom and the ICO argue that social media platforms must strengthen efforts to stop children under 13 from signing up.

Regulators Want Tech Firms to Tighten Age Checks

As part of implementing the U.K. Online Safety Act, Ofcom has advised leading tech platforms, including Roblox and Snapchat, to improve their social media child safety rules by April 30, 2026. British regulators want these platforms to explain how they will tighten age checks, limit contact between children and strangers, and make algorithmic feeds safer.

In a separate communication, the ICO asked social media platforms to use modern and viable age-assurance tools to keep under-13 users from signing up for their services. Currently, most social platforms depend on users to self-report their age.

Because self-declaration is easily circumvented, underage children can readily access services that were not designed for them. “There’s now modern technology at your fingertips, so there is no excuse,” ICO Chief Executive Paul Arnold noted.

Response from Social Media Firms

Responding to the U.K.’s push on child safety regulations, Meta and TikTok said they already have underage protection measures in place. Meta said it uses “AI to detect users’ age based on their activity, and facial age estimation technology.”

A spokesperson from the social media giant emphasized the need to verify age “centrally at the app store level” to keep users from providing personal information multiple times. Snapchat, which recently settled a social media addiction lawsuit in the U.S., is in the process of testing age verification tools.

TikTok said it was applying enhanced technologies to detect and deactivate underage accounts. The company announced plans to deploy an age-detection tool in Europe in January 2026, following regulatory pressure over how the platform identifies and blocks accounts belonging to under-13 users.

Michael Hill